Feb 08, 2026 2:21:38 AM org.apache.karaf.main.Main launch
INFO: Installing and starting initial bundles
Feb 08, 2026 2:21:39 AM org.apache.karaf.main.Main launch
INFO: All initial bundles installed and set to start
Feb 08, 2026 2:21:39 AM org.apache.karaf.main.lock.SimpleFileLock lock
INFO: Trying to lock /tmp/karaf-0.23.1-SNAPSHOT/lock
Feb 08, 2026 2:21:39 AM org.apache.karaf.main.lock.SimpleFileLock lock
INFO: Lock acquired
Feb 08, 2026 2:21:39 AM org.apache.karaf.main.Main$KarafLockCallback lockAcquired
INFO: Lock acquired. Setting startlevel to 100
2026-02-08T02:21:39,689 | INFO | CM Configuration Updater (Update: pid=org.ops4j.pax.logging) | EventAdminConfigurationNotifier | 5 - org.ops4j.pax.logging.pax-logging-log4j2 - 2.3.0 | Logging configuration changed. (Event Admin service unavailable - no notification sent).
2026-02-08T02:21:41,200 | INFO | activator-1-thread-2 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Adding features: odl-infrautils-ready/[7.1.9,7.1.9],odl-openflowplugin-flow-services-rest/[0.21.2,0.21.2],odl-jolokia/[12.0.3,12.0.3],59b916ad-e1bd-4a8f-8673-b7516904ffdf/[0,0.0.0],odl-openflowplugin-app-bulk-o-matic/[0.21.2,0.21.2]
2026-02-08T02:21:41,362 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Changes to perform:
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   Region: root
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |     Bundles to install:
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:jakarta.el/jakarta.el-api/3.0.3
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:javax.enterprise/cdi-api/2.0.SP1
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:javax.interceptor/javax.interceptor-api/1.2.2
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:javax.transaction/javax.transaction-api/1.2
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jasypt/1.9.3_1
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.javax-inject/1_3
2026-02-08T02:21:41,363 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ops4j.pax.jdbc/pax-jdbc/1.5.7
2026-02-08T02:21:41,364 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ops4j.pax.jdbc/pax-jdbc-config/1.5.7
2026-02-08T02:21:41,364 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ops4j.pax.jdbc/pax-jdbc-pool-common/1.5.7
2026-02-08T02:21:41,364 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ops4j.pax.url/pax-url-wrap/2.6.17/jar/uber
2026-02-08T02:21:41,365 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.osgi/org.osgi.service.jdbc/1.1.0
2026-02-08T02:21:41,365 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Installing bundles:
2026-02-08T02:21:41,365 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:jakarta.el/jakarta.el-api/3.0.3
2026-02-08T02:21:41,368 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:javax.enterprise/cdi-api/2.0.SP1
2026-02-08T02:21:41,369 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:javax.interceptor/javax.interceptor-api/1.2.2
2026-02-08T02:21:41,370 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:javax.transaction/javax.transaction-api/1.2
2026-02-08T02:21:41,371 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.jasypt/1.9.3_1
2026-02-08T02:21:41,373 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:org.apache.servicemix.bundles/org.apache.servicemix.bundles.javax-inject/1_3
2026-02-08T02:21:41,373 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:org.ops4j.pax.jdbc/pax-jdbc/1.5.7
2026-02-08T02:21:41,374 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:org.ops4j.pax.jdbc/pax-jdbc-config/1.5.7
2026-02-08T02:21:41,375 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:org.ops4j.pax.jdbc/pax-jdbc-pool-common/1.5.7
2026-02-08T02:21:41,376 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:org.ops4j.pax.url/pax-url-wrap/2.6.17/jar/uber
2026-02-08T02:21:41,379 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   mvn:org.osgi/org.osgi.service.jdbc/1.1.0
2026-02-08T02:21:41,407 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Starting bundles:
2026-02-08T02:21:41,408 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   org.ops4j.pax.url.wrap/2.6.17
2026-02-08T02:21:41,412 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   org.osgi.service.jdbc/1.1.0.202212101352
2026-02-08T02:21:41,413 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   javax.el-api/3.0.3
2026-02-08T02:21:41,416 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   org.apache.servicemix.bundles.javax-inject/1.0.0.3
2026-02-08T02:21:41,416 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   javax.interceptor-api/1.2.2
2026-02-08T02:21:41,417 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   javax.enterprise.cdi-api/2.0.0.SP1
2026-02-08T02:21:41,417 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   javax.transaction-api/1.2.0
2026-02-08T02:21:41,417 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   org.apache.servicemix.bundles.jasypt/1.9.3.1
2026-02-08T02:21:41,417 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   org.ops4j.pax.jdbc.pool.common/1.5.7
2026-02-08T02:21:41,417 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   org.ops4j.pax.jdbc.config/1.5.7
2026-02-08T02:21:41,422 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   org.ops4j.pax.jdbc/1.5.7
2026-02-08T02:21:41,427 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Done.
2026-02-08T02:21:43,452 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Changes to perform:
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |   Region: root
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |     Bundles to uninstall:
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       org.apache.servicemix.bundles.javax-inject/1.0.0.3
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |     Bundles to install:
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.checkerframework/checker-qual/3.51.1
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:com.google.code.gson/gson/2.13.2
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:com.google.guava/guava/33.5.0-jre
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:com.google.guava/failureaccess/1.0.3
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:com.googlecode.json-simple/json-simple/1.1.1
2026-02-08T02:21:43,453 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:com.h2database/h2/2.3.232
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:com.rabbitmq/amqp-client/5.26.0
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:com.typesafe/config/1.4.5
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:com.typesafe/ssl-config-core_3/0.6.1
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.aeron/aeron-annotations/1.45.1
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.aeron/aeron-client/1.45.1
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.aeron/aeron-driver/1.45.1
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.dropwizard.metrics/metrics-core/4.2.37
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.dropwizard.metrics/metrics-graphite/4.2.37
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.dropwizard.metrics/metrics-healthchecks/4.2.37
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.dropwizard.metrics/metrics-jmx/4.2.37
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.dropwizard.metrics/metrics-jvm/4.2.37
2026-02-08T02:21:43,454 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-buffer/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-codec-base/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-codec-compression/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-codec-http/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-codec-http2/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-common/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-handler/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-resolver/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-transport/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-transport-classes-epoll/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-transport-native-epoll/4.2.7.Final/jar/linux-x86_64
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:io.netty/netty-transport-native-unix-common/4.2.7.Final
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:jakarta.activation/jakarta.activation-api/1.2.2
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:jakarta.annotation/jakarta.annotation-api/1.3.5
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:jakarta.servlet/jakarta.servlet-api/4.0.4
2026-02-08T02:21:43,455 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:jakarta.validation/jakarta.validation-api/2.0.2
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:jakarta.ws.rs/jakarta.ws.rs-api/2.1.6
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.javassist/javassist/3.30.2-GA
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:javax.servlet/javax.servlet-api/3.1.0
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:jakarta.websocket/jakarta.websocket-api/1.1.2
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.odlparent/karaf.branding/14.1.6
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.lz4/lz4-java/1.8.0
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:net.bytebuddy/byte-buddy/1.17.8
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.agrona/agrona/1.22.0
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.antlr/antlr4-runtime/4.13.2
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.api/1.0.1
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.cm/1.3.2
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.core/1.10.3
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.jmx/org.apache.aries.jmx.api/1.1.5
2026-02-08T02:21:43,456 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.api/1.2.0
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.core/1.2.0
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.jmx/org.apache.aries.jmx.core/1.1.8
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.jmx/org.apache.aries.jmx.whiteboard/1.2.0
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.proxy/org.apache.aries.proxy/1.1.14
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries.quiesce/org.apache.aries.quiesce.api/1.0.0
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.aries/org.apache.aries.util/1.1.3
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:commons-collections/commons-collections/3.2.2
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:commons-beanutils/commons-beanutils/1.11.0
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:commons-codec/commons-codec/1.19.0
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.commons/commons-lang3/3.19.0
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.commons/commons-text/1.14.0
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.felix/org.apache.felix.scr/2.2.6
2026-02-08T02:21:43,457 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.2
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.blueprintstate/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.core/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.config/org.apache.karaf.config.command/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.blueprint/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.features/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.kar/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.wrap/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.core/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.features/org.apache.karaf.features.command/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.http/org.apache.karaf.http.core/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.instance/org.apache.karaf.instance.core/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.command/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.config/4.4.8
2026-02-08T02:21:43,458 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.modules/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.jdbc/org.apache.karaf.jdbc.core/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.kar/org.apache.karaf.kar.core/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.log/org.apache.karaf.log.core/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.management/org.apache.karaf.management.server/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.package/org.apache.karaf.package.core/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.scr/org.apache.karaf.scr.management/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.scr/org.apache.karaf.scr.state/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.service/org.apache.karaf.service.core/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.shell/org.apache.karaf.shell.commands/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.shell/org.apache.karaf.shell.console/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.shell/org.apache.karaf.shell.core/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.shell/org.apache.karaf.shell.ssh/4.4.8
2026-02-08T02:21:43,459 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.shell/org.apache.karaf.shell.table/4.4.8
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.system/org.apache.karaf.system.core/4.4.8
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.karaf.web/org.apache.karaf.web.core/4.4.8
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.sshd/sshd-osgi/2.15.0
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.sshd/sshd-scp/2.15.0
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.apache.sshd/sshd-sftp/2.15.0
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jdt/ecj/3.26.0
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-client/9.4.57.v20241219
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-continuation/9.4.57.v20241219
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-http/9.4.57.v20241219
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-io/9.4.57.v20241219
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-jaas/9.4.57.v20241219
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-jmx/9.4.57.v20241219
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-security/9.4.57.v20241219
2026-02-08T02:21:43,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-server/9.4.57.v20241219
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-servlet/9.4.57.v20241219
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-servlets/9.4.57.v20241219
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-util/9.4.57.v20241219
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-util-ajax/9.4.57.v20241219
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.eclipse.jetty/jetty-xml/9.4.57.v20241219
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.hk2/hk2-api/2.6.1
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.hk2.external/aopalliance-repackaged/2.6.1
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.hk2/hk2-locator/2.6.1
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.hk2/osgi-resource-locator/1.0.3
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.hk2/hk2-utils/2.6.1
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.jersey.containers/jersey-container-servlet/2.47
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.jersey.containers/jersey-container-servlet-core/2.47
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.jersey.core/jersey-client/2.47
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.jersey.core/jersey-common/2.47
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.jersey.core/jersey-server/2.47
2026-02-08T02:21:43,461 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.jersey.inject/jersey-hk2/2.47
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.glassfish.jersey.media/jersey-media-sse/2.47
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.jline/jline/3.21.0
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.jolokia/jolokia-osgi/1.7.2
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.jspecify/jspecify/1.0.0
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ow2.asm/asm/9.8
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ow2.asm/asm-commons/9.8
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ow2.asm/asm-tree/9.8
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ow2.asm/asm-analysis/9.8
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.ow2.asm/asm-util/9.8
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-authn-api/0.22.3
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-cert/0.22.3
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-encrypt-service/0.22.3
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-encrypt-service-impl/0.22.3
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-filterchain/0.22.3
2026-02-08T02:21:43,462 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-idm-store-h2/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-jetty-auth-log-filter/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-password-service-api/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-password-service-impl/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/repackaged-shiro/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-shiro/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-shiro-api/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa/aaa-tokenauthrealm/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa.web/web-api/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa.web/web-osgi-impl/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa.web/servlet-api/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.aaa.web/servlet-jersey2/0.22.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/blueprint/12.0.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/cds-access-api/12.0.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/cds-access-client/12.0.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/cds-dom-api/12.0.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/cds-mgmt-api/12.0.3
2026-02-08T02:21:43,463 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/eos-dom-akka/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/raft-api/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/raft-journal/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/raft-spi/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/repackaged-pekko/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/sal-akka-raft/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/sal-cluster-admin-api/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/sal-cluster-admin-impl/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/sal-clustering-commons/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/sal-common-util/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/sal-distributed-datastore/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/sal-remoterpc-connector/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.controller/scala3-library/12.0.3
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.ietf/ietf-type-util/1.0.2
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.ietf.model/iana-crypt-hash/1.0.2
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.ietf.model/iana-ssh-encryption-algs/1.0.2
2026-02-08T02:21:43,464 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.ietf.model/iana-ssh-key-exchange-algs/1.0.2
2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |       mvn:org.opendaylight.ietf.model/iana-ssh-mac-algs/1.0.2
2026-02-08T02:21:43,465 | INFO | features-3-thread-1 |
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/iana-ssh-public-key-algs/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/iana-tls-cipher-suite-algs/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6241/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6243/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6470/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6991-ietf-inet-types/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6991-ietf-yang-types/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc7407-ietf-x509-cert-to-name/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc7952/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8040-ietf-restconf/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8040-ietf-restconf-monitoring/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8072/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8341/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8342-ietf-datastores/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8342-ietf-origin/1.0.2 2026-02-08T02:21:43,465 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8343/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8344/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8525/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8526/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8528/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8529/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8639/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.ietf.model/rfc8650/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9640/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9641/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9642/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9643-ietf-tcp-client/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9643-ietf-tcp-common/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9643-ietf-tcp-server/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9644-ietf-ssh-client/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9644-ietf-ssh-common/1.0.2 2026-02-08T02:21:43,466 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9644-ietf-ssh-server/1.0.2 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9645-ietf-tls-client/1.0.2 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.ietf.model/rfc9645-ietf-tls-common/1.0.2 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9645-ietf-tls-server/1.0.2 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-api/7.1.9 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-impl/7.1.9 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-shell/7.1.9 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-api/7.1.9 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-impl/7.1.9 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/infrautils-util/7.1.9 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-dom-adapter/15.0.2 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-util/15.0.2 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-adapter/15.0.2 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.mdsal/mdsal-binding-api/15.0.2 2026-02-08T02:21:43,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-spi/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-common-api/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-api/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-broker/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-schema-osgi/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-spi/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-api/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-common-api/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-dom-api/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-api/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-impl/15.0.2 
2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/general-entity/15.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-topology/2013.10.21.27.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/opendaylight-l2-types/2013.08.27.27.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/yang-ext/2013.09.07.27.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/databind/10.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-dom-api/10.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-api/10.0.2 2026-02-08T02:21:43,468 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-none/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/draft-ietf-restconf-server/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/rfc5277/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/sal-remote/10.0.2 2026-02-08T02:21:43,469 | INFO 
| features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-api/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-common-mdsal/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/odl-device-notification/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-api/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-mdsal-spi/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-nb/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-api/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-jaxrs/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-mdsal/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-spi/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 
- org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/rfc8639-impl/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/sal-remote-impl/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/shaded-sshd/10.0.2 2026-02-08T02:21:43,469 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-api/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-crypto/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-http/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-ssh/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tcp/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tls/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-api/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-none/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.netconf/yanglib-mdsal-writer/10.0.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/bundles-diag/14.1.6 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-api/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-api/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-impl/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/bulk-o-matic/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/device-ownership-service/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/forwardingrules-manager/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/lldp-speaker/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.openflowplugin.applications/of-switch-config-pusher/0.21.2 2026-02-08T02:21:43,470 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/reconciliation-framework/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-lldp-discovery/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-manager/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-blueprint-config/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-common/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-api/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-onf/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-impl/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.libraries/liblldp/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.openflowplugin.model/model-flow-base/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-service/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-statistics/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-inventory/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-topology/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-blueprint-config/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-api/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-impl/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-spi/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-util/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.openflowplugin/srm-api/0.21.2 2026-02-08T02:21:43,471 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-impl/0.21.2 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-shell/0.21.2 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-api/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-dynamic/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-osgi/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-spi/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-generator/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-loader/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-model/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-reflect/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.yangtools/binding-runtime-api/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-osgi/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-spi/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-spec/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/codegen-extensions/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/concepts/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-model-api/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-parser-support/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-model-api/14.0.20 2026-02-08T02:21:43,472 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-parser-support/14.0.20 2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-model-api/14.0.20 2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.yangtools/rfc6241-parser-support/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-model-api/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-parser-support/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-model-api/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-parser-support/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-model-api/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-parser-support/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-model-api/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-parser-support/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-model-api/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-parser-support/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-model-api/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-parser-support/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-model-api/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-parser-support/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/util/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common-netty/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-api/14.0.20
2026-02-08T02:21:43,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-binfmt/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-gson/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-xml/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-impl/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-spi/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-transform/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-api/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-ri/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-spi/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-util/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-ir/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-api/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-export/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-ri/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-spi/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-util/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-api/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-reactor/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-rfc7950/14.0.20
2026-02-08T02:21:43,474 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-spi/14.0.20
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-api/14.0.20
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-fs/14.0.20
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-spi/14.0.20
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-api/14.0.20
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-impl/14.0.20
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-war/2.6.17/jar/uber
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-api/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-el2/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-servlet31/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-war/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-whiteboard/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jetty/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jsp/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-runtime/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-spi/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-tomcat-common/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-websocket/8.0.33
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.component/1.5.1
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.owasp.encoder/encoder/1.3.1
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.reactivestreams/reactive-streams/1.0.4
2026-02-08T02:21:43,475 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.codehaus.woodstox/stax2-api/4.2.2
2026-02-08T02:21:43,476 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:tech.pantheon.triemap/triemap/1.3.2
2026-02-08T02:21:43,476 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:org.lmdbjava/lmdbjava/0.9.1
2026-02-08T02:21:43,476 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Stopping bundles:
2026-02-08T02:21:43,476 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.pool.common/1.5.7
2026-02-08T02:21:43,477 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.jasypt/1.9.3.1
2026-02-08T02:21:43,478 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3
2026-02-08T02:21:43,478 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.transaction-api/1.2.0
2026-02-08T02:21:43,478 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.enterprise.cdi-api/2.0.0.SP1
2026-02-08T02:21:43,478 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3
2026-02-08T02:21:43,478 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.config/1.5.7
2026-02-08T02:21:43,479 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Uninstalling bundles:
2026-02-08T02:21:43,479 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3
2026-02-08T02:21:43,480 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Installing bundles:
2026-02-08T02:21:43,481 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.checkerframework/checker-qual/3.51.1
2026-02-08T02:21:43,482 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.code.gson/gson/2.13.2
2026-02-08T02:21:43,483 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/guava/33.5.0-jre
2026-02-08T02:21:43,488 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.google.guava/failureaccess/1.0.3
2026-02-08T02:21:43,488 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.googlecode.json-simple/json-simple/1.1.1
2026-02-08T02:21:43,489 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.h2database/h2/2.3.232
2026-02-08T02:21:43,493 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.rabbitmq/amqp-client/5.26.0
2026-02-08T02:21:43,495 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/config/1.4.5
2026-02-08T02:21:43,496 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:com.typesafe/ssl-config-core_3/0.6.1
2026-02-08T02:21:43,498 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-annotations/1.45.1
2026-02-08T02:21:43,498 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-client/1.45.1
2026-02-08T02:21:43,499 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.aeron/aeron-driver/1.45.1
2026-02-08T02:21:43,501 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-core/4.2.37
2026-02-08T02:21:43,501 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-graphite/4.2.37
2026-02-08T02:21:43,502 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-healthchecks/4.2.37
2026-02-08T02:21:43,503 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jmx/4.2.37
2026-02-08T02:21:43,503 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.dropwizard.metrics/metrics-jvm/4.2.37
2026-02-08T02:21:43,504 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-buffer/4.2.7.Final
2026-02-08T02:21:43,505 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-base/4.2.7.Final
2026-02-08T02:21:43,506 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-compression/4.2.7.Final
2026-02-08T02:21:43,507 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http/4.2.7.Final
2026-02-08T02:21:43,509 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-codec-http2/4.2.7.Final
2026-02-08T02:21:43,510 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-common/4.2.7.Final
2026-02-08T02:21:43,512 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-handler/4.2.7.Final
2026-02-08T02:21:43,514 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-resolver/4.2.7.Final
2026-02-08T02:21:43,515 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport/4.2.7.Final
2026-02-08T02:21:43,516 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-classes-epoll/4.2.7.Final
2026-02-08T02:21:43,518 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-epoll/4.2.7.Final/jar/linux-x86_64
2026-02-08T02:21:43,519 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:io.netty/netty-transport-native-unix-common/4.2.7.Final
2026-02-08T02:21:43,520 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.activation/jakarta.activation-api/1.2.2
2026-02-08T02:21:43,521 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.annotation/jakarta.annotation-api/1.3.5
2026-02-08T02:21:43,521 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.servlet/jakarta.servlet-api/4.0.4
2026-02-08T02:21:43,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.validation/jakarta.validation-api/2.0.2
2026-02-08T02:21:43,523 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.ws.rs/jakarta.ws.rs-api/2.1.6
2026-02-08T02:21:43,524 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.javassist/javassist/3.30.2-GA
2026-02-08T02:21:43,526 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:javax.servlet/javax.servlet-api/3.1.0
2026-02-08T02:21:43,527 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:jakarta.websocket/jakarta.websocket-api/1.1.2
2026-02-08T02:21:43,527 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/karaf.branding/14.1.6
2026-02-08T02:21:43,528 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.lz4/lz4-java/1.8.0
2026-02-08T02:21:43,529 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:net.bytebuddy/byte-buddy/1.17.8
2026-02-08T02:21:43,541 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.agrona/agrona/1.22.0
2026-02-08T02:21:43,542 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.antlr/antlr4-runtime/4.13.2
2026-02-08T02:21:43,544 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.api/1.0.1
2026-02-08T02:21:43,544 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.cm/1.3.2
2026-02-08T02:21:43,545 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.blueprint/org.apache.aries.blueprint.core/1.10.3
2026-02-08T02:21:43,547 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.api/1.1.5
2026-02-08T02:21:43,547 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.api/1.2.0
2026-02-08T02:21:43,548 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.blueprint.core/1.2.0
2026-02-08T02:21:43,549 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.core/1.1.8
2026-02-08T02:21:43,550 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.jmx/org.apache.aries.jmx.whiteboard/1.2.0
2026-02-08T02:21:43,551 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.proxy/org.apache.aries.proxy/1.1.14
2026-02-08T02:21:43,552 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries.quiesce/org.apache.aries.quiesce.api/1.0.0
2026-02-08T02:21:43,570 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.aries/org.apache.aries.util/1.1.3
2026-02-08T02:21:43,572 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-collections/commons-collections/3.2.2
2026-02-08T02:21:43,574 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-beanutils/commons-beanutils/1.11.0
2026-02-08T02:21:43,576 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:commons-codec/commons-codec/1.19.0
2026-02-08T02:21:43,578 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-lang3/3.19.0
2026-02-08T02:21:43,580 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.commons/commons-text/1.14.0
2026-02-08T02:21:43,581 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.felix/org.apache.felix.scr/2.2.6
2026-02-08T02:21:43,583 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.geronimo.specs/geronimo-atinject_1.0_spec/1.2
2026-02-08T02:21:43,584 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.blueprintstate/4.4.8
2026-02-08T02:21:43,585 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.bundle/org.apache.karaf.bundle.core/4.4.8
2026-02-08T02:21:43,586 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.config/org.apache.karaf.config.command/4.4.8
2026-02-08T02:21:43,587 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.blueprint/4.4.8
2026-02-08T02:21:43,588 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.features/4.4.8
2026-02-08T02:21:43,589 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.kar/4.4.8
2026-02-08T02:21:43,589 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.deployer/org.apache.karaf.deployer.wrap/4.4.8
2026-02-08T02:21:43,590 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.boot/4.4.8
2026-02-08T02:21:43,591 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.diagnostic/org.apache.karaf.diagnostic.core/4.4.8
2026-02-08T02:21:43,592 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.features/org.apache.karaf.features.command/4.4.8
2026-02-08T02:21:43,592 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.http/org.apache.karaf.http.core/4.4.8
2026-02-08T02:21:43,595 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.instance/org.apache.karaf.instance.core/4.4.8
2026-02-08T02:21:43,596 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.command/4.4.8
2026-02-08T02:21:43,597 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.config/4.4.8
2026-02-08T02:21:43,597 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jaas/org.apache.karaf.jaas.modules/4.4.8
2026-02-08T02:21:43,600 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.jdbc/org.apache.karaf.jdbc.core/4.4.8
2026-02-08T02:21:43,601 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.kar/org.apache.karaf.kar.core/4.4.8
2026-02-08T02:21:43,602 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.log/org.apache.karaf.log.core/4.4.8
2026-02-08T02:21:43,602 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.management/org.apache.karaf.management.server/4.4.8
2026-02-08T02:21:43,603 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.package/org.apache.karaf.package.core/4.4.8
2026-02-08T02:21:43,604 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.management/4.4.8
2026-02-08T02:21:43,605 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.scr/org.apache.karaf.scr.state/4.4.8
2026-02-08T02:21:43,605 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.service/org.apache.karaf.service.core/4.4.8
2026-02-08T02:21:43,606 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.commands/4.4.8
2026-02-08T02:21:43,607 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.console/4.4.8
2026-02-08T02:21:43,609 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.core/4.4.8
2026-02-08T02:21:43,611 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.ssh/4.4.8
2026-02-08T02:21:43,612 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.shell/org.apache.karaf.shell.table/4.4.8
2026-02-08T02:21:43,612 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.system/org.apache.karaf.system.core/4.4.8
2026-02-08T02:21:43,613 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.karaf.web/org.apache.karaf.web.core/4.4.8
2026-02-08T02:21:43,614 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-osgi/2.15.0
2026-02-08T02:21:43,618 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-scp/2.15.0
2026-02-08T02:21:43,620 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.apache.sshd/sshd-sftp/2.15.0
2026-02-08T02:21:43,621 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jdt/ecj/3.26.0
2026-02-08T02:21:43,626 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-client/9.4.57.v20241219
2026-02-08T02:21:43,627 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-continuation/9.4.57.v20241219
2026-02-08T02:21:43,628 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-http/9.4.57.v20241219
2026-02-08T02:21:43,629 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-io/9.4.57.v20241219
2026-02-08T02:21:43,630 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jaas/9.4.57.v20241219
2026-02-08T02:21:43,630 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-jmx/9.4.57.v20241219
2026-02-08T02:21:43,631 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-security/9.4.57.v20241219
2026-02-08T02:21:43,632 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-server/9.4.57.v20241219
2026-02-08T02:21:43,634 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlet/9.4.57.v20241219
2026-02-08T02:21:43,635 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-servlets/9.4.57.v20241219
2026-02-08T02:21:43,636 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util/9.4.57.v20241219
2026-02-08T02:21:43,637 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-util-ajax/9.4.57.v20241219
2026-02-08T02:21:43,638 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.eclipse.jetty/jetty-xml/9.4.57.v20241219
2026-02-08T02:21:43,639 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-api/2.6.1
2026-02-08T02:21:43,639 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2.external/aopalliance-repackaged/2.6.1
2026-02-08T02:21:43,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-locator/2.6.1
2026-02-08T02:21:43,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/osgi-resource-locator/1.0.3
2026-02-08T02:21:43,642 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.hk2/hk2-utils/2.6.1
2026-02-08T02:21:43,643 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet/2.47
2026-02-08T02:21:43,644 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.containers/jersey-container-servlet-core/2.47
2026-02-08T02:21:43,645 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-client/2.47
2026-02-08T02:21:43,646 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-common/2.47
2026-02-08T02:21:43,649 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.core/jersey-server/2.47
2026-02-08T02:21:43,651 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.inject/jersey-hk2/2.47
2026-02-08T02:21:43,652 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.glassfish.jersey.media/jersey-media-sse/2.47
2026-02-08T02:21:43,653 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jline/jline/3.21.0
2026-02-08T02:21:43,655 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jolokia/jolokia-osgi/1.7.2
2026-02-08T02:21:43,656 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.jspecify/jspecify/1.0.0
2026-02-08T02:21:43,657 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm/9.8
2026-02-08T02:21:43,658 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-commons/9.8
2026-02-08T02:21:43,659 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-tree/9.8
2026-02-08T02:21:43,659 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-analysis/9.8
2026-02-08T02:21:43,660 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ow2.asm/asm-util/9.8
2026-02-08T02:21:43,661 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-authn-api/0.22.3
2026-02-08T02:21:43,661 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-cert/0.22.3
2026-02-08T02:21:43,662 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service/0.22.3
2026-02-08T02:21:43,663 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-encrypt-service-impl/0.22.3
2026-02-08T02:21:43,664 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-filterchain/0.22.3
2026-02-08T02:21:43,665 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-idm-store-h2/0.22.3
2026-02-08T02:21:43,665 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-jetty-auth-log-filter/0.22.3
2026-02-08T02:21:43,666 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-api/0.22.3
2026-02-08T02:21:43,667 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-password-service-impl/0.22.3
2026-02-08T02:21:43,667 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/repackaged-shiro/0.22.3
2026-02-08T02:21:43,670 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro/0.22.3
2026-02-08T02:21:43,671 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-shiro-api/0.22.3
2026-02-08T02:21:43,673 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa/aaa-tokenauthrealm/0.22.3
2026-02-08T02:21:43,673 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-api/0.22.3
2026-02-08T02:21:43,674 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/web-osgi-impl/0.22.3
2026-02-08T02:21:43,675 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-api/0.22.3
2026-02-08T02:21:43,675 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.aaa.web/servlet-jersey2/0.22.3
2026-02-08T02:21:43,676 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/blueprint/12.0.3
2026-02-08T02:21:43,677 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-api/12.0.3
2026-02-08T02:21:43,678 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-access-client/12.0.3
2026-02-08T02:21:43,679 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-dom-api/12.0.3
2026-02-08T02:21:43,679 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/cds-mgmt-api/12.0.3
2026-02-08T02:21:43,680 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/eos-dom-akka/12.0.3
2026-02-08T02:21:43,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-api/12.0.3
2026-02-08T02:21:43,682 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-journal/12.0.3
2026-02-08T02:21:43,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/raft-spi/12.0.3
2026-02-08T02:21:43,684 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/repackaged-pekko/12.0.3
2026-02-08T02:21:43,711 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-akka-raft/12.0.3
2026-02-08T02:21:43,712 | INFO | features-3-thread-1 | FeaturesServiceImpl
| 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-api/12.0.3 2026-02-08T02:21:43,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-cluster-admin-impl/12.0.3 2026-02-08T02:21:43,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-clustering-commons/12.0.3 2026-02-08T02:21:43,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-common-util/12.0.3 2026-02-08T02:21:43,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-distributed-datastore/12.0.3 2026-02-08T02:21:43,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/sal-remoterpc-connector/12.0.3 2026-02-08T02:21:43,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.controller/scala3-library/12.0.3 2026-02-08T02:21:43,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf/ietf-type-util/1.0.2 2026-02-08T02:21:43,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/iana-crypt-hash/1.0.2 2026-02-08T02:21:43,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/iana-ssh-encryption-algs/1.0.2 2026-02-08T02:21:43,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/iana-ssh-key-exchange-algs/1.0.2 2026-02-08T02:21:43,732 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/iana-ssh-mac-algs/1.0.2 2026-02-08T02:21:43,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/iana-ssh-public-key-algs/1.0.2 2026-02-08T02:21:43,733 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/iana-tls-cipher-suite-algs/1.0.2 2026-02-08T02:21:43,734 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6241/1.0.2 2026-02-08T02:21:43,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6243/1.0.2 2026-02-08T02:21:43,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6470/1.0.2 2026-02-08T02:21:43,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6991-ietf-inet-types/1.0.2 2026-02-08T02:21:43,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc6991-ietf-yang-types/1.0.2 2026-02-08T02:21:43,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc7407-ietf-x509-cert-to-name/1.0.2 2026-02-08T02:21:43,739 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc7952/1.0.2 2026-02-08T02:21:43,740 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8040-ietf-restconf/1.0.2 2026-02-08T02:21:43,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 
- org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8040-ietf-restconf-monitoring/1.0.2 2026-02-08T02:21:43,742 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8072/1.0.2 2026-02-08T02:21:43,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8341/1.0.2 2026-02-08T02:21:43,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8342-ietf-datastores/1.0.2 2026-02-08T02:21:43,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8342-ietf-origin/1.0.2 2026-02-08T02:21:43,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8343/1.0.2 2026-02-08T02:21:43,746 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8344/1.0.2 2026-02-08T02:21:43,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8525/1.0.2 2026-02-08T02:21:43,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8526/1.0.2 2026-02-08T02:21:43,750 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8528/1.0.2 2026-02-08T02:21:43,751 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8529/1.0.2 2026-02-08T02:21:43,752 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.ietf.model/rfc8639/1.0.2 2026-02-08T02:21:43,754 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc8650/1.0.2 2026-02-08T02:21:43,754 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9640/1.0.2 2026-02-08T02:21:43,756 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9641/1.0.2 2026-02-08T02:21:43,757 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9642/1.0.2 2026-02-08T02:21:43,758 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9643-ietf-tcp-client/1.0.2 2026-02-08T02:21:43,760 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9643-ietf-tcp-common/1.0.2 2026-02-08T02:21:43,760 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9643-ietf-tcp-server/1.0.2 2026-02-08T02:21:43,762 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9644-ietf-ssh-client/1.0.2 2026-02-08T02:21:43,764 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9644-ietf-ssh-common/1.0.2 2026-02-08T02:21:43,766 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9644-ietf-ssh-server/1.0.2 2026-02-08T02:21:43,769 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.ietf.model/rfc9645-ietf-tls-client/1.0.2 2026-02-08T02:21:43,771 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9645-ietf-tls-common/1.0.2 2026-02-08T02:21:43,778 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.ietf.model/rfc9645-ietf-tls-server/1.0.2 2026-02-08T02:21:43,780 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-api/7.1.9 2026-02-08T02:21:43,780 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-impl/7.1.9 2026-02-08T02:21:43,781 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/diagstatus-shell/7.1.9 2026-02-08T02:21:43,782 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-api/7.1.9 2026-02-08T02:21:43,782 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/ready-impl/7.1.9 2026-02-08T02:21:43,783 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.infrautils/infrautils-util/7.1.9 2026-02-08T02:21:43,784 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-dom-adapter/15.0.2 2026-02-08T02:21:43,785 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-util/15.0.2 2026-02-08T02:21:43,786 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.mdsal/mdsal-eos-binding-adapter/15.0.2 2026-02-08T02:21:43,787 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-api/15.0.2 2026-02-08T02:21:43,787 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-binding-spi/15.0.2 2026-02-08T02:21:43,788 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-common-api/15.0.2 2026-02-08T02:21:43,789 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-api/15.0.2 2026-02-08T02:21:43,790 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-broker/15.0.2 2026-02-08T02:21:43,791 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-schema-osgi/15.0.2 2026-02-08T02:21:43,791 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-dom-spi/15.0.2 2026-02-08T02:21:43,792 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-binding-api/15.0.2 2026-02-08T02:21:43,793 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-common-api/15.0.2 2026-02-08T02:21:43,793 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-eos-dom-api/15.0.2 2026-02-08T02:21:43,794 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-api/15.0.2 
2026-02-08T02:21:43,795 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal/mdsal-singleton-impl/15.0.2 2026-02-08T02:21:43,795 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/general-entity/15.0.2 2026-02-08T02:21:43,796 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/ietf-topology/2013.10.21.27.2 2026-02-08T02:21:43,797 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/opendaylight-l2-types/2013.08.27.27.2 2026-02-08T02:21:43,798 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.mdsal.model/yang-ext/2013.09.07.27.2 2026-02-08T02:21:43,798 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/databind/10.0.2 2026-02-08T02:21:43,799 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-dom-api/10.0.2 2026-02-08T02:21:43,800 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-api/10.0.2 2026-02-08T02:21:43,800 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/keystore-none/10.0.2 2026-02-08T02:21:43,801 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/draft-ietf-restconf-server/10.0.2 2026-02-08T02:21:43,803 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/rfc5277/10.0.2 2026-02-08T02:21:43,803 | 
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf.model/sal-remote/10.0.2 2026-02-08T02:21:43,804 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-api/10.0.2 2026-02-08T02:21:43,805 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/netconf-common-mdsal/10.0.2 2026-02-08T02:21:43,806 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/odl-device-notification/10.0.2 2026-02-08T02:21:43,807 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-api/10.0.2 2026-02-08T02:21:43,807 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-mdsal-spi/10.0.2 2026-02-08T02:21:43,808 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-nb/10.0.2 2026-02-08T02:21:43,809 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server/10.0.2 2026-02-08T02:21:43,810 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-api/10.0.2 2026-02-08T02:21:43,811 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-jaxrs/10.0.2 2026-02-08T02:21:43,812 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-mdsal/10.0.2 2026-02-08T02:21:43,813 | INFO | features-3-thread-1 | FeaturesServiceImpl | 
18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/restconf-server-spi/10.0.2 2026-02-08T02:21:43,814 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/rfc8639-impl/10.0.2 2026-02-08T02:21:43,815 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/sal-remote-impl/10.0.2 2026-02-08T02:21:43,816 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/shaded-sshd/10.0.2 2026-02-08T02:21:43,821 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-api/10.0.2 2026-02-08T02:21:43,821 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-crypto/10.0.2 2026-02-08T02:21:43,822 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-http/10.0.2 2026-02-08T02:21:43,825 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-ssh/10.0.2 2026-02-08T02:21:43,826 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tcp/10.0.2 2026-02-08T02:21:43,826 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/transport-tls/10.0.2 2026-02-08T02:21:43,828 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/truststore-api/10.0.2 2026-02-08T02:21:43,828 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
mvn:org.opendaylight.netconf/truststore-none/10.0.2 2026-02-08T02:21:43,829 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.netconf/yanglib-mdsal-writer/10.0.2 2026-02-08T02:21:43,829 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.odlparent/bundles-diag/14.1.6 2026-02-08T02:21:43,830 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin/0.21.2 2026-02-08T02:21:43,833 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-api/0.21.2 2026-02-08T02:21:43,835 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-api/0.21.2 2026-02-08T02:21:43,835 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/arbitratorreconciliation-impl/0.21.2 2026-02-08T02:21:43,836 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/bulk-o-matic/0.21.2 2026-02-08T02:21:43,837 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/device-ownership-service/0.21.2 2026-02-08T02:21:43,838 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/forwardingrules-manager/0.21.2 2026-02-08T02:21:43,840 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/lldp-speaker/0.21.2 
2026-02-08T02:21:43,840 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/of-switch-config-pusher/0.21.2 2026-02-08T02:21:43,841 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/reconciliation-framework/0.21.2 2026-02-08T02:21:43,842 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-lldp-discovery/0.21.2 2026-02-08T02:21:43,843 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.applications/topology-manager/0.21.2 2026-02-08T02:21:43,843 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-blueprint-config/0.21.2 2026-02-08T02:21:43,844 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-common/0.21.2 2026-02-08T02:21:43,845 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-api/0.21.2 2026-02-08T02:21:43,848 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-extension-onf/0.21.2 2026-02-08T02:21:43,849 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/openflowplugin-impl/0.21.2 2026-02-08T02:21:43,853 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.libraries/liblldp/0.21.2 2026-02-08T02:21:43,854 | INFO | 
features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-base/0.21.2 2026-02-08T02:21:43,860 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-service/0.21.2 2026-02-08T02:21:43,865 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-flow-statistics/0.21.2 2026-02-08T02:21:43,868 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-inventory/0.21.2 2026-02-08T02:21:43,869 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.model/model-topology/0.21.2 2026-02-08T02:21:43,869 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-blueprint-config/0.21.2 2026-02-08T02:21:43,870 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-api/0.21.2 2026-02-08T02:21:43,879 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-impl/0.21.2 2026-02-08T02:21:43,882 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflow-protocol-spi/0.21.2 2026-02-08T02:21:43,883 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin.openflowjava/openflowjava-util/0.21.2 2026-02-08T02:21:43,883 | INFO | features-3-thread-1 | FeaturesServiceImpl 
| 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-api/0.21.2 2026-02-08T02:21:43,884 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-impl/0.21.2 2026-02-08T02:21:43,885 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.openflowplugin/srm-shell/0.21.2 2026-02-08T02:21:43,886 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-api/14.0.20 2026-02-08T02:21:43,886 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-dynamic/14.0.20 2026-02-08T02:21:43,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-osgi/14.0.20 2026-02-08T02:21:43,888 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-data-codec-spi/14.0.20 2026-02-08T02:21:43,889 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-generator/14.0.20 2026-02-08T02:21:43,890 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-loader/14.0.20 2026-02-08T02:21:43,890 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-model/14.0.20 2026-02-08T02:21:43,891 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-reflect/14.0.20 2026-02-08T02:21:43,892 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-api/14.0.20 2026-02-08T02:21:43,892 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-osgi/14.0.20 2026-02-08T02:21:43,893 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-runtime-spi/14.0.20 2026-02-08T02:21:43,894 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/binding-spec/14.0.20 2026-02-08T02:21:43,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/codegen-extensions/14.0.20 2026-02-08T02:21:43,895 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/concepts/14.0.20 2026-02-08T02:21:43,896 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-model-api/14.0.20 2026-02-08T02:21:43,897 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/odlext-parser-support/14.0.20 2026-02-08T02:21:43,897 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-model-api/14.0.20 2026-02-08T02:21:43,898 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/openconfig-parser-support/14.0.20 2026-02-08T02:21:43,899 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-model-api/14.0.20 2026-02-08T02:21:43,899 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6241-parser-support/14.0.20 2026-02-08T02:21:43,900 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-model-api/14.0.20 2026-02-08T02:21:43,900 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6536-parser-support/14.0.20 2026-02-08T02:21:43,901 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-model-api/14.0.20 2026-02-08T02:21:43,902 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc6643-parser-support/14.0.20 2026-02-08T02:21:43,902 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-model-api/14.0.20 2026-02-08T02:21:43,903 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc7952-parser-support/14.0.20 2026-02-08T02:21:43,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-model-api/14.0.20 2026-02-08T02:21:43,904 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8040-parser-support/14.0.20 2026-02-08T02:21:43,905 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-model-api/14.0.20 2026-02-08T02:21:43,906 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8528-parser-support/14.0.20 2026-02-08T02:21:43,906 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-model-api/14.0.20
2026-02-08T02:21:43,907 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8639-parser-support/14.0.20
2026-02-08T02:21:43,907 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-model-api/14.0.20
2026-02-08T02:21:43,908 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/rfc8819-parser-support/14.0.20
2026-02-08T02:21:43,909 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/util/14.0.20
2026-02-08T02:21:43,910 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common/14.0.20
2026-02-08T02:21:43,910 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-common-netty/14.0.20
2026-02-08T02:21:43,911 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-api/14.0.20
2026-02-08T02:21:43,912 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-binfmt/14.0.20
2026-02-08T02:21:43,913 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-gson/14.0.20
2026-02-08T02:21:43,913 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-codec-xml/14.0.20
2026-02-08T02:21:43,914 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-impl/14.0.20
2026-02-08T02:21:43,915 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-spi/14.0.20
2026-02-08T02:21:43,915 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-transform/14.0.20
2026-02-08T02:21:43,916 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-api/14.0.20
2026-02-08T02:21:43,917 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-ri/14.0.20
2026-02-08T02:21:43,918 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-tree-spi/14.0.20
2026-02-08T02:21:43,919 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-data-util/14.0.20
2026-02-08T02:21:43,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-ir/14.0.20
2026-02-08T02:21:43,920 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-api/14.0.20
2026-02-08T02:21:43,921 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-export/14.0.20
2026-02-08T02:21:43,922 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-ri/14.0.20
2026-02-08T02:21:43,923 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-spi/14.0.20
2026-02-08T02:21:43,924 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-model-util/14.0.20
2026-02-08T02:21:43,924 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-api/14.0.20
2026-02-08T02:21:43,925 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-impl/14.0.20
2026-02-08T02:21:43,926 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-reactor/14.0.20
2026-02-08T02:21:43,927 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-rfc7950/14.0.20
2026-02-08T02:21:43,928 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-parser-spi/14.0.20
2026-02-08T02:21:43,929 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-api/14.0.20
2026-02-08T02:21:43,930 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-fs/14.0.20
2026-02-08T02:21:43,930 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-repo-spi/14.0.20
2026-02-08T02:21:43,931 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-api/14.0.20
2026-02-08T02:21:43,932 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.opendaylight.yangtools/yang-xpath-impl/14.0.20
2026-02-08T02:21:43,932 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.url/pax-url-war/2.6.17/jar/uber
2026-02-08T02:21:43,935 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-api/8.0.33
2026-02-08T02:21:43,936 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-el2/8.0.33
2026-02-08T02:21:43,937 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-compatibility-servlet31/8.0.33
2026-02-08T02:21:43,937 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-war/8.0.33
2026-02-08T02:21:43,938 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-extender-whiteboard/8.0.33
2026-02-08T02:21:43,939 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jetty/8.0.33
2026-02-08T02:21:43,939 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-jsp/8.0.33
2026-02-08T02:21:43,942 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-runtime/8.0.33
2026-02-08T02:21:43,943 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-spi/8.0.33
2026-02-08T02:21:43,944 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-tomcat-common/8.0.33
2026-02-08T02:21:43,945 | INFO | features-3-thread-1 |
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.ops4j.pax.web/pax-web-websocket/8.0.33
2026-02-08T02:21:43,946 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.osgi/org.osgi.service.component/1.5.1
2026-02-08T02:21:43,947 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.owasp.encoder/encoder/1.3.1
2026-02-08T02:21:43,947 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.reactivestreams/reactive-streams/1.0.4
2026-02-08T02:21:43,948 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:org.codehaus.woodstox/stax2-api/4.2.2
2026-02-08T02:21:43,948 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | mvn:tech.pantheon.triemap/triemap/1.3.2
2026-02-08T02:21:43,949 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap:mvn:org.lmdbjava/lmdbjava/0.9.1
2026-02-08T02:21:43,963 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT//etc/org.jolokia.osgi.cfg
2026-02-08T02:21:43,968 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/configuration/factory/pekko.conf
2026-02-08T02:21:43,969 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.controller.cluster.datastore.cfg
2026-02-08T02:21:43,969 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/jetty-web.xml
2026-02-08T02:21:43,970 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.openflowplugin.cfg
2026-02-08T02:21:43,970 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/opendaylight/datastore/initial/config/default-openflow-connection-config.xml
2026-02-08T02:21:43,970 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/opendaylight/datastore/initial/config/legacy-openflow-connection-config.xml
2026-02-08T02:21:43,970 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/opendaylight/datastore/initial/config/aaa-cert-config.xml
2026-02-08T02:21:43,972 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/opendaylight/datastore/initial/config/aaa-password-service-config.xml
2026-02-08T02:21:43,974 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.restconf.nb.rfc8040.cfg
2026-02-08T02:21:43,974 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/opendaylight/datastore/initial/config/aaa-app-config.xml
2026-02-08T02:21:43,975 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/etc/opendaylight/datastore/initial/config/aaa-datastore-config.xml
2026-02-08T02:21:43,975 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT/bin/idmtool
2026-02-08T02:21:43,975 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Creating configuration file /tmp/karaf-0.23.1-SNAPSHOT//etc/org.opendaylight.aaa.filterchain.cfg
2026-02-08T02:21:43,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Refreshing bundles:
2026-02-08T02:21:43,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3 (Attached fragments changed: [org.ops4j.pax.web.pax-web-compatibility-el2/8.0.33])
2026-02-08T02:21:43,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.enterprise.cdi-api/2.0.0.SP1 (Wired to javax.el-api/3.0.3 which is being refreshed)
2026-02-08T02:21:43,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.transaction-api/1.2.0 (Wired to javax.enterprise.cdi-api/2.0.0.SP1 which is being refreshed)
2026-02-08T02:21:43,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.jasypt/1.9.3.1 (Should be wired to: jakarta.servlet-api/4.0.0 (through [org.apache.servicemix.bundles.jasypt/1.9.3.1] osgi.wiring.package; resolution:=optional; filter:="(osgi.wiring.package=javax.servlet)"))
2026-02-08T02:21:43,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.servicemix.bundles.javax-inject/1.0.0.3 (Bundle will be uninstalled)
2026-02-08T02:21:43,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.config/1.5.7 (Wired to org.apache.servicemix.bundles.jasypt/1.9.3.1 which is being refreshed)
2026-02-08T02:21:43,976 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.jdbc.pool.common/1.5.7 (Wired to javax.transaction-api/1.2.0 which is being refreshed)
2026-02-08T02:21:44,519 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Starting bundles:
2026-02-08T02:21:44,520 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm/9.8.0
2026-02-08T02:21:44,521 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.tree/9.8.0
2026-02-08T02:21:44,521 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.tree.analysis/9.8.0
2026-02-08T02:21:44,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.util/9.8.0
2026-02-08T02:21:44,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.objectweb.asm.commons/9.8.0
2026-02-08T02:21:44,522 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.proxy/1.1.14
2026-02-08T02:21:44,526 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.quiesce.api/1.0.0
2026-02-08T02:21:44,526 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.blueprint.api/1.0.1
2026-02-08T02:21:44,526 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.blueprint.core/1.10.3
2026-02-08T02:21:44,685 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.core/1.10.3 has been started
2026-02-08T02:21:44,686 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core -
4.4.8 | org.apache.aries.blueprint.cm/1.3.2
2026-02-08T02:21:44,696 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.cm/1.3.2 has been started
2026-02-08T02:21:44,697 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.wrap/4.4.8
2026-02-08T02:21:44,702 | INFO | fileinstall-/tmp/karaf-0.23.1-SNAPSHOT/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.restconf.nb.rfc8040} from /tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.restconf.nb.rfc8040.cfg
2026-02-08T02:21:44,703 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.blueprint/4.4.8
2026-02-08T02:21:44,705 | INFO | fileinstall-/tmp/karaf-0.23.1-SNAPSHOT/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.aaa.filterchain} from /tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.aaa.filterchain.cfg
2026-02-08T02:21:44,707 | INFO | fileinstall-/tmp/karaf-0.23.1-SNAPSHOT/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.jolokia.osgi} from /tmp/karaf-0.23.1-SNAPSHOT/etc/org.jolokia.osgi.cfg
2026-02-08T02:21:44,708 | INFO | fileinstall-/tmp/karaf-0.23.1-SNAPSHOT/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.controller.cluster.datastore} from /tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.controller.cluster.datastore.cfg
2026-02-08T02:21:44,709 | INFO | fileinstall-/tmp/karaf-0.23.1-SNAPSHOT/etc | fileinstall | 6 - org.apache.felix.fileinstall - 3.7.4 | Creating configuration {org.opendaylight.openflowplugin} from /tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.openflowplugin.cfg
2026-02-08T02:21:44,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.kar/4.4.8
2026-02-08T02:21:44,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.deployer.features/4.4.8
2026-02-08T02:21:44,743 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.osgi/2.15.0
2026-02-08T02:21:44,744 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.scp/2.15.0
2026-02-08T02:21:44,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.sshd.sftp/2.15.0
2026-02-08T02:21:44,745 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jline/3.21.0
2026-02-08T02:21:44,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.core/4.4.8
2026-02-08T02:21:44,771 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.deployer.kar/4.4.8
2026-02-08T02:21:44,772 | INFO | features-3-thread-1 | Activator | 121 - org.apache.karaf.shell.core - 4.4.8 | Not starting local console. To activate set karaf.startLocalConsole=true
2026-02-08T02:21:44,804 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.karaf.shell.core/4.4.8 has been started
2026-02-08T02:21:44,804 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.bundle.core/4.4.8
2026-02-08T02:21:44,827 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.bundle.core/4.4.8
2026-02-08T02:21:44,828 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.osgi.service.component/1.5.1.202212101352
2026-02-08T02:21:44,829 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.felix.scr/2.2.6
2026-02-08T02:21:44,837 | INFO | features-3-thread-1 | ROOT | 94 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (94) Starting with globalExtender setting: false
2026-02-08T02:21:44,841 | INFO | features-3-thread-1 | ROOT | 94 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (94) Version = 2.2.6
2026-02-08T02:21:44,850 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.scr.state/4.4.8
2026-02-08T02:21:44,882 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.servlet-api/4.0.0
2026-02-08T02:21:44,883 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-api/8.0.33
2026-02-08T02:21:44,884 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.websocket-api/1.1.2
2026-02-08T02:21:44,885 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-spi/8.0.33
2026-02-08T02:21:44,885 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.util/9.4.57.v20241219
2026-02-08T02:21:44,886 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.jmx/9.4.57.v20241219
2026-02-08T02:21:44,886 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.io/9.4.57.v20241219
2026-02-08T02:21:44,886 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.http/9.4.57.v20241219
2026-02-08T02:21:44,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.server/9.4.57.v20241219
2026-02-08T02:21:44,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.security/9.4.57.v20241219
2026-02-08T02:21:44,887 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.util.ajax/9.4.57.v20241219
2026-02-08T02:21:44,888 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.servlet/9.4.57.v20241219
2026-02-08T02:21:44,888 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.xml/9.4.57.v20241219
2026-02-08T02:21:44,891 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.jaas/9.4.57.v20241219
2026-02-08T02:21:44,891 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.servlets/9.4.57.v20241219
2026-02-08T02:21:44,892 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 |
org.ops4j.pax.web.pax-web-jetty/8.0.33
2026-02-08T02:21:44,904 | INFO | features-3-thread-1 | log | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Logging initialized @8063ms to org.eclipse.jetty.util.log.Slf4jLog
2026-02-08T02:21:44,913 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.el-api/3.0.3
2026-02-08T02:21:44,913 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jdt.core.compiler.batch/3.26.0.v20210609-0549
2026-02-08T02:21:44,915 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-jsp/8.0.33
2026-02-08T02:21:44,916 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-tomcat-common/8.0.33
2026-02-08T02:21:44,917 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-runtime/8.0.33
2026-02-08T02:21:44,930 | INFO | features-3-thread-1 | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | EventAdmin support enabled, WAB events will be posted to EventAdmin topics.
2026-02-08T02:21:44,930 | INFO | features-3-thread-1 | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Pax Web Runtime started
2026-02-08T02:21:44,930 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.web]) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because configuration has changed
2026-02-08T02:21:44,936 | INFO | paxweb-config-1-thread-1 (change config) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because ServerControllerFactory has been registered
2026-02-08T02:21:44,948 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.http.core/4.4.8
2026-02-08T02:21:44,958 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.http.core/4.4.8. Missing service: [org.apache.karaf.http.core.ProxyService]
2026-02-08T02:21:44,958 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.continuation/9.4.57.v20241219
2026-02-08T02:21:44,959 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-extender-whiteboard/8.0.33
2026-02-08T02:21:44,960 | INFO | features-3-thread-1 | Activator | 394 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.33 | Starting Pax Web Whiteboard Extender
2026-02-08T02:21:44,977 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.bundle.blueprintstate/4.4.8
2026-02-08T02:21:44,994 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.config/4.4.8
2026-02-08T02:21:45,000 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.modules/4.4.8
2026-02-08T02:21:45,006 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Configuring server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController
2026-02-08T02:21:45,006 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Configuring JettyServerController{configuration=97f398a5-edea-4aa4-884a-a188f7542c8f,state=UNCONFIGURED}
2026-02-08T02:21:45,008 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating Jetty server instance using configuration properties.
2026-02-08T02:21:45,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.ssh/4.4.8
2026-02-08T02:21:45,041 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Processing Jetty configuration from files: [etc/jetty.xml]
2026-02-08T02:21:45,080 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.shell.ssh/4.4.8. Missing service: [org.apache.sshd.server.SshServer]
2026-02-08T02:21:45,081 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.diagnostic.core/4.4.8
2026-02-08T02:21:45,093 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.diagnostic.core/4.4.8
2026-02-08T02:21:45,093 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.url.war/2.6.17
2026-02-08T02:21:45,130 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.features.command/4.4.8
2026-02-08T02:21:45,143 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.features.command/4.4.8
2026-02-08T02:21:45,144 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.kar.core/4.4.8
2026-02-08T02:21:45,196 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.kar.core/4.4.8.
Missing service: [org.apache.karaf.kar.KarService]
2026-02-08T02:21:45,197 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.service.core/4.4.8
2026-02-08T02:21:45,297 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Found configured connector "jetty-default": 0.0.0.0:8181
2026-02-08T02:21:45,298 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Using configured jetty-default@2f82e13b{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} as non secure connector for address: 0.0.0.0:8181
2026-02-08T02:21:45,298 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Eagerly starting Jetty thread pool QueuedThreadPool[qtp1021661463]@3ce55117{STOPPED,0<=0<=200,i=0,r=-1,q=0}[NO_TRY]
2026-02-08T02:21:45,447 | INFO | activator-1-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.ssh/4.4.8
2026-02-08T02:21:45,449 | INFO | activator-1-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.kar.core/4.4.8
2026-02-08T02:21:45,449 | INFO | paxweb-config-1-thread-1 (change controller) | JettyFactory | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding JMX support to Jetty server
2026-02-08T02:21:45,460 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.service.core/4.4.8
2026-02-08T02:21:45,460 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.web.core/4.4.8
2026-02-08T02:21:45,461 | INFO | activator-1-thread-1 | DefaultIoServiceFactoryFactory | 126 - org.apache.sshd.osgi - 2.15.0 | No detected/configured IoServiceFactoryFactory; using Nio2ServiceFactoryFactory
2026-02-08T02:21:45,467 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.web.core/4.4.8. Missing service: [org.apache.karaf.web.WebContainerService]
2026-02-08T02:21:45,467 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.management.server/4.4.8
2026-02-08T02:21:45,471 | INFO | activator-1-thread-1 | Activator | 114 - org.apache.karaf.management.server - 4.4.8 | Setting java.rmi.server.hostname system property to 127.0.0.1
2026-02-08T02:21:45,473 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.scr.management/4.4.8
2026-02-08T02:21:45,476 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.table/4.4.8
2026-02-08T02:21:45,476 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-websocket/8.0.33
2026-02-08T02:21:45,476 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jaas.command/4.4.8
2026-02-08T02:21:45,489 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jaas.command/4.4.8
2026-02-08T02:21:45,491 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8
2026-02-08T02:21:45,492 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Starting server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController
2026-02-08T02:21:45,492 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8
2026-02-08T02:21:45,492 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting JettyServerController{configuration=97f398a5-edea-4aa4-884a-a188f7542c8f,state=STOPPED}
2026-02-08T02:21:45,492 | INFO | paxweb-config-1-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Server@5b85c649{STOPPED}[9.4.57.v20241219]
2026-02-08T02:21:45,493 | INFO | paxweb-config-1-thread-1 (change controller) | Server | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | jetty-9.4.57.v20241219; built: 2025-01-08T21:24:30.412Z; git: df524e6b29271c2e09ba9aea83c18dc9db464a31; jvm 21.0.9+10-Ubuntu-122.04
2026-02-08T02:21:45,493 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.blueprint.api/1.2.0
2026-02-08T02:21:45,496 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.blueprint.core/1.2.0
2026-02-08T02:21:45,498 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.eclipse.jetty.client/9.4.57.v20241219
2026-02-08T02:21:45,498 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.whiteboard/1.2.0
2026-02-08T02:21:45,509 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.log.core/4.4.8
2026-02-08T02:21:45,521 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.log.core/4.4.8. Missing service: [org.apache.karaf.log.core.LogService]
2026-02-08T02:21:45,524 | INFO | paxweb-config-1-thread-1 (change controller) | session | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | DefaultSessionIdManager workerName=node0
2026-02-08T02:21:45,524 | INFO | paxweb-config-1-thread-1 (change controller) | session | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | No SessionScavenger set, using defaults
2026-02-08T02:21:45,525 | INFO | paxweb-config-1-thread-1 (change controller) | session | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | node0 Scavenging every 660000ms
2026-02-08T02:21:45,526 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.shell.commands/4.4.8
2026-02-08T02:21:45,536 | INFO | activator-1-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.log.core/4.4.8
2026-02-08T02:21:45,547 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.commands/4.4.8
2026-02-08T02:21:45,548 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.shell.commands/4.4.8
2026-02-08T02:21:45,549 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.package.core/4.4.8
2026-02-08T02:21:45,556 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 116 - org.apache.karaf.scr.management - 4.4.8 | Activating the Apache Karaf ServiceComponentRuntime MBean
2026-02-08T02:21:45,567 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.package.core/4.4.8
2026-02-08T02:21:45,567 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.system.core/4.4.8
2026-02-08T02:21:45,579 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 116 - org.apache.karaf.scr.management - 4.4.8 | Deactivating the Apache Karaf ServiceComponentRuntime MBean
2026-02-08T02:21:45,584 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.system.core/4.4.8
2026-02-08T02:21:45,585 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.config.command/4.4.8
2026-02-08T02:21:45,585 | INFO | paxweb-config-1-thread-1 (change controller) | AbstractConnector | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started jetty-default@2f82e13b{HTTP/1.1, (http/1.1)}{0.0.0.0:8181}
2026-02-08T02:21:45,586 | INFO | paxweb-config-1-thread-1 (change controller) | Server | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started @8748ms
2026-02-08T02:21:45,587 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpService factory
2026-02-08T02:21:45,588 | INFO | paxweb-config-1-thread-1 (change controller) | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.http.core_4.4.8 [106]]
2026-02-08T02:21:45,597 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.config.command/4.4.8
2026-02-08T02:21:45,603 | INFO | paxweb-config-1-thread-1 (change controller) | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.web.core_4.4.8 [125]]
2026-02-08T02:21:45,603 | INFO | HttpService->Whiteboard (add HttpService) | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle:
[org.ops4j.pax.web.pax-web-extender-whiteboard_8.0.33 [394]] 2026-02-08T02:21:45,610 | INFO | activator-1-thread-2 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.http.core/4.4.8 2026-02-08T02:21:45,614 | INFO | paxweb-config-1-thread-1 (change controller) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpServiceRuntime 2026-02-08T02:21:45,619 | INFO | paxweb-config-1-thread-1 | ServerModel | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-3,contextPath='/'} 2026-02-08T02:21:45,619 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)}", size=2} 2026-02-08T02:21:45,620 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-3,contextPath='/'} 2026-02-08T02:21:45,630 | INFO | activator-1-thread-2 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.web.core/4.4.8 2026-02-08T02:21:45,654 | WARN | activator-1-thread-1 | Activator | 114 - org.apache.karaf.management.server - 4.4.8 | java.rmi.server.hostname system property is already set to 127.0.0.1. 
Apache Karaf doesn't override it 2026-02-08T02:21:45,657 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.api/1.1.5 2026-02-08T02:21:45,661 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 116 - org.apache.karaf.scr.management - 4.4.8 | Activating the Apache Karaf ServiceComponentRuntime MBean 2026-02-08T02:21:45,662 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.jmx.core/1.1.8 2026-02-08T02:21:45,665 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Starting JMX OSGi agent 2026-02-08T02:21:45,676 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34] for service with service.id [15] 2026-02-08T02:21:45,677 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34] for service with service.id [39] 2026-02-08T02:21:45,679 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@601ad09b with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34 2026-02-08T02:21:45,679 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@601ad09b with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34 2026-02-08T02:21:45,680 | INFO 
| features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@601ad09b with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34 2026-02-08T02:21:45,680 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@601ad09b with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34 2026-02-08T02:21:45,680 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@601ad09b with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34 2026-02-08T02:21:45,681 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@601ad09b with name osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34 2026-02-08T02:21:45,681 | INFO | features-3-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@601ad09b with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=91557cc4-1729-410d-a1d6-cd18b42b7e34 2026-02-08T02:21:45,681 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding 
OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)} to o.o.p.w.s.j.i.PaxWebServletContextHandler@14cebc80{/,null,STOPPED} 2026-02-08T02:21:45,683 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.ops4j.pax.web.pax-web-extender-war/8.0.33 2026-02-08T02:21:45,684 | INFO | features-3-thread-1 | Activator | 393 - org.ops4j.pax.web.pax-web-extender-war - 8.0.33 | Configuring WAR extender thread pool. Pool size = 3 2026-02-08T02:21:45,689 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@14cebc80{/,null,STOPPED} 2026-02-08T02:21:45,755 | INFO | HttpService->WarExtender (add HttpService) | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-war_8.0.33 [393]] 2026-02-08T02:21:45,761 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.instance.core/4.4.8 2026-02-08T02:21:45,780 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.instance.core/4.4.8 2026-02-08T02:21:45,782 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.blueprint/12.0.3 2026-02-08T02:21:45,784 | INFO | features-3-thread-1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Starting BlueprintBundleTracker 2026-02-08T02:21:45,793 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.apache.karaf.shell.core_4.4.8 [121] was successfully created 2026-02-08T02:21:45,793 | INFO | Blueprint Event 
Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.apache.aries.blueprint.cm_1.3.2 [79] was successfully created 2026-02-08T02:21:45,793 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.apache.aries.blueprint.core_1.10.3 [80] was successfully created 2026-02-08T02:21:45,987 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.guava.failureaccess/1.0.3 2026-02-08T02:21:45,989 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jspecify.jspecify/1.0.0 2026-02-08T02:21:45,989 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.guava/33.5.0.jre 2026-02-08T02:21:45,991 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.concepts/14.0.20 2026-02-08T02:21:45,992 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | triemap/1.3.2 2026-02-08T02:21:45,995 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.util/14.0.20 2026-02-08T02:21:45,997 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-common/14.0.20 2026-02-08T02:21:45,998 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-common-api/15.0.2 2026-02-08T02:21:45,999 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-xpath-api/14.0.20 2026-02-08T02:21:46,000 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 
4.4.8 | org.opendaylight.yangtools.yang-model-api/14.0.20 2026-02-08T02:21:46,001 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-api/14.0.20 2026-02-08T02:21:46,002 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-api/14.0.20 2026-02-08T02:21:46,003 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-api/15.0.2 2026-02-08T02:21:46,004 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.dom-api/10.0.2 2026-02-08T02:21:46,005 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.annotation-api/1.3.5 2026-02-08T02:21:46,009 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.geronimo.specs.geronimo-atinject_1.0_spec/1.2.0 2026-02-08T02:21:46,010 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | checker-qual/3.51.1 2026-02-08T02:21:46,011 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.util/7.1.9 2026-02-08T02:21:46,012 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-spec/14.0.20 2026-02-08T02:21:46,013 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-binding-api/15.0.2 2026-02-08T02:21:46,014 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-singleton-api/15.0.2 2026-02-08T02:21:46,015 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8342-ietf-datastores/1.0.2 2026-02-08T02:21:46,016 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.ietf-type-util/1.0.2 2026-02-08T02:21:46,017 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-reflect/14.0.20 2026-02-08T02:21:46,017 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc6991-ietf-inet-types/1.0.2 2026-02-08T02:21:46,018 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc6991-ietf-yang-types/1.0.2 2026-02-08T02:21:46,018 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.yang-ext/2013.9.7.27_2 2026-02-08T02:21:46,019 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.inventory/0.21.2 2026-02-08T02:21:46,020 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.opendaylight-l2-types/2013.8.27.27_2 2026-02-08T02:21:46,020 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.codegen-extensions/14.0.20 2026-02-08T02:21:46,020 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-api/0.21.2 2026-02-08T02:21:46,022 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-base/0.21.2 2026-02-08T02:21:46,023 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-service/0.21.2 2026-02-08T02:21:46,024 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javax.servlet-api/3.1.0 2026-02-08T02:21:46,024 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.api/0.22.3 2026-02-08T02:21:46,025 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.typesafe.config/1.4.5 2026-02-08T02:21:46,025 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.scala3-library/12.0.3 2026-02-08T02:21:46,026 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.typesafe.sslconfig/0.6.1 2026-02-08T02:21:46,027 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.aeron.annotations/1.45.1 2026-02-08T02:21:46,033 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.agrona.core/1.22.0 2026-02-08T02:21:46,034 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.aeron.client/1.45.1 2026-02-08T02:21:46,035 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.aeron.driver/1.45.1 2026-02-08T02:21:46,036 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.resolver/4.2.7.Final 2026-02-08T02:21:46,037 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport/4.2.7.Final 2026-02-08T02:21:46,037 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-base/4.2.7.Final 2026-02-08T02:21:46,038 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport-native-unix-common/4.2.7.Final 2026-02-08T02:21:46,038 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.handler/4.2.7.Final 2026-02-08T02:21:46,039 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | wrap_file__tmp_karaf-0.23.1-SNAPSHOT_system_org_lmdbjava_lmdbjava_0.9.1_lmdbjava-0.9.1.jar/0.0.0 2026-02-08T02:21:46,039 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | reactive-streams/1.0.4 2026-02-08T02:21:46,040 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.repackaged-pekko/12.0.3 2026-02-08T02:21:46,044 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-spi/14.0.20 2026-02-08T02:21:46,045 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8528-model-api/14.0.20 2026-02-08T02:21:46,045 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8040-model-api/14.0.20 2026-02-08T02:21:46,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc7952-model-api/14.0.20 2026-02-08T02:21:46,046 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-ir/14.0.20 2026-02-08T02:21:46,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-spi/14.0.20 2026-02-08T02:21:46,047 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.yangtools.yang-model-util/14.0.20 2026-02-08T02:21:46,048 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-util/14.0.20 2026-02-08T02:21:46,049 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-impl/14.0.20 2026-02-08T02:21:46,049 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-spi/14.0.20 2026-02-08T02:21:46,049 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-codec-binfmt/14.0.20 2026-02-08T02:21:46,050 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-access-api/12.0.3 2026-02-08T02:21:46,050 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.collections/3.2.2 2026-02-08T02:21:46,051 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.commons-beanutils/1.11.0 2026-02-08T02:21:46,052 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.owasp.encoder/1.3.1 2026-02-08T02:21:46,052 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.repackaged-shiro/0.22.3 2026-02-08T02:21:46,053 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-model/14.0.20 2026-02-08T02:21:46,053 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-api/14.0.20 2026-02-08T02:21:46,054 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-api/14.0.20 2026-02-08T02:21:46,054 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-api/14.0.20 2026-02-08T02:21:46,055 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.odlext-model-api/14.0.20 2026-02-08T02:21:46,055 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-spi/14.0.20 2026-02-08T02:21:46,056 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.odlext-parser-support/14.0.20 2026-02-08T02:21:46,061 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.openconfig-model-api/14.0.20 2026-02-08T02:21:46,061 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.openconfig-parser-support/14.0.20 2026-02-08T02:21:46,063 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6241-model-api/14.0.20 2026-02-08T02:21:46,063 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6241-parser-support/14.0.20 2026-02-08T02:21:46,065 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6536-model-api/14.0.20 2026-02-08T02:21:46,066 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6536-parser-support/14.0.20 2026-02-08T02:21:46,067 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.yangtools.rfc6643-model-api/14.0.20 2026-02-08T02:21:46,068 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc6643-parser-support/14.0.20 2026-02-08T02:21:46,069 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-ri/14.0.20 2026-02-08T02:21:46,071 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc7952-parser-support/14.0.20 2026-02-08T02:21:46,073 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8040-parser-support/14.0.20 2026-02-08T02:21:46,078 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8528-parser-support/14.0.20 2026-02-08T02:21:46,081 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8639-model-api/14.0.20 2026-02-08T02:21:46,082 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8639-parser-support/14.0.20 2026-02-08T02:21:46,084 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8819-model-api/14.0.20 2026-02-08T02:21:46,091 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.rfc8819-parser-support/14.0.20 2026-02-08T02:21:46,094 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-spi/14.0.20 2026-02-08T02:21:46,095 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.antlr.antlr4-runtime/4.13.2 
2026-02-08T02:21:46,095 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-reactor/14.0.20
2026-02-08T02:21:46,096 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-rfc7950/14.0.20
2026-02-08T02:21:46,097 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-xpath-impl/14.0.20
2026-02-08T02:21:46,100 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-parser-impl/14.0.20
2026-02-08T02:21:46,103 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-spi/14.0.20
2026-02-08T02:21:46,104 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-generator/14.0.20
2026-02-08T02:21:46,109 | INFO | features-3-thread-1 | DefaultBindingRuntimeGenerator | 329 - org.opendaylight.yangtools.binding-generator - 14.0.20 | Binding/YANG type support activated
2026-02-08T02:21:46,109 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-runtime-osgi/14.0.20
2026-02-08T02:21:46,116 | INFO | features-3-thread-1 | OSGiBindingRuntime | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Binding Runtime activated
2026-02-08T02:21:46,121 | INFO | features-3-thread-1 | OSGiModelRuntime | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Model Runtime starting
2026-02-08T02:21:46,155 | INFO | features-3-thread-1 | KarafFeaturesSupport | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Will attempt to integrate with Karaf FeaturesService
2026-02-08T02:21:46,728 | INFO | features-3-thread-1 | NettyTransportSupport | 284 - org.opendaylight.netconf.transport-api - 10.0.2 | Netty transport backed by epoll(2)
2026-02-08T02:21:47,025 | INFO | features-3-thread-1 | SharedEffectiveModelContextFactory | 380 - org.opendaylight.yangtools.yang-parser-impl - 14.0.20 | Using weak references
2026-02-08T02:21:48,824 | INFO | features-3-thread-1 | OSGiModuleInfoSnapshotImpl | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | EffectiveModelContext generation 1 activated
2026-02-08T02:21:49,564 | INFO | features-3-thread-1 | OSGiBindingRuntimeContextImpl | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | BindingRuntimeContext generation 1 activated
2026-02-08T02:21:49,565 | INFO | features-3-thread-1 | GlobalBindingRuntimeContext | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Global BindingRuntimeContext generation 1 activated
2026-02-08T02:21:49,565 | INFO | features-3-thread-1 | OSGiModelRuntime | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Model Runtime started
2026-02-08T02:21:49,566 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.h2database/2.3.232
2026-02-08T02:21:49,576 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8341/1.0.2
2026-02-08T02:21:49,577 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9640/1.0.2
2026-02-08T02:21:49,578 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9642/1.0.2
2026-02-08T02:21:49,578 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.iana-tls-cipher-suite-algs/1.0.2
2026-02-08T02:21:49,579 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9645-ietf-tls-common/1.0.2
2026-02-08T02:21:49,580 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9641/1.0.2
2026-02-08T02:21:49,580 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9645-ietf-tls-client/1.0.2
2026-02-08T02:21:49,581 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-cluster-admin-api/12.0.3
2026-02-08T02:21:49,581 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.lang3/3.19.0
2026-02-08T02:21:49,582 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.core/4.2.37
2026-02-08T02:21:49,583 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.jmx/4.2.37
2026-02-08T02:21:49,583 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | lz4-java/1.8.0
2026-02-08T02:21:49,584 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-api/12.0.3
2026-02-08T02:21:49,584 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-spi/12.0.3
2026-02-08T02:21:49,588 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-clustering-commons/12.0.3
2026-02-08T02:21:49,593 | INFO | features-3-thread-1 | FileAkkaConfigurationReader | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | File-based Pekko configuration reader enabled
2026-02-08T02:21:49,593 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-mgmt-api/12.0.3
2026-02-08T02:21:49,596 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.raft-journal/12.0.3
2026-02-08T02:21:49,597 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-akka-raft/12.0.3
2026-02-08T02:21:49,598 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.shiro-api/0.22.3
2026-02-08T02:21:49,599 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc6241/1.0.2
2026-02-08T02:21:49,599 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.transport-classes-epoll/4.2.7.Final
2026-02-08T02:21:49,600 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.google.gson/2.13.2
2026-02-08T02:21:49,601 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.ready-api/7.1.9
2026-02-08T02:21:49,601 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-api/7.1.9
2026-02-08T02:21:49,602 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-spi/0.21.2
2026-02-08T02:21:49,603 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.util/0.21.2
2026-02-08T02:21:49,604 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-common-netty/14.0.20
2026-02-08T02:21:49,606 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.odlparent.bundles-diag/14.1.6 2026-02-08T02:21:49,609 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.ready-impl/7.1.9 2026-02-08T02:21:49,621 | INFO | features-3-thread-1 | KarafSystemReady | 242 - org.opendaylight.infrautils.ready-impl - 7.1.9 | ThreadFactory for SystemReadyService created 2026-02-08T02:21:49,623 | INFO | features-3-thread-1 | KarafSystemReady | 242 - org.opendaylight.infrautils.ready-impl - 7.1.9 | Now starting to provide full system readiness status updates (see TestBundleDiag's logs)... 2026-02-08T02:21:49,625 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-impl/7.1.9 2026-02-08T02:21:49,625 | INFO | SystemReadyService-0 | KarafSystemReady | 242 - org.opendaylight.infrautils.ready-impl - 7.1.9 | checkBundleDiagInfos() started... 2026-02-08T02:21:49,643 | INFO | features-3-thread-1 | DiagStatusServiceImpl | 239 - org.opendaylight.infrautils.diagstatus-impl - 7.1.9 | Diagnostic Status Service started 2026-02-08T02:21:49,647 | INFO | features-3-thread-1 | MBeanUtils | 238 - org.opendaylight.infrautils.diagstatus-api - 7.1.9 | MBean registration for org.opendaylight.infrautils.diagstatus:type=SvcStatus SUCCESSFUL. 
2026-02-08T02:21:49,647 | INFO | features-3-thread-1 | DiagStatusServiceMBeanImpl | 239 - org.opendaylight.infrautils.diagstatus-impl - 7.1.9 | Diagnostic Status Service management started 2026-02-08T02:21:49,648 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl/0.21.2 2026-02-08T02:21:49,652 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.flow-statistics/0.21.2 2026-02-08T02:21:49,652 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.extension-api/0.21.2 2026-02-08T02:21:49,653 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin/0.21.2 2026-02-08T02:21:49,654 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9643-ietf-tcp-common/1.0.2 2026-02-08T02:21:49,654 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9643-ietf-tcp-server/1.0.2 2026-02-08T02:21:49,654 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.shaded-sshd/10.0.2 2026-02-08T02:21:49,655 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.arbitratorreconciliation-api/0.21.2 2026-02-08T02:21:49,655 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc6243/1.0.2 2026-02-08T02:21:49,656 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-api/10.0.2 2026-02-08T02:21:49,656 
| INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8343/1.0.2 2026-02-08T02:21:49,657 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8344/1.0.2 2026-02-08T02:21:49,657 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8528/1.0.2 2026-02-08T02:21:49,657 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8529/1.0.2 2026-02-08T02:21:49,658 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8040-ietf-restconf/1.0.2 2026-02-08T02:21:49,658 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8639/1.0.2 2026-02-08T02:21:49,659 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9645-ietf-tls-server/1.0.2 2026-02-08T02:21:49,659 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.validation.jakarta.validation-api/2.0.2 2026-02-08T02:21:49,659 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.password-service-api/0.22.3 2026-02-08T02:21:49,659 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.password-service-impl/0.22.3 2026-02-08T02:21:49,662 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | net.bytebuddy.byte-buddy/1.17.8 2026-02-08T02:21:49,664 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.aries.util/1.1.3 
2026-02-08T02:21:49,665 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-binding-spi/15.0.2 2026-02-08T02:21:49,665 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.core.jersey-client/2.47.0 2026-02-08T02:21:49,665 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.media.jersey-media-sse/2.47.0 2026-02-08T02:21:49,666 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.external.aopalliance-repackaged/2.6.1 2026-02-08T02:21:49,666 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.osgi-resource-locator/1.0.3 2026-02-08T02:21:49,713 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.ietf-topology/2013.10.21.27_2 2026-02-08T02:21:49,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9643-ietf-tcp-client/1.0.2 2026-02-08T02:21:49,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.netconf-common-mdsal/10.0.2 2026-02-08T02:21:49,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-codec-gson/14.0.20 2026-02-08T02:21:49,715 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | stax2-api/4.2.2 2026-02-08T02:21:49,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-codec-xml/14.0.20 2026-02-08T02:21:49,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - 
org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.databind/10.0.2 2026-02-08T02:21:49,716 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.netconf-api/10.0.2 2026-02-08T02:21:49,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8072/1.0.2 2026-02-08T02:21:49,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-api/10.0.2 2026-02-08T02:21:49,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-model-export/14.0.20 2026-02-08T02:21:49,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-spi/10.0.2 2026-02-08T02:21:49,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-mdsal-spi/10.0.2 2026-02-08T02:21:49,718 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8040-ietf-restconf-monitoring/1.0.2 2026-02-08T02:21:49,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8650/1.0.2 2026-02-08T02:21:49,719 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-spi/15.0.2 2026-02-08T02:21:49,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-schema-osgi/15.0.2 2026-02-08T02:21:49,726 | INFO | features-3-thread-1 | OSGiDOMSchemaService | 252 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 15.0.2 | DOM Schema services activated 
2026-02-08T02:21:49,727 | INFO | features-3-thread-1 | OSGiDOMSchemaService | 252 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 15.0.2 | Updating context to generation 1 2026-02-08T02:21:49,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-dom-broker/15.0.2 2026-02-08T02:21:49,735 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.commons.text/1.14.0 2026-02-08T02:21:49,736 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-access-client/12.0.3 2026-02-08T02:21:49,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-common-util/12.0.3 2026-02-08T02:21:49,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-api/14.0.20 2026-02-08T02:21:49,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-spi/14.0.20 2026-02-08T02:21:49,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-loader/14.0.20 2026-02-08T02:21:49,738 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-dynamic/14.0.20 2026-02-08T02:21:49,741 | INFO | features-3-thread-1 | SimpleBindingDOMCodecFactory | 326 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.20 | Binding/DOM Codec enabled 2026-02-08T02:21:49,741 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.binding-data-codec-osgi/14.0.20 2026-02-08T02:21:49,747 | INFO | features-3-thread-1 | 
OSGiBindingDOMCodec | 327 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.20 | Binding/DOM Codec activated 2026-02-08T02:21:49,766 | INFO | features-3-thread-1 | OSGiBindingDOMCodecServicesImpl | 327 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.20 | Binding/DOM Codec generation 1 activated 2026-02-08T02:21:49,766 | INFO | features-3-thread-1 | GlobalBindingDOMCodecServices | 327 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.20 | Global Binding/DOM Codec activated with generation 1 2026-02-08T02:21:49,769 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-tree-ri/14.0.20 2026-02-08T02:21:49,850 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-distributed-datastore/12.0.3 2026-02-08T02:21:49,863 | INFO | features-3-thread-1 | OSGiActorSystemProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Actor System provider starting 2026-02-08T02:21:50,038 | INFO | features-3-thread-1 | ActorSystemProviderImpl | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Creating new ActorSystem 2026-02-08T02:21:50,372 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Slf4jLogger | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Slf4jLogger started 2026-02-08T02:21:50,605 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ArteryTransport | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Remoting started with transport [Artery tcp]; listening on address [pekko://opendaylight-cluster-data@10.30.170.226:2550] with UID [2601843136995926979] 2026-02-08T02:21:50,618 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] 
- Starting up, Pekko version [1.2.1] ... 2026-02-08T02:21:50,667 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Registered cluster JMX MBean [pekko:type=Cluster] 2026-02-08T02:21:50,667 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Started up successfully 2026-02-08T02:21:50,717 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR started. Config: strategy [KeepMajority], stable-after [7 seconds], down-all-when-unstable [5250 milliseconds], selfUniqueAddress [pekko://opendaylight-cluster-data@10.30.170.226:2550#2601843136995926979], selfDc [default]. 2026-02-08T02:21:50,889 | INFO | features-3-thread-1 | OSGiActorSystemProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Actor System provider started 2026-02-08T02:21:50,906 | INFO | features-3-thread-1 | OSGiDatastoreContextIntrospectorFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Datastore Context Introspector activated 2026-02-08T02:21:50,909 | INFO | features-3-thread-1 | FileModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Shard configuration provider started 2026-02-08T02:21:50,914 | INFO | features-3-thread-1 | OSGiDistributedDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Distributed Datastore type CONFIGURATION starting 2026-02-08T02:21:50,944 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to 
[pekko://opendaylight-cluster-data@10.30.171.87:2550], control stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.87/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:21:50,946 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.87:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.87/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:21:50,988 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-1223697346]], but this node is not initialized yet 2026-02-08T02:21:50,991 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon#417066539]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550] 2026-02-08T02:21:51,155 | WARN | features-3-thread-1 | DatastoreContext | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Cannot find policy , will stick with normal 2026-02-08T02:21:51,159 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Create data store instance of type : config 2026-02-08T02:21:51,160 | INFO | 
features-3-thread-1 | AbstractModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Config file exists - reading config from it 2026-02-08T02:21:51,160 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Config file exists - reading config from it 2026-02-08T02:21:51,165 | INFO | features-3-thread-1 | AbstractDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Creating ShardManager : shardmanager-config 2026-02-08T02:21:51,207 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Starting ShardManager shard-manager-config 2026-02-08T02:21:51,207 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Recovery complete 2026-02-08T02:21:51,216 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Data store config is using tell-based protocol 2026-02-08T02:21:51,220 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Config file exists - reading config from it 2026-02-08T02:21:51,221 | INFO | features-3-thread-1 | AbstractModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Config file exists - reading config from it 2026-02-08T02:21:51,221 | INFO | features-3-thread-1 | OSGiDistributedDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Distributed Datastore type OPERATIONAL starting 2026-02-08T02:21:51,222 | WARN | features-3-thread-1 | DatastoreContext | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Cannot find policy , will stick with normal 
2026-02-08T02:21:51,222 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Create data store instance of type : operational 2026-02-08T02:21:51,223 | INFO | features-3-thread-1 | AbstractDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Creating ShardManager : shardmanager-operational 2026-02-08T02:21:51,226 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Starting ShardManager shard-manager-operational 2026-02-08T02:21:51,228 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Recovery complete 2026-02-08T02:21:51,230 | INFO | features-3-thread-1 | DistributedDataStoreFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Data store operational is using tell-based protocol 2026-02-08T02:21:51,231 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-common-api/15.0.2 2026-02-08T02:21:51,235 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-dom-api/15.0.2 2026-02-08T02:21:51,236 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.model.general-entity/15.0.2 2026-02-08T02:21:51,238 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding-dom-adapter/15.0.2 2026-02-08T02:21:51,257 | INFO | features-3-thread-1 | OSGiBlockingBindingNormalizer | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter activated 2026-02-08T02:21:51,260 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | Shard | 195 - 
org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-topology-operational: Shard created, persistent : false 2026-02-08T02:21:51,267 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-default-operational: Shard created, persistent : false 2026-02-08T02:21:51,268 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-toaster-operational: Shard created, persistent : false 2026-02-08T02:21:51,280 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-inventory-operational: Shard created, persistent : false 2026-02-08T02:21:51,282 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-default-config: Shard created, persistent : true 2026-02-08T02:21:51,285 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-topology-config: Shard created, persistent : true 2026-02-08T02:21:51,286 | INFO | features-3-thread-1 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for MountPointService activated 2026-02-08T02:21:51,289 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-inventory-config: Shard created, persistent : true 2026-02-08T02:21:51,291 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | 
RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-inventory-operational/member-1-shard-inventory-operational-notifier#625764338 created and ready for shard:member-1-shard-inventory-operational 2026-02-08T02:21:51,292 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-topology-operational/member-1-shard-topology-operational-notifier#847805324 created and ready for shard:member-1-shard-topology-operational 2026-02-08T02:21:51,292 | INFO | features-3-thread-1 | DOMNotificationRouter | 251 - org.opendaylight.mdsal.mdsal-dom-broker - 15.0.2 | DOM Notification Router started 2026-02-08T02:21:51,292 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-inventory-config/member-1-shard-inventory-config-notifier#-1675931116 created and ready for shard:member-1-shard-inventory-config 2026-02-08T02:21:51,293 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-default-operational/member-1-shard-default-operational-notifier#631318258 created and ready for shard:member-1-shard-default-operational 2026-02-08T02:21:51,293 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Starting recovery with journal batch size 1 2026-02-08T02:21:51,294 | INFO | 
opendaylight-cluster-data-shard-dispatcher-31 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Starting recovery with journal batch size 1 2026-02-08T02:21:51,294 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Starting recovery with journal batch size 1 2026-02-08T02:21:51,294 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Starting recovery with journal batch size 1 2026-02-08T02:21:51,296 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Starting recovery with journal batch size 1 2026-02-08T02:21:51,297 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-default-config/member-1-shard-default-config-notifier#-85513600 created and ready for shard:member-1-shard-default-config 2026-02-08T02:21:51,297 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-topology-config/member-1-shard-topology-config-notifier#-473242771 created and ready for shard:member-1-shard-topology-config 2026-02-08T02:21:51,297 | INFO | features-3-thread-1 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for NotificationService activated 2026-02-08T02:21:51,297 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-toaster-operational/member-1-shard-toaster-operational-notifier#-1095228782 created and ready for shard:member-1-shard-toaster-operational 2026-02-08T02:21:51,299 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Starting recovery with journal batch size 1 2026-02-08T02:21:51,297 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Starting recovery with journal batch size 1 2026-02-08T02:21:51,300 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-toaster-config: Shard created, persistent : true 2026-02-08T02:21:51,301 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-toaster-config/member-1-shard-toaster-config-notifier#-106231952 created and ready for shard:member-1-shard-toaster-config 2026-02-08T02:21:51,301 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Starting recovery with journal batch size 1 2026-02-08T02:21:51,310 | INFO | features-3-thread-1 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for NotificationPublishService activated 2026-02-08T02:21:51,315 | INFO | features-3-thread-1 | 
DOMRpcRouter | 251 - org.opendaylight.mdsal.mdsal-dom-broker - 15.0.2 | DOM RPC/Action router started 2026-02-08T02:21:51,320 | INFO | features-3-thread-1 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for RpcService activated 2026-02-08T02:21:51,321 | INFO | features-3-thread-1 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for RpcProviderService activated 2026-02-08T02:21:51,323 | INFO | features-3-thread-1 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for ActionService activated 2026-02-08T02:21:51,325 | INFO | features-3-thread-1 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for ActionProviderService activated 2026-02-08T02:21:51,325 | INFO | features-3-thread-1 | DynamicBindingAdapter | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | 8 DOMService trackers started 2026-02-08T02:21:51,329 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.eos-dom-akka/12.0.3 2026-02-08T02:21:51,381 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-singleton-impl/15.0.2 2026-02-08T02:21:51,401 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-compression/4.2.7.Final 2026-02-08T02:21:51,402 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-http/4.2.7.Final 2026-02-08T02:21:51,402 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.codec-http2/4.2.7.Final 2026-02-08T02:21:51,403 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.apache.commons.commons-codec/1.19.0 2026-02-08T02:21:51,404 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-api/10.0.2 2026-02-08T02:21:51,405 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-tcp/10.0.2 2026-02-08T02:21:51,405 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-crypto/10.0.2 2026-02-08T02:21:51,406 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-tls/10.0.2 2026-02-08T02:21:51,407 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.iana-crypt-hash/1.0.2 2026-02-08T02:21:51,407 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-http/10.0.2 2026-02-08T02:21:51,408 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc7952/1.0.2 2026-02-08T02:21:51,409 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8342-ietf-origin/1.0.2 2026-02-08T02:21:51,409 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8526/1.0.2 2026-02-08T02:21:51,410 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.iana-ssh-public-key-algs/1.0.2 2026-02-08T02:21:51,410 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.iana-ssh-encryption-algs/1.0.2 2026-02-08T02:21:51,410 | INFO | features-3-thread-1 | 
FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.iana-ssh-key-exchange-algs/1.0.2 2026-02-08T02:21:51,411 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.iana-ssh-mac-algs/1.0.2 2026-02-08T02:21:51,411 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9644-ietf-ssh-common/1.0.2 2026-02-08T02:21:51,411 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.mdsal-eos-binding-api/15.0.2 2026-02-08T02:21:51,411 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.common/0.21.2 2026-02-08T02:21:51,412 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.eos-binding-adapter/15.0.2 2026-02-08T02:21:51,413 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.impl/0.21.2 2026-02-08T02:21:51,445 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationServiceFactory)] 2026-02-08T02:21:51,460 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.impl/0.21.2. 
Missing service: [org.opendaylight.openflowplugin.api.openflow.statistics.ofpspecific.MessageIntelligenceAgency] 2026-02-08T02:21:51,465 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: journal open: applyTo=0 2026-02-08T02:21:51,468 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: journal open: applyTo=0 2026-02-08T02:21:51,468 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: journal open: applyTo=0 2026-02-08T02:21:51,468 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: journal open: applyTo=0 2026-02-08T02:21:51,468 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: journal open: applyTo=0 2026-02-08T02:21:51,468 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: journal open: applyTo=0 2026-02-08T02:21:51,469 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: journal open: applyTo=0 2026-02-08T02:21:51,469 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: journal open: applyTo=0 2026-02-08T02:21:51,480 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | 
Blueprint bundle org.opendaylight.openflowplugin.impl/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))] 2026-02-08T02:21:51,482 | INFO | features-3-thread-1 | MessageIntelligenceAgencyImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Registered MBean org.opendaylight.openflowplugin.impl.statistics.ofpspecific:type=MessageIntelligenceAgencyMXBean 2026-02-08T02:21:51,483 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.impl/0.21.2 2026-02-08T02:21:51,484 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.extension-onf/0.21.2 2026-02-08T02:21:51,489 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Local TermInfo store seeded with TermInfo{term=0} 2026-02-08T02:21:51,490 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.odl-device-notification/10.0.2 2026-02-08T02:21:51,490 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Local TermInfo store seeded with TermInfo{term=0} 2026-02-08T02:21:51,491 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.model.topology/0.21.2 2026-02-08T02:21:51,491 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9644-ietf-ssh-client/1.0.2 2026-02-08T02:21:51,492 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc9644-ietf-ssh-server/1.0.2 
2026-02-08T02:21:51,492 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.transport-ssh/10.0.2 2026-02-08T02:21:51,493 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Local TermInfo store seeded with TermInfo{term=0} 2026-02-08T02:21:51,493 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Local TermInfo store seeded with TermInfo{term=0} 2026-02-08T02:21:51,493 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Local TermInfo store seeded with TermInfo{term=0} 2026-02-08T02:21:51,493 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Local TermInfo store seeded with TermInfo{term=0} 2026-02-08T02:21:51,493 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc6470/1.0.2 2026-02-08T02:21:51,494 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Local TermInfo store seeded with TermInfo{term=0} 2026-02-08T02:21:51,494 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.jolokia.osgi/1.7.2 2026-02-08T02:21:51,494 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Local TermInfo store seeded with TermInfo{term=0} 2026-02-08T02:21:51,494 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | Recovery | 190 - 
org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Recovery completed in 10.54 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:21:51,495 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Recovery completed in 13.54 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:21:51,495 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Recovery completed in 8.545 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:21:51,495 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Recovery completed in 12.39 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:21:51,498 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Recovery completed in 13.06 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:21:51,498 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Recovery completed in 14.88 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:21:51,498 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: 
Recovery completed in 18.72 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:21:51,501 | INFO | features-3-thread-1 | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.jolokia.osgi_1.7.2 [156]] 2026-02-08T02:21:51,501 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , received role change from null to Follower 2026-02-08T02:21:51,502 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from null to Follower 2026-02-08T02:21:51,502 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from null to Follower 2026-02-08T02:21:51,502 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from null to Follower 2026-02-08T02:21:51,503 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from null to Follower 2026-02-08T02:21:51,504 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Recovery completed in 23.47 ms: last log index 
= -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:21:51,505 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , received role change from null to Follower 2026-02-08T02:21:51,505 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from null to Follower 2026-02-08T02:21:51,506 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from null to Follower 2026-02-08T02:21:51,509 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2026-02-08T02:21:51,509 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2026-02-08T02:21:51,510 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from null to Follower 2026-02-08T02:21:51,510 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - 
org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-inventory-config from null to Follower 2026-02-08T02:21:51,510 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2026-02-08T02:21:51,510 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2026-02-08T02:21:51,511 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from null to Follower 2026-02-08T02:21:51,511 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-default-config from null to Follower 2026-02-08T02:21:51,511 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2026-02-08T02:21:51,511 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 
2026-02-08T02:21:51,511 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-default-operational from null to Follower 2026-02-08T02:21:51,511 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from null to Follower 2026-02-08T02:21:51,512 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2026-02-08T02:21:51,512 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from null to Follower 2026-02-08T02:21:51,512 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2026-02-08T02:21:51,517 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from null to Follower 2026-02-08T02:21:51,522 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering 
ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@6c423035,contexts=[{HS,OCM-5,context:54833957,/}]} 2026-02-08T02:21:51,523 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@6c423035,contexts=null}", size=3} 2026-02-08T02:21:51,523 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{HS,id=OCM-5,name='context:54833957',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [156],contextId='context:54833957',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@344b325}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@14cebc80{/,null,STOPPED} 2026-02-08T02:21:51,524 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@14cebc80{/,null,STOPPED} 2026-02-08T02:21:51,524 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@6c423035,contexts=[{HS,OCM-5,context:54833957,/}]} 2026-02-08T02:21:51,528 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/" with default Osgi Context OsgiContextModel{HS,id=OCM-5,name='context:54833957',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 
[156],contextId='context:54833957',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@344b325}} 2026-02-08T02:21:51,544 | INFO | paxweb-config-1-thread-1 | osgi | 156 - org.jolokia.osgi - 1.7.2 | No access restrictor found, access to any MBean is allowed 2026-02-08T02:21:51,569 | INFO | paxweb-config-1-thread-1 | ContextHandler | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@14cebc80{/,null,AVAILABLE} 2026-02-08T02:21:51,569 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{HS,id=OCM-5,name='context:54833957',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [156],contextId='context:54833957',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@344b325}}} as OSGi service for "/" context path 2026-02-08T02:21:51,573 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-api/0.21.2 2026-02-08T02:21:51,574 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-shell/0.21.2 2026-02-08T02:21:51,579 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.srm-shell/0.21.2. 
Missing service: [org.opendaylight.mdsal.binding.api.DataBroker, org.opendaylight.serviceutils.srm.spi.RegistryControl] 2026-02-08T02:21:51,579 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc7407-ietf-x509-cert-to-name/1.0.2 2026-02-08T02:21:51,580 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.model.draft-ietf-restconf-server/10.0.2 2026-02-08T02:21:51,580 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | javassist/3.30.2.GA 2026-02-08T02:21:51,581 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.bulk-o-matic/0.21.2 2026-02-08T02:21:51,582 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.rabbitmq.client/5.26.0 2026-02-08T02:21:51,583 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.graphite/4.2.37 2026-02-08T02:21:51,583 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.mdsal.binding-util/15.0.2 2026-02-08T02:21:51,584 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.apache.karaf.jdbc.core/4.4.8 2026-02-08T02:21:51,594 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jdbc.core/4.4.8 2026-02-08T02:21:51,595 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.authn-api/0.22.3 2026-02-08T02:21:51,596 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.encrypt-service/0.22.3 2026-02-08T02:21:51,596 | 
INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.cert/0.22.3 2026-02-08T02:21:51,600 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.22.3 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)] 2026-02-08T02:21:51,600 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.tokenauthrealm/0.22.3 2026-02-08T02:21:51,602 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.servlet-api/0.22.3 2026-02-08T02:21:51,602 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.idm-store-h2/0.22.3 2026-02-08T02:21:51,603 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.osgi-impl/0.22.3 2026-02-08T02:21:51,605 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.containers.jersey-container-servlet-core/2.47.0 2026-02-08T02:21:51,605 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.web.servlet-jersey2/0.22.3 2026-02-08T02:21:51,608 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.shiro/0.22.3 2026-02-08T02:21:51,612 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies 
[(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.aaa.api.IIDMStore), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService)] 2026-02-08T02:21:51,622 | INFO | features-3-thread-1 | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.aaa.shiro_0.22.3 [173]] 2026-02-08T02:21:51,624 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2026-02-08T02:21:51,624 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}", size=1} 2026-02-08T02:21:51,624 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2026-02-08T02:21:51,625 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.encrypt-service-impl/0.22.3 2026-02-08T02:21:51,627 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.jetty-auth-log-filter/0.22.3 2026-02-08T02:21:51,628 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | 
org.opendaylight.openflowplugin.applications.reconciliation-framework/0.21.2
2026-02-08T02:21:51,629 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.21.2. Missing service: [org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager]
2026-02-08T02:21:51,631 | INFO | features-3-thread-1 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | ReconciliationManager started
2026-02-08T02:21:51,632 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.21.2
2026-02-08T02:21:51,632 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | karaf.branding/14.1.6
2026-02-08T02:21:51,632 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.ietf.model.rfc8525/1.0.2
2026-02-08T02:21:51,633 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.truststore-none/10.0.2
2026-02-08T02:21:51,633 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.libraries.liblldp/0.21.2
2026-02-08T02:21:51,635 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2
2026-02-08T02:21:51,639 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2026-02-08T02:21:51,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.model.rfc5277/10.0.2
2026-02-08T02:21:51,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-repo-fs/14.0.20
2026-02-08T02:21:51,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.jvm/4.2.37
2026-02-08T02:21:51,640 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.dropwizard.metrics.healthchecks/4.2.37
2026-02-08T02:21:51,641 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.device-ownership-service/0.21.2
2026-02-08T02:21:51,642 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2
2026-02-08T02:21:51,646 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)]
2026-02-08T02:21:51,647 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | com.googlecode.json-simple/1.1.1
2026-02-08T02:21:51,647 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.topology-manager/0.21.2
2026-02-08T02:21:51,650 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-remoterpc-connector/12.0.3
2026-02-08T02:21:51,653 | INFO | features-3-thread-1 | OSGiRemoteOpsProvider | 196 - org.opendaylight.controller.sal-remoterpc-connector - 12.0.3 | Remote Operations service starting
2026-02-08T02:21:51,655 | INFO | features-3-thread-1 | OSGiRemoteOpsProvider | 196 - org.opendaylight.controller.sal-remoterpc-connector - 12.0.3 | Remote Operations service started
2026-02-08T02:21:51,656 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server/10.0.2
2026-02-08T02:21:51,658 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.model.sal-remote/10.0.2
2026-02-08T02:21:51,658 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.rfc8639-impl/10.0.2
2026-02-08T02:21:51,660 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.cds-dom-api/12.0.3
2026-02-08T02:21:51,661 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.keystore-none/10.0.2
2026-02-08T02:21:51,661 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.containers.jersey-container-servlet/2.47.0
2026-02-08T02:21:51,661 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.inject.jersey-hk2/2.47.0
2026-02-08T02:21:51,662 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl/0.21.2
2026-02-08T02:21:51,663 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.of-switch-config-pusher/0.21.2
2026-02-08T02:21:51,664 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.yanglib-mdsal-writer/10.0.2
2026-02-08T02:21:51,665 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.yangtools.yang-data-transform/14.0.20
2026-02-08T02:21:51,665 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.openflowjava.blueprint-config/0.21.2
2026-02-08T02:21:51,687 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.infrautils.diagstatus-shell/7.1.9
2026-02-08T02:21:51,698 | INFO | features-3-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.9
2026-02-08T02:21:51,699 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-nb/10.0.2
2026-02-08T02:21:51,702 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.locator/2.6.1
2026-02-08T02:21:51,704 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.aaa.filterchain/0.22.3
2026-02-08T02:21:51,714 | INFO | features-3-thread-1 | CustomFilterAdapterConfigurationImpl | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Custom filter properties updated: {service.pid=org.opendaylight.aaa.filterchain, osgi.ds.satisfying.condition.target=(osgi.condition.id=true), customFilterList=, component.name=org.opendaylight.aaa.filterchain.configuration.impl.CustomFilterAdapterConfigurationImpl, felix.fileinstall.filename=file:/tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.aaa.filterchain.cfg, component.id=127, Filter.target=(org.opendaylight.aaa.filterchain.filter=true)}
2026-02-08T02:21:51,714 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-jaxrs/10.0.2
2026-02-08T02:21:51,717 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.sal-remote-impl/10.0.2
2026-02-08T02:21:51,720 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.restconf-server-mdsal/10.0.2
2026-02-08T02:21:51,725 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.utils/2.6.1
2026-02-08T02:21:51,727 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.hk2.api/2.6.1
2026-02-08T02:21:51,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.ws.rs-api/2.1.6
2026-02-08T02:21:51,729 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.core.jersey-server/2.47.0
2026-02-08T02:21:51,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | jakarta.activation-api/1.2.2
2026-02-08T02:21:51,730 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.glassfish.jersey.core.jersey-common/2.47.0
2026-02-08T02:21:51,731 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.common/4.2.7.Final
2026-02-08T02:21:51,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | io.netty.buffer/4.2.7.Final
2026-02-08T02:21:51,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.api/0.21.2
2026-02-08T02:21:51,732 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.srm-impl/0.21.2
2026-02-08T02:21:51,737 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2
2026-02-08T02:21:51,741 | INFO | features-3-thread-1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2026-02-08T02:21:51,746 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2026-02-08T02:21:51,746 | INFO | features-3-thread-1 | OpenflowServiceRecoveryHandlerImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | Registering openflowplugin service recovery handlers
2026-02-08T02:21:51,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.truststore-api/10.0.2
2026-02-08T02:21:51,747 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.openflowplugin.blueprint-config/0.21.2
2026-02-08T02:21:51,748 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.netconf.keystore-api/10.0.2
2026-02-08T02:21:51,749 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | org.opendaylight.controller.sal-cluster-admin-impl/12.0.3
2026-02-08T02:21:51,752 | INFO | features-3-thread-1 | Activator | 100 - org.apache.karaf.deployer.features - 4.4.8 | Deployment finished. Registering FeatureDeploymentListener
2026-02-08T02:21:52,089 | INFO | features-3-thread-1 | FeaturesServiceImpl | 18 - org.apache.karaf.features.core - 4.4.8 | Done.
2026-02-08T02:21:52,311 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#122377365]], but this node is not initialized yet
2026-02-08T02:21:52,328 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon#2031202064]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550]
2026-02-08T02:21:52,339 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] is JOINING itself (with roles [member-1, dc-default], version [0.0.0]) and forming new cluster
2026-02-08T02:21:52,340 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - is the new leader among reachable nodes (more leaders may exist)
2026-02-08T02:21:52,345 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Up]
2026-02-08T02:21:52,352 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.226:2550
2026-02-08T02:21:52,353 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-default-config
2026-02-08T02:21:52,353 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-topology-config
2026-02-08T02:21:52,353 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-inventory-config
2026-02-08T02:21:52,353 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-toaster-config
2026-02-08T02:21:52,352 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.226:2550
2026-02-08T02:21:52,354 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-default-operational
2026-02-08T02:21:52,354 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-topology-operational
2026-02-08T02:21:52,355 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-inventory-operational
2026-02-08T02:21:52,355 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-toaster-operational
2026-02-08T02:21:52,360 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist).
2026-02-08T02:21:52,365 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Singleton manager starting singleton actor [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor]
2026-02-08T02:21:52,366 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | ClusterSingletonManager state change [Start -> Oldest]
2026-02-08T02:21:53,368 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Singleton identified at [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor]
2026-02-08T02:22:01,554 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate): Starting new election term 1
2026-02-08T02:22:01,555 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2026-02-08T02:22:01,555 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Candidate): Starting new election term 1
2026-02-08T02:22:01,555 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2026-02-08T02:22:01,556 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Follower to Candidate
2026-02-08T02:22:01,556 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Follower to Candidate
2026-02-08T02:22:01,556 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Follower to Candidate
2026-02-08T02:22:01,556 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate): Starting new election term 1
2026-02-08T02:22:01,556 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Follower to Candidate
2026-02-08T02:22:01,556 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2026-02-08T02:22:01,557 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , received role change from Follower to Candidate
2026-02-08T02:22:01,557 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-default-config from Follower to Candidate
2026-02-08T02:22:01,561 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate): Starting new election term 1
2026-02-08T02:22:01,561 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2026-02-08T02:22:01,562 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Follower to Candidate
2026-02-08T02:22:01,562 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Follower to Candidate
2026-02-08T02:22:01,584 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Candidate): Starting new election term 1
2026-02-08T02:22:01,585 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2026-02-08T02:22:01,585 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from Follower to Candidate
2026-02-08T02:22:01,585 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from Follower to Candidate
2026-02-08T02:22:01,586 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate): Starting new election term 1
2026-02-08T02:22:01,586 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate): Starting new election term 1
2026-02-08T02:22:01,586 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2026-02-08T02:22:01,586 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2026-02-08T02:22:01,587 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Follower to Candidate
2026-02-08T02:22:01,587 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Follower to Candidate
2026-02-08T02:22:01,587 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Follower to Candidate
2026-02-08T02:22:01,587 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Follower to Candidate
2026-02-08T02:22:01,602 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Starting new election term 1
2026-02-08T02:22:01,602 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 1
2026-02-08T02:22:01,603 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Follower to Candidate
2026-02-08T02:22:01,603 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from Follower to Candidate
2026-02-08T02:22:03,021 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-1223697346]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550]
2026-02-08T02:22:03,022 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-1223697346]] (version [1.2.1])
2026-02-08T02:22:03,095 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node [pekko://opendaylight-cluster-data@10.30.170.53:2550] is JOINING, roles [member-2, dc-default], version [0.0.0]
2026-02-08T02:22:03,097 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#2087164173] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:22:03,097 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#1633889788] was unhandled. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:22:03,927 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.53:2550] to [Up]
2026-02-08T02:22:03,927 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational
2026-02-08T02:22:03,927 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config
2026-02-08T02:22:03,928 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational
2026-02-08T02:22:03,929 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config
2026-02-08T02:22:03,929 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config
2026-02-08T02:22:03,929 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config
2026-02-08T02:22:03,929 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config
2026-02-08T02:22:03,929 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config
2026-02-08T02:22:03,929 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config
2026-02-08T02:22:03,929 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config
2026-02-08T02:22:05,862 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#122377365]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550]
2026-02-08T02:22:05,863 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#122377365]] (version [1.2.1])
2026-02-08T02:22:05,909 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.87:2550] is JOINING, roles [member-3, dc-default], version [0.0.0]
2026-02-08T02:22:05,910 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#1633889788] was unhandled. [3] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:22:05,910 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#2087164173] was unhandled. [4] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:22:05,966 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.171.87:2550] to [Up] 2026-02-08T02:22:05,967 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550 2026-02-08T02:22:05,967 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-topology-config 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-toaster-config with address 
pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-toaster-config 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-topology-config 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardInformation | 195 - 
org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to 
pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-toaster-config 2026-02-08T02:22:05,968 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config 2026-02-08T02:22:11,328 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-inventory-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2026-02-08T02:22:11,341 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2026-02-08T02:22:11,342 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Candidate to Follower 2026-02-08T02:22:11,343 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for 
member-1-shard-inventory-operational from Candidate to Follower 2026-02-08T02:22:11,360 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@5f66c916 2026-02-08T02:22:11,361 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-inventory-operational status sync done false 2026-02-08T02:22:11,362 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-default-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2026-02-08T02:22:11,376 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2026-02-08T02:22:11,377 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , received role change from Candidate to Follower 2026-02-08T02:22:11,377 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-default-config from Candidate to Follower 2026-02-08T02:22:11,382 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - 
org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2026-02-08T02:22:11,382 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@50dbb2ef 2026-02-08T02:22:11,383 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-topology-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2026-02-08T02:22:11,384 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done false 2026-02-08T02:22:11,390 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-inventory-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2026-02-08T02:22:11,396 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2026-02-08T02:22:11,396 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | 
RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Candidate to Follower 2026-02-08T02:22:11,396 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from Candidate to Follower 2026-02-08T02:22:11,398 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2026-02-08T02:22:11,398 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Candidate to Follower 2026-02-08T02:22:11,399 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Candidate to Follower 2026-02-08T02:22:11,403 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done false 2026-02-08T02:22:11,404 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@369db1cd 2026-02-08T02:22:11,404 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | 
ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3ab31ace 2026-02-08T02:22:11,404 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done false 2026-02-08T02:22:11,405 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2026-02-08T02:22:11,405 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Candidate to Follower 2026-02-08T02:22:11,406 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Candidate to Follower 2026-02-08T02:22:11,409 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2f8d7409 2026-02-08T02:22:11,410 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done false 2026-02-08T02:22:11,421 | INFO | 
opendaylight-cluster-data-shard-dispatcher-33 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-default-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2026-02-08T02:22:11,421 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-toaster-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2026-02-08T02:22:11,430 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Candidate): Term 2 in "RequestVote{term=2, candidateId=member-2-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 1 - switching to Follower 2026-02-08T02:22:11,439 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2026-02-08T02:22:11,440 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Candidate to Follower 2026-02-08T02:22:11,440 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Candidate to Follower 2026-02-08T02:22:11,441 | INFO | 
opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@40177a96 2026-02-08T02:22:11,442 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done false 2026-02-08T02:22:11,443 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2026-02-08T02:22:11,445 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Candidate to Follower 2026-02-08T02:22:11,445 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Candidate to Follower 2026-02-08T02:22:11,445 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@60972411 2026-02-08T02:22:11,446 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: All Shards are ready - data store config is ready 2026-02-08T02:22:11,446 | INFO | 
opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-toaster-config status sync done false 2026-02-08T02:22:11,448 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 2 2026-02-08T02:22:11,448 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-toaster-operational status sync done false 2026-02-08T02:22:11,448 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from Candidate to Follower 2026-02-08T02:22:11,448 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from Candidate to Follower 2026-02-08T02:22:11,448 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | OSGiDOMStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Datastore service type CONFIGURATION activated 2026-02-08T02:22:11,449 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@341f4f87 2026-02-08T02:22:11,449 | INFO | 
opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: All Shards are ready - data store operational is ready 2026-02-08T02:22:11,449 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | OSGiDistributedDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Distributed Datastore type CONFIGURATION started 2026-02-08T02:22:11,463 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ConcurrentDOMDataBroker | 359 - org.opendaylight.yangtools.util - 14.0.20 | ThreadFactory created: CommitFutures 2026-02-08T02:22:11,465 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | DataBrokerCommitExecutor | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | DOM Data Broker commit exector started 2026-02-08T02:22:11,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | OSGiDOMStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Datastore service type OPERATIONAL activated 2026-02-08T02:22:11,468 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ConcurrentDOMDataBroker | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | DOM Data Broker started 2026-02-08T02:22:11,472 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for DataBroker activated 2026-02-08T02:22:11,485 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | OSGiPasswordServiceConfigBootstrap | 171 - org.opendaylight.aaa.password-service-impl - 0.22.3 | Listening for password service configuration 2026-02-08T02:22:11,513 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for 
dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService)] 2026-02-08T02:22:11,515 | ERROR | opendaylight-cluster-data-notification-dispatcher-41 | H2Store | 168 - org.opendaylight.aaa.idm-store-h2 - 0.22.3 | bundle org.opendaylight.aaa.idm-store-h2:0.22.3 (168)[org.opendaylight.aaa.datastore.h2.H2Store(96)] : Constructor argument 0 in class class org.opendaylight.aaa.datastore.h2.H2Store has unsupported type org.opendaylight.aaa.datastore.h2.ConnectionProvider 2026-02-08T02:22:11,520 | INFO | opendaylight-cluster-data-notification-dispatcher-41 | DefaultPasswordHashService | 171 - org.opendaylight.aaa.password-service-impl - 0.22.3 | DefaultPasswordHashService will utilize default iteration count=20000 2026-02-08T02:22:11,520 | INFO | opendaylight-cluster-data-notification-dispatcher-41 | DefaultPasswordHashService | 171 - org.opendaylight.aaa.password-service-impl - 0.22.3 | DefaultPasswordHashService will utilize default algorithm=SHA-512 2026-02-08T02:22:11,521 | INFO | opendaylight-cluster-data-notification-dispatcher-41 | DefaultPasswordHashService | 171 - org.opendaylight.aaa.password-service-impl - 0.22.3 | DefaultPasswordHashService will not utilize a private salt, since none was configured 2026-02-08T02:22:11,538 | INFO | opendaylight-cluster-data-notification-dispatcher-41 | H2Store | 168 - org.opendaylight.aaa.idm-store-h2 - 0.22.3 | H2 IDMStore activated 2026-02-08T02:22:11,541 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for 
dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2026-02-08T02:22:11,544 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2026-02-08T02:22:11,564 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#793179678], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent} 2026-02-08T02:22:11,567 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, 
backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#793179678], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} 2026-02-08T02:22:11,570 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.22.3 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)] 2026-02-08T02:22:11,588 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#793179678], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 21.92 ms 2026-02-08T02:22:11,606 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, Initial app config DatastoreConfig] 2026-02-08T02:22:11,640 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Loading properties from '(urn:opendaylight:params:xml:ns:yang:openflow:provider:config?revision=2016-05-10)openflow-provider-config' YANG file 
2026-02-08T02:22:11,642 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | rpc-requests-quota configuration property was changed to '20000'
2026-02-08T02:22:11,642 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | global-notification-quota configuration property was changed to '64000'
2026-02-08T02:22:11,642 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | switch-features-mandatory configuration property was changed to 'false'
2026-02-08T02:22:11,642 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | enable-flow-removed-notification configuration property was changed to 'true'
2026-02-08T02:22:11,642 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-statistics-rpc-enabled configuration property was changed to 'false'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | barrier-count-limit configuration property was changed to '25600'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | barrier-interval-timeout-limit configuration property was changed to '500'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | echo-reply-timeout configuration property was changed to '2000'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-table-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-flow-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-group-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-meter-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-port-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-queue-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | skip-table-features configuration property was changed to 'true'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | basic-timer-delay configuration property was changed to '3000'
2026-02-08T02:22:11,643 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), Initial app config TopologyLldpDiscoveryConfig]
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | maximum-timer-delay configuration property was changed to '900000'
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | use-single-layer-serialization configuration property was changed to 'true'
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | thread-pool-min-threads configuration property was changed to '1'
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | thread-pool-max-threads configuration property was changed to '32000'
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | thread-pool-timeout configuration property was changed to '60'
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | device-connection-rate-limit-per-min configuration property was changed to '0'
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | device-connection-hold-time-in-seconds configuration property was changed to '0'
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | device-datastore-removal-delay configuration property was changed to '500'
2026-02-08T02:22:11,644 | INFO | Blueprint Extender: 2 | OSGiConfigurationServiceFactory | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Loading configuration from 'org.opendaylight.openflowplugin' configuration file
2026-02-08T02:22:11,646 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | felix.fileinstall.filename configuration property was changed to 'file:/tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.openflowplugin.cfg'
2026-02-08T02:22:11,646 | INFO | Blueprint Extender: 2 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | service.pid configuration property was changed to 'org.opendaylight.openflowplugin'
2026-02-08T02:22:11,648 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [Initial app config TopologyLldpDiscoveryConfig]
2026-02-08T02:22:11,648 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2026-02-08T02:22:11,648 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.21.2 has been started
2026-02-08T02:22:11,648 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.openflowplugin.impl_0.21.2 [310] was successfully created
2026-02-08T02:22:11,651 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 is waiting for dependencies [Initial app config LldpSpeakerConfig]
2026-02-08T02:22:11,658 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | EOSClusterSingletonServiceProvider | 258 - org.opendaylight.mdsal.mdsal-singleton-impl - 15.0.2 | Cluster Singleton Service started
2026-02-08T02:22:11,671 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | FlowCapableTopologyProvider | 305 - org.opendaylight.openflowplugin.applications.topology-manager - 0.21.2 | Topology Manager service started.
2026-02-08T02:22:11,674 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)]
2026-02-08T02:22:11,680 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig]
2026-02-08T02:22:11,682 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)]
2026-02-08T02:22:11,795 | INFO | Blueprint Extender: 3 | LLDPSpeaker | 301 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.21.2 | LLDPSpeaker started, it will send LLDP frames each 5 seconds
2026-02-08T02:22:11,827 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done true
2026-02-08T02:22:11,839 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | org.opendaylight.openflowplugin.applications.topology.lldp.LLDPLinkAger@36da9328 was registered as configuration listener to OpenFlowPlugin configuration service
2026-02-08T02:22:11,867 | INFO | Blueprint Extender: 3 | NodeConnectorInventoryEventTranslator | 301 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.21.2 | NodeConnectorInventoryEventTranslator has started.
2026-02-08T02:22:11,868 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 has been started
2026-02-08T02:22:11,804 | WARN | CommitFutures-0 | OSGiEncryptionServiceConfigurator | 166 - org.opendaylight.aaa.encrypt-service-impl - 0.22.3 | Configuration update failed, attempting to continue
org.opendaylight.mdsal.common.api.OptimisticLockFailedException: Optimistic lock failed for path /(config:aaa:authn:encrypt:service:config?revision=2024-02-02)aaa-encrypt-service-config
	at org.opendaylight.controller.cluster.datastore.ShardDataTree.canCommitEntry(ShardDataTree.java:779) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.ShardDataTree.processNextPendingTransaction(ShardDataTree.java:758) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.ShardDataTree.startCanCommit(ShardDataTree.java:847) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.CommitCohort.canCommit(CommitCohort.java:125) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.directCommit(FrontendReadWriteTransaction.java:558) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.handleModifyTransaction(FrontendReadWriteTransaction.java:697) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.FrontendReadWriteTransaction.doHandleRequest(FrontendReadWriteTransaction.java:360) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.FrontendTransaction.handleRequest(FrontendTransaction.java:135) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.AbstractFrontendHistory.handleTransactionRequest(AbstractFrontendHistory.java:122) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.LeaderFrontendState$Enabled.handleTransactionRequest(LeaderFrontendState.java:133) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequest(Shard.java:520) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleRequestEnvelope(Shard.java:350) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.Shard.handleNonRaftCommand(Shard.java:307) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleCommandImpl(RaftActor.java:391) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.raft.RaftActor.handleReceive(RaftActor.java:336) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) ~[?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) ~[?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) ~[?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) ~[?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) ~[?:?]
Caused by: org.opendaylight.yangtools.yang.data.tree.api.ConflictingModificationAppliedException: Node was created by other transaction.
	at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkConflicting(SchemaAwareApplyOperation.java:69) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkWriteApplicable(SchemaAwareApplyOperation.java:172) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkApplicable(SchemaAwareApplyOperation.java:102) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.AbstractNodeContainerModificationStrategy.checkChildPreconditions(AbstractNodeContainerModificationStrategy.java:441) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.AbstractNodeContainerModificationStrategy.checkTouchApplicable(AbstractNodeContainerModificationStrategy.java:400) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.SchemaAwareApplyOperation.checkApplicable(SchemaAwareApplyOperation.java:101) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.validate(InMemoryDataTreeModification.java:615) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.lockedValidate(InMemoryDataTreeModification.java:625) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.InMemoryDataTreeModification.validate(InMemoryDataTreeModification.java:603) ~[bundleFile:14.0.20]
	at org.opendaylight.yangtools.yang.data.tree.impl.AbstractDataTreeTip.validate(AbstractDataTreeTip.java:33) ~[bundleFile:14.0.20]
	at org.opendaylight.controller.cluster.datastore.ShardDataTree.canCommitEntry(ShardDataTree.java:772) ~[bundleFile:?]
	... 42 more
2026-02-08T02:22:11,875 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.lldp-speaker_0.21.2 [301] was successfully created
2026-02-08T02:22:11,886 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-976788989], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2026-02-08T02:22:11,886 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-976788989], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:22:11,886 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done true
2026-02-08T02:22:11,887 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-976788989], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 1.335 ms
2026-02-08T02:22:11,897 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ArbitratorReconciliationManagerImpl | 297 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.21.2 | ArbitratorReconciliationManager has started successfully.
2026-02-08T02:22:11,904 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | DeviceOwnershipService started
2026-02-08T02:22:11,905 | INFO | Blueprint Extender: 1 | LLDPActivator | 304 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.21.2 | Starting LLDPActivator with lldpSecureKey: aa9251f8-c7c0-4322-b8d6-c3a84593bda3
2026-02-08T02:22:11,909 | INFO | Blueprint Extender: 1 | LLDPActivator | 304 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.21.2 | LLDPDiscoveryListener started.
2026-02-08T02:22:11,911 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 has been started
2026-02-08T02:22:11,911 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery_0.21.2 [304] was successfully created
2026-02-08T02:22:11,912 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | DefaultConfigPusher | 302 - org.opendaylight.openflowplugin.applications.of-switch-config-pusher - 0.21.2 | DefaultConfigPusher has started.
2026-02-08T02:22:11,913 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-1625142395], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}
2026-02-08T02:22:11,913 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-1625142395], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}}
2026-02-08T02:22:11,913 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-1625142395], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} in 429.9 μs
2026-02-08T02:22:11,915 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done true
2026-02-08T02:22:11,915 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done true
2026-02-08T02:22:11,920 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | YangLibraryWriterSingleton | 292 - org.opendaylight.netconf.yanglib-mdsal-writer - 10.0.2 | ietf-yang-library writer registered
2026-02-08T02:22:11,925 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done true
2026-02-08T02:22:11,945 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | OSGiSwitchConnectionProviders | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | MD-SAL configuration-based SwitchConnectionProviders started
2026-02-08T02:22:11,972 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-toaster-operational status sync done true
2026-02-08T02:22:11,972 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-toaster-config status sync done true
2026-02-08T02:22:11,970 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.srm-shell/0.21.2
2026-02-08T02:22:11,990 | INFO | opendaylight-cluster-data-notification-dispatcher-41 | AAAEncryptionServiceImpl | 166 - org.opendaylight.aaa.encrypt-service-impl - 0.22.3 | AAAEncryptionService activated
2026-02-08T02:22:11,991 | INFO | opendaylight-cluster-data-notification-dispatcher-41 | OSGiEncryptionServiceConfigurator | 166 - org.opendaylight.aaa.encrypt-service-impl - 0.22.3 | Encryption Service enabled
2026-02-08T02:22:12,015 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [Initial app config ForwardingRulesManagerConfig]
2026-02-08T02:22:12,040 | INFO | Blueprint Extender: 2 | LazyBindingList | 326 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.20 | Using lazy population for lists larger than 16 element(s)
2026-02-08T02:22:12,044 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | FlowCapableTopologyProvider | 305 - org.opendaylight.openflowplugin.applications.topology-manager - 0.21.2 | Topology node flow:1 is successfully written to the operational datastore.
2026-02-08T02:22:12,049 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | OSGiFactorySwitchConnectionConfiguration | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}]
2026-02-08T02:22:12,060 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | OSGiFactorySwitchConnectionConfiguration | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}]
2026-02-08T02:22:12,077 | INFO | CommitFutures-1 | OSGiFactorySwitchConnectionConfiguration | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] populated
2026-02-08T02:22:12,087 | INFO | Blueprint Extender: 2 | AaaCertMdsalProvider | 164 - org.opendaylight.aaa.cert - 0.22.3 | AaaCertMdsalProvider Initialized
2026-02-08T02:22:12,088 | INFO | CommitFutures-0 | OSGiFactorySwitchConnectionConfiguration | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] populated
2026-02-08T02:22:12,084 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | org.opendaylight.openflowplugin.applications.frm.impl.ForwardingRulesManagerImpl@7bf87e20 was registered as configuration listener to OpenFlowPlugin configuration service
2026-02-08T02:22:12,093 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OSGiSwitchConnectionProviders | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Starting instance of type 'openflow-switch-connection-provider-default-impl'
2026-02-08T02:22:12,146 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | OSGiClusterAdmin | 192 - org.opendaylight.controller.sal-cluster-admin-impl - 12.0.3 | Cluster Admin services started
2026-02-08T02:22:12,149 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | OSGiDistributedDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Distributed Datastore type OPERATIONAL started
2026-02-08T02:22:12,313 | INFO | Blueprint Extender: 1 | ForwardingRulesManagerImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | ForwardingRulesManager has started successfully.
2026-02-08T02:22:12,322 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 has been started
2026-02-08T02:22:12,325 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager_0.21.2 [300] was successfully created
2026-02-08T02:22:12,386 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-inventory-operational status sync done true
2026-02-08T02:22:12,387 | INFO | Blueprint Extender: 2 | ODLKeyTool | 164 - org.opendaylight.aaa.cert - 0.22.3 | ctl.jks is created
2026-02-08T02:22:12,435 | INFO | Blueprint Extender: 2 | CertificateManagerService | 164 - org.opendaylight.aaa.cert - 0.22.3 | Certificate Manager service has been initialized
2026-02-08T02:22:12,446 | INFO | Blueprint Extender: 2 | CertificateManagerService | 164 - org.opendaylight.aaa.cert - 0.22.3 | AaaCert Rpc Service has been initialized
2026-02-08T02:22:12,449 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.22.3 has been started
2026-02-08T02:22:12,449 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.aaa.cert_0.22.3 [164] was successfully created
2026-02-08T02:22:12,498 | INFO | Blueprint Extender: 3 | StoreBuilder | 163 - org.opendaylight.aaa.authn-api - 0.22.3 | Checking if default entries must be created in IDM store
2026-02-08T02:22:12,628 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1161699537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}
2026-02-08T02:22:12,629 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1161699537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}}
2026-02-08T02:22:12,629 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1161699537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} in 720.6 μs
2026-02-08T02:22:12,675 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | OpenFlowPluginProvider started, waiting for onSystemBootReady()
2026-02-08T02:22:12,675 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@1ca14385
2026-02-08T02:22:12,684 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OnfExtensionProvider | 309 - org.opendaylight.openflowplugin.extension-onf - 0.21.2 | ONF Extension Provider started.
2026-02-08T02:22:12,685 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OSGiSwitchConnectionProviders | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Starting instance of type 'openflow-switch-connection-provider-legacy-impl'
2026-02-08T02:22:12,692 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@1349e784
2026-02-08T02:22:12,692 | INFO | Blueprint Extender: 3 | AbstractStore | 168 - org.opendaylight.aaa.idm-store-h2 - 0.22.3 | Table AAA_DOMAINS does not exist, creating it
2026-02-08T02:22:12,731 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:22:12,732 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:22:12,775 | INFO | Blueprint Extender: 3 | StoreBuilder | 163 - org.opendaylight.aaa.authn-api - 0.22.3 | Created default domain
2026-02-08T02:22:12,781 | INFO | Blueprint Extender: 3 | AbstractStore | 168 - org.opendaylight.aaa.idm-store-h2 - 0.22.3 | Table AAA_ROLES does not exist, creating it
2026-02-08T02:22:12,847 | INFO | Blueprint Extender: 3 | StoreBuilder | 163 - org.opendaylight.aaa.authn-api - 0.22.3 | Created 'admin' role
2026-02-08T02:22:12,876 | INFO | Blueprint Extender: 3 | StoreBuilder | 163 - org.opendaylight.aaa.authn-api - 0.22.3 | Created 'user' role
2026-02-08T02:22:13,013 | INFO | Blueprint Extender: 3 | AbstractStore | 168 - org.opendaylight.aaa.idm-store-h2 - 0.22.3 | Table AAA_USERS does not exist, creating it
2026-02-08T02:22:13,023 | INFO | Blueprint Extender: 3 | AbstractStore | 168 - org.opendaylight.aaa.idm-store-h2 - 0.22.3 | Table AAA_GRANTS does not exist, creating it
2026-02-08T02:22:13,112 | INFO | Blueprint Extender: 3 | AAAShiroProvider | 173 - org.opendaylight.aaa.shiro - 0.22.3 | AAAShiroProvider Session Initiated
2026-02-08T02:22:13,216 | INFO | Blueprint Extender: 3 | IniSecurityManagerFactory | 172 - org.opendaylight.aaa.repackaged-shiro - 0.22.3 | Realms have been explicitly set on the SecurityManager instance - auto-setting of realms will not occur.
2026-02-08T02:22:13,247 | INFO | paxweb-config-1-thread-1 | ServerModel | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-11,contextPath='/auth'}
2026-02-08T02:22:13,247 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=318, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=173, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}", size=2}
2026-02-08T02:22:13,247 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-11,contextPath='/auth'}
2026-02-08T02:22:13,248 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=318, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=173, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@58e889ee{/auth,null,STOPPED}
2026-02-08T02:22:13,249 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@58e889ee{/auth,null,STOPPED}
2026-02-08T02:22:13,251 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2026-02-08T02:22:13,252 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2}
2026-02-08T02:22:13,252 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:22:13,253 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 177 - org.opendaylight.aaa.web.osgi-impl - 0.22.3 | Bundle org.opendaylight.aaa.shiro_0.22.3 [173] registered context path /auth with 4 service(s)
2026-02-08T02:22:13,254 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/auth" with default Osgi Context OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=318, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=173, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}
2026-02-08T02:22:13,256 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Initializing CustomFilterAdapter
2026-02-08T02:22:13,257 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Injecting a new filter chain with 0 Filters:
2026-02-08T02:22:13,257 | INFO | paxweb-config-1-thread-1 | ContextHandler | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@58e889ee{/auth,null,AVAILABLE}
2026-02-08T02:22:13,257 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=318, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=173, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}} as OSGi service for "/auth" context path
2026-02-08T02:22:13,261 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:22:13,262 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2026-02-08T02:22:13,262 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2}
2026-02-08T02:22:13,262 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:22:13,263 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:22:13,263 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2026-02-08T02:22:13,263 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=1}
2026-02-08T02:22:13,263 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2026-02-08T02:22:13,286 | ERROR | Blueprint Extender: 3 | MdsalRestconfServer | 279 - org.opendaylight.netconf.restconf-server-mdsal - 10.0.2 | bundle org.opendaylight.netconf.restconf-server-mdsal:10.0.2 (279)[org.opendaylight.restconf.server.mdsal.MdsalRestconfServer(135)] : Constructor argument 5 in class class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer has unsupported type [Lorg.opendaylight.restconf.server.spi.RpcImplementation;
2026-02-08T02:22:13,373 | INFO | Blueprint Extender: 3 | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.netconf.restconf-server-jaxrs_10.0.2 [278]]
2026-02-08T02:22:13,374 | INFO | paxweb-config-1-thread-1 | ServerModel | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-18,contextPath='/rests'}
2026-02-08T02:22:13,374 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=330, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}", size=2}
2026-02-08T02:22:13,374 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-18,contextPath='/rests'}
2026-02-08T02:22:13,375 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=330, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@67bc5244{/rests,null,STOPPED}
2026-02-08T02:22:13,376 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 177 - org.opendaylight.aaa.web.osgi-impl - 0.22.3 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_10.0.2 [278] registered context path /rests with 4 service(s)
2026-02-08T02:22:13,378 | INFO | Blueprint Extender: 3 | WhiteboardWebServer | 177 - org.opendaylight.aaa.web.osgi-impl - 0.22.3 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_10.0.2 [278] registered context path /.well-known with 3 service(s)
2026-02-08T02:22:13,379 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@67bc5244{/rests,null,STOPPED}
2026-02-08T02:22:13,379 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2026-02-08T02:22:13,380 | INFO | Blueprint Extender: 3 | YangLibraryWriterSingleton | 292 - org.opendaylight.netconf.yanglib-mdsal-writer - 10.0.2 | Binding URL provider org.opendaylight.restconf.server.jaxrs.JaxRsYangLibrary@6911b21c
2026-02-08T02:22:13,380 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2}
2026-02-08T02:22:13,380 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:22:13,380 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2026-02-08T02:22:13,380 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/rests" with default Osgi Context OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=330, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}
2026-02-08T02:22:13,381 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Initializing CustomFilterAdapter
2026-02-08T02:22:13,381 | INFO | paxweb-config-1-thread-1 | CustomFilterAdapter | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Injecting a new filter chain with 0 Filters:
2026-02-08T02:22:13,381 | INFO | paxweb-config-1-thread-1 | ContextHandler | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@67bc5244{/rests,null,AVAILABLE}
2026-02-08T02:22:13,381 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=330, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}} as OSGi service for "/rests" context path
2026-02-08T02:22:13,381 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2}
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=1}
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | ServerModel | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-27,contextPath='/.well-known'}
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=334, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}", size=2}
2026-02-08T02:22:13,382 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-27,contextPath='/.well-known'}
2026-02-08T02:22:13,383 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=334, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@a56dfb8{/.well-known,null,STOPPED}
2026-02-08T02:22:13,383 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@a56dfb8{/.well-known,null,STOPPED}
2026-02-08T02:22:13,384 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-24,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}
2026-02-08T02:22:13,384 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-24,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}", size=2}
2026-02-08T02:22:13,384 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:22:13,384 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2026-02-08T02:22:13,384 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known
2026-02-08T02:22:13,384 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/.well-known" with default Osgi Context OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=334, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}
2026-02-08T02:22:13,384 | INFO | paxweb-config-1-thread-1 | ContextHandler | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@a56dfb8{/.well-known,null,AVAILABLE}
2026-02-08T02:22:13,384 | INFO | paxweb-config-1-thread-1 | OsgiServletContext | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=334, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}} as OSGi service for "/.well-known" context path
2026-02-08T02:22:13,385 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:22:13,385 | INFO | paxweb-config-1-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-25,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}
2026-02-08T02:22:13,385 | INFO | paxweb-config-1-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-25,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}", size=1}
2026-02-08T02:22:13,385 | INFO | paxweb-config-1-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-25,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}
2026-02-08T02:22:13,408 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4AddressNoZone
2026-02-08T02:22:13,408 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4Prefix
2026-02-08T02:22:13,409 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6AddressNoZone
2026-02-08T02:22:13,409 | INFO | Blueprint Extender: 3 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6Prefix
2026-02-08T02:22:13,444 | INFO | Blueprint Extender: 3 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 10.0.2 | Initialized with service class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer
2026-02-08T02:22:13,444 | INFO | Blueprint Extender: 3 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 10.0.2 | Initialized with base path: /restconf, default encoding: JSON, default pretty print: false
2026-02-08T02:22:13,493 | INFO | Blueprint Extender: 3 | OSGiNorthbound | 275 - org.opendaylight.netconf.restconf-nb - 10.0.2 | Global RESTCONF northbound pools started
2026-02-08T02:22:13,495 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 has been started
2026-02-08T02:22:13,495 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.aaa.shiro_0.22.3 [173] was successfully created
2026-02-08T02:22:13,528 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2026-02-08T02:22:13,566 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:22:14,086 | INFO | SystemReadyService-0 | KarafSystemReady | 242 - org.opendaylight.infrautils.ready-impl - 7.1.9 | checkBundleDiagInfos: Elapsed time 24s, remaining time 275s, diag: Active {INSTALLED=0, RESOLVED=10, UNKNOWN=0, GRACE_PERIOD=0, WAITING=0, STARTING=0, ACTIVE=396, STOPPING=0, FAILURE=0}
2026-02-08T02:22:14,086 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 241 - org.opendaylight.infrautils.ready-api - 7.1.9 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.]
2026-02-08T02:22:14,086 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 241 - org.opendaylight.infrautils.ready-api - 7.1.9 | Now notifying all its registered SystemReadyListeners...
2026-02-08T02:22:14,086 | INFO | SystemReadyService-0 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | onSystemBootReady() received, starting the switch connections
2026-02-08T02:22:14,198 | INFO | multiThreadIoEventLoopGroup-4-1 | TcpServerFacade | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6633
2026-02-08T02:22:14,198 | INFO | multiThreadIoEventLoopGroup-2-1 | TcpServerFacade | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6653
2026-02-08T02:22:14,198 | INFO | multiThreadIoEventLoopGroup-4-1 | SwitchConnectionProviderImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6633
2026-02-08T02:22:14,198 | INFO | multiThreadIoEventLoopGroup-2-1 | SwitchConnectionProviderImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6653
2026-02-08T02:22:14,199 | INFO | multiThreadIoEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@1349e784 started
2026-02-08T02:22:14,199 | INFO | multiThreadIoEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@1ca14385 started
2026-02-08T02:22:14,199 | INFO | multiThreadIoEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | All switchConnectionProviders are up and running (2).
2026-02-08T02:25:00,222 | INFO | sshd-SshServer[613547ef](port=8101)-nio2-thread-1 | OpenSSHKeyPairProvider | 122 - org.apache.karaf.shell.ssh - 4.4.8 | Creating ssh server private key at /tmp/karaf-0.23.1-SNAPSHOT/etc/host.key
2026-02-08T02:25:00,225 | INFO | sshd-SshServer[613547ef](port=8101)-nio2-thread-1 | OpenSSHKeyPairGenerator | 122 - org.apache.karaf.shell.ssh - 4.4.8 | generateKeyPair(RSA) generating host key - size=2048
2026-02-08T02:25:00,901 | INFO | sshd-SshServer[613547ef](port=8101)-nio2-thread-2 | ServerSessionImpl | 126 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.170:54344 authenticated
2026-02-08T02:25:03,558 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/010__Cluster_Reconcilliation_Multi_DPN.robot" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/010__Cluster_Reconcilliation_Multi_DPN.robot
2026-02-08T02:25:04,236 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Check Shards Status And Initialize Variables" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Check Shards Status And Initialize Variables
2026-02-08T02:25:05,040 | INFO | qtp1021661463-508 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 10.0.2 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040)
2026-02-08T02:25:05,044 | INFO | qtp1021661463-508 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 10.0.2 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040)
2026-02-08T02:25:05,822 | INFO | qtp1021661463-508 | AuthenticationManager | 175 - org.opendaylight.aaa.tokenauthrealm - 0.22.3 | Authentication is now enabled
2026-02-08T02:25:05,823 | INFO | qtp1021661463-508 | AuthenticationManager | 175 - org.opendaylight.aaa.tokenauthrealm - 0.22.3 | Authentication Manager activated
2026-02-08T02:25:05,875 | INFO | qtp1021661463-508 | ApiPathParser | 273 - org.opendaylight.netconf.restconf-api - 10.0.2 | Consecutive slashes in REST URLs will be rejected
2026-02-08T02:25:13,035 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Inventory Follower and Leader Before Cluster Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Inventory Follower and Leader Before Cluster Restart
2026-02-08T02:25:14,232 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node1
2026-02-08T02:25:17,278 | INFO | multiThreadIoEventLoopGroup-5-1 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:43044, NodeId:null
2026-02-08T02:25:17,342 | INFO | multiThreadIoEventLoopGroup-5-2 | ConnectionAdapterImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Hello received
2026-02-08T02:25:17,463 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Add Bulk Flow From Follower" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Add Bulk Flow From Follower
2026-02-08T02:25:17,750 | INFO | qtp1021661463-506 | StaticConfiguration | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding-over-DOM codec shortcuts are enabled
2026-02-08T02:25:17,760 | INFO | qtp1021661463-506 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl
2026-02-08T02:25:17,760 | INFO | qtp1021661463-506 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl
2026-02-08T02:25:17,764 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Number of Txn for dpId: openflow:1 is: 1
2026-02-08T02:25:17,764 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@3343c799 for dpid: openflow:1
2026-02-08T02:25:17,839 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connected.
2026-02-08T02:25:17,840 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | No context chain found for device: openflow:1, creating new.
2026-02-08T02:25:17,840 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Device connected to controller, Device:/10.30.170.67:43046, NodeId:Uri{value=openflow:1} 2026-02-08T02:25:17,867 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#975067533], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent} 2026-02-08T02:25:17,868 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#975067533], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2026-02-08T02:25:17,868 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, 
backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#975067533], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 467.0 μs 2026-02-08T02:25:17,896 | INFO | multiThreadIoEventLoopGroup-5-2 | RoleContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s. 2026-02-08T02:25:17,947 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2026-02-08T02:25:18,027 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2026-02-08T02:25:18,027 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting DeviceContextImpl[NEW] service for node openflow:1 2026-02-08T02:25:18,040 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RpcContextImpl[NEW] service for node openflow:1 2026-02-08T02:25:18,109 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting StatisticsContextImpl[NEW] service for node openflow:1 2026-02-08T02:25:18,109 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting 
RoleContextImpl[NEW] service for node openflow:1 2026-02-08T02:25:18,111 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}} 2026-02-08T02:25:18,111 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Requesting state change to BECOMEMASTER 2026-02-08T02:25:18,111 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER 2026-02-08T02:25:18,111 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | getGenerationIdFromDevice called for device: openflow:1 2026-02-08T02:25:18,127 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER 2026-02-08T02:25:18,128 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started clustering services for node openflow:1 2026-02-08T02:25:18,129 | INFO | multiThreadIoEventLoopGroup-5-2 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER 2026-02-08T02:25:18,135 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}} 2026-02-08T02:25:18,141 | INFO | 
pool-18-thread-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connection is enabled by reconciliation framework. 2026-02-08T02:25:18,163 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.170.67}} 2026-02-08T02:25:18,164 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Port number of the node openflow:1 is: 43046 2026-02-08T02:25:18,307 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPMETERFEATURES collected 2026-02-08T02:25:18,319 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPGROUPFEATURES collected 2026-02-08T02:25:18,334 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.MacAddress 2026-02-08T02:25:18,335 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.PhysAddress 2026-02-08T02:25:18,335 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.HexString 2026-02-08T02:25:18,336 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 
| Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.DottedQuad 2026-02-08T02:25:18,336 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.Uuid 2026-02-08T02:25:18,337 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPPORTDESC collected 2026-02-08T02:25:18,365 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 successfully finished collecting 2026-02-08T02:25:18,427 | INFO | pool-18-thread-1 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 is able to work as master 2026-02-08T02:25:18,428 | INFO | pool-18-thread-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role MASTER was granted to device openflow:1 2026-02-08T02:25:18,431 | INFO | pool-18-thread-1 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node added notification for Uri{value=openflow:1} 2026-02-08T02:25:18,434 | INFO | pool-18-thread-1 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting statistics gathering for node openflow:1 2026-02-08T02:25:18,457 | INFO | opendaylight-cluster-data-notification-dispatcher-46 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1 2026-02-08T02:25:18,467 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | LazyBindingMap | 326 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.20 | Using lazy population for maps larger than 1 element(s) 2026-02-08T02:25:18,839 | 
INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed FlowHandlerTask thread for dpid: openflow:1 2026-02-08T02:25:19,247 | INFO | CommitFutures-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed all flows installation for: dpid: openflow:1 in 1486819434ns 2026-02-08T02:25:19,249 | INFO | CommitFutures-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@3343c799 closed successfully. 2026-02-08T02:25:20,058 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Get Bulk Flows and Verify In Inventory Leader 2026-02-08T02:25:49,871 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=19730, lastAppliedTerm=2, lastIndex=20008, lastTerm=2, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=278, mandatoryTrim=false] 2026-02-08T02:25:49,878 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=19730, term=2]/EntryInfo[index=20008, term=2] 2026-02-08T02:25:49,880 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted 
entries, adjusted snapshotIndex: 19729 and term: 2 2026-02-08T02:25:49,914 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:25:49.879910143Z 2026-02-08T02:25:50,327 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Before Cluster Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Before Cluster Restart 2026-02-08T02:25:50,803 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 and Exit" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 and Exit 2026-02-08T02:25:51,176 | INFO | multiThreadIoEventLoopGroup-5-2 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:43046, NodeId:openflow:1 2026-02-08T02:25:51,176 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 disconnected. 
2026-02-08T02:25:51,177 | INFO | multiThreadIoEventLoopGroup-5-2 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1} 2026-02-08T02:25:51,183 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node removed notification for Uri{value=openflow:1} 2026-02-08T02:25:51,185 | INFO | multiThreadIoEventLoopGroup-5-2 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1} 2026-02-08T02:25:51,185 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role SLAVE was granted to device openflow:1 2026-02-08T02:25:51,186 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RoleContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:25:51,186 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:25:51,187 | INFO | multiThreadIoEventLoopGroup-5-2 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1 2026-02-08T02:25:51,188 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RpcContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:25:51,189 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:25:51,190 | INFO | ofppool-0 | ContextChainImpl | 310 - 
org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services for node openflow:1 2026-02-08T02:25:51,191 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services registration for node openflow:1 2026-02-08T02:25:51,192 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:25:51,193 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:25:51,193 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:25:51,194 | INFO | multiThreadIoEventLoopGroup-5-2 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1 2026-02-08T02:25:51,195 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:25:51,247 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2026-02-08T02:25:51,247 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 
2026-02-08T02:25:51,773 | INFO | node-cleaner-0 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS 2026-02-08T02:25:53,231 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Reconnect To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Reconnect To Follower Node1 2026-02-08T02:25:55,796 | INFO | multiThreadIoEventLoopGroup-5-3 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:60350, NodeId:null 2026-02-08T02:25:55,855 | INFO | multiThreadIoEventLoopGroup-5-4 | ConnectionAdapterImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Hello received 2026-02-08T02:25:55,995 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Reconnected To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Reconnected To Follower Node1 2026-02-08T02:25:56,278 | INFO | multiThreadIoEventLoopGroup-5-4 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connected. 2026-02-08T02:25:56,280 | INFO | multiThreadIoEventLoopGroup-5-4 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | No context chain found for device: openflow:1, creating new. 
2026-02-08T02:25:56,280 | INFO | multiThreadIoEventLoopGroup-5-4 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Device connected to controller, Device:/10.30.170.67:60358, NodeId:Uri{value=openflow:1} 2026-02-08T02:25:56,281 | INFO | multiThreadIoEventLoopGroup-5-4 | RoleContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s. 2026-02-08T02:25:56,327 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2026-02-08T02:25:56,406 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2026-02-08T02:25:56,406 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting DeviceContextImpl[NEW] service for node openflow:1 2026-02-08T02:25:56,408 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RpcContextImpl[NEW] service for node openflow:1 2026-02-08T02:25:56,411 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting StatisticsContextImpl[NEW] service for node openflow:1 2026-02-08T02:25:56,412 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RoleContextImpl[NEW] service 
for node openflow:1 2026-02-08T02:25:56,412 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}} 2026-02-08T02:25:56,412 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Requesting state change to BECOMEMASTER 2026-02-08T02:25:56,413 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER 2026-02-08T02:25:56,413 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | getGenerationIdFromDevice called for device: openflow:1 2026-02-08T02:25:56,414 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started clustering services for node openflow:1 2026-02-08T02:25:56,415 | INFO | multiThreadIoEventLoopGroup-5-4 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER 2026-02-08T02:25:56,416 | INFO | multiThreadIoEventLoopGroup-5-4 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER 2026-02-08T02:25:56,419 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}} 2026-02-08T02:25:56,721 | INFO | pool-18-thread-2 | ContextChainHolderImpl | 310 - 
org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connection is enabled by reconciliation framework. 2026-02-08T02:25:56,724 | INFO | multiThreadIoEventLoopGroup-5-4 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.170.67}} 2026-02-08T02:25:56,724 | INFO | multiThreadIoEventLoopGroup-5-4 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Port number of the node openflow:1 is: 60358 2026-02-08T02:25:56,733 | INFO | multiThreadIoEventLoopGroup-5-4 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPMETERFEATURES collected 2026-02-08T02:25:56,733 | INFO | multiThreadIoEventLoopGroup-5-4 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPGROUPFEATURES collected 2026-02-08T02:25:56,735 | INFO | multiThreadIoEventLoopGroup-5-4 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPPORTDESC collected 2026-02-08T02:25:56,735 | INFO | multiThreadIoEventLoopGroup-5-4 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 successfully finished collecting 2026-02-08T02:25:56,748 | INFO | pool-18-thread-2 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 is able to work as master 2026-02-08T02:25:56,749 | INFO | pool-18-thread-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role MASTER was granted to device openflow:1 2026-02-08T02:25:56,749 | INFO | pool-18-thread-2 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node added notification for Uri{value=openflow:1} 2026-02-08T02:25:56,749 | INFO | pool-18-thread-2 | StatisticsContextImpl | 310 - 
org.opendaylight.openflowplugin.impl - 0.21.2 | Starting statistics gathering for node openflow:1 2026-02-08T02:25:56,751 | INFO | opendaylight-cluster-data-notification-dispatcher-43 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1 2026-02-08T02:25:58,450 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node1 2026-02-08T02:25:58,812 | INFO | multiThreadIoEventLoopGroup-5-4 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:60358, NodeId:openflow:1 2026-02-08T02:25:58,813 | INFO | multiThreadIoEventLoopGroup-5-4 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 disconnected. 
2026-02-08T02:25:58,813 | INFO | multiThreadIoEventLoopGroup-5-4 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1} 2026-02-08T02:25:58,815 | INFO | multiThreadIoEventLoopGroup-5-4 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node removed notification for Uri{value=openflow:1} 2026-02-08T02:25:58,816 | INFO | multiThreadIoEventLoopGroup-5-4 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1} 2026-02-08T02:25:58,816 | INFO | multiThreadIoEventLoopGroup-5-4 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role SLAVE was granted to device openflow:1 2026-02-08T02:25:58,819 | INFO | multiThreadIoEventLoopGroup-5-4 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RoleContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:25:58,819 | INFO | multiThreadIoEventLoopGroup-5-4 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:25:58,819 | INFO | multiThreadIoEventLoopGroup-5-4 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1 2026-02-08T02:25:58,819 | INFO | multiThreadIoEventLoopGroup-5-4 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RpcContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:25:58,823 | INFO | multiThreadIoEventLoopGroup-5-4 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:25:58,824 | INFO | multiThreadIoEventLoopGroup-5-4 | ContextChainImpl | 310 - 
org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services registration for node openflow:1 2026-02-08T02:25:58,824 | INFO | ofppool-0 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services for node openflow:1 2026-02-08T02:25:58,824 | INFO | multiThreadIoEventLoopGroup-5-4 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:25:58,825 | INFO | multiThreadIoEventLoopGroup-5-4 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:25:58,825 | INFO | multiThreadIoEventLoopGroup-5-4 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:25:58,825 | INFO | multiThreadIoEventLoopGroup-5-4 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1 2026-02-08T02:25:58,825 | INFO | multiThreadIoEventLoopGroup-5-4 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:25:58,897 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2026-02-08T02:25:58,898 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2026-02-08T02:25:59,403 | INFO | 
node-cleaner-0 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS
2026-02-08T02:26:00,853 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Follower Node2
2026-02-08T02:26:03,581 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Follower Node2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Follower Node2
2026-02-08T02:26:04,096 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:26:04,336 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:26:05,430 | INFO | opendaylight-cluster-data-notification-dispatcher-45 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1
2026-02-08T02:26:06,065 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Follower Node2
2026-02-08T02:26:06,496 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2026-02-08T02:26:06,776 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2026-02-08T02:26:07,282 | INFO | node-cleaner-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS
2026-02-08T02:26:08,406 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Inventory Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Start Mininet Connect To Inventory Leader
2026-02-08T02:26:11,201 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify Flows In Switch Connected To Leader
2026-02-08T02:26:11,576 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:26:11,777 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:26:13,010 | INFO | opendaylight-cluster-data-notification-dispatcher-45 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1
2026-02-08T02:26:13,638 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Inventory Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Stop Mininet Connected To Inventory Leader
2026-02-08T02:26:14,207 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2026-02-08T02:26:14,207 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2026-02-08T02:26:14,713 | INFO | node-cleaner-0 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS
2026-02-08T02:26:16,404 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Delete All Flows From Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Delete All Flows From Follower Node1
2026-02-08T02:26:16,692 | INFO | qtp1021661463-506 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl
2026-02-08T02:26:16,693 | INFO | qtp1021661463-506 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl
2026-02-08T02:26:16,693 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Number of Txn for dpId: openflow:1 is: 1
2026-02-08T02:26:16,694 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@41791bfe for dpid: openflow:1
2026-02-08T02:26:16,732 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed FlowHandlerTask thread for dpid: openflow:1
2026-02-08T02:26:16,770 | INFO | CommitFutures-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed all flows installation for: dpid: openflow:1 in 414866188179ns
2026-02-08T02:26:16,771 | INFO | CommitFutures-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@41791bfe closed successfully.
2026-02-08T02:26:17,919 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify No Flows In Inventory Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Reconcilliation Multi DPN.Verify No Flows In Inventory Leader
2026-02-08T02:26:34,154 | INFO | sshd-SshServer[613547ef](port=8101)-nio2-thread-2 | ServerSessionImpl | 126 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.170:42532 authenticated
2026-02-08T02:26:34,978 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/020__Cluster_HA_Data_Recovery_BulkFlow_2Node_Cluster.robot" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/020__Cluster_HA_Data_Recovery_BulkFlow_2Node_Cluster.robot
2026-02-08T02:26:35,196 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=39711, lastAppliedTerm=2, lastIndex=40024, lastTerm=2, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=313, mandatoryTrim=false]
2026-02-08T02:26:35,198 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=39711, term=2]/EntryInfo[index=40024, term=2]
2026-02-08T02:26:35,198 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 38765 and term: 2
2026-02-08T02:26:35,206 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:26:35.198573035Z
2026-02-08T02:26:35,387 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status and Initialize Variables" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status and Initialize Variables
2026-02-08T02:26:40,033 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower Before Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower Before Leader Restart
2026-02-08T02:26:41,114 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Leader From Cluster Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Leader From Cluster Node
2026-02-08T02:26:41,452 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL2 10.30.170.53" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL2 10.30.170.53
2026-02-08T02:26:41,974 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: The connection closed with error: Connection reset
2026-02-08T02:26:44,548 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Shutdown" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Shutdown
2026-02-08T02:26:46,643 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:26:46,646 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-976788989], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-976788989], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:26:46,649 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR found unreachable members, waiting for stable-after = 7000 ms before taking downing decision. Now 1 unreachable members found. Downing decision will not be made before 2026-02-08T02:26:53.649215593Z.
2026-02-08T02:26:46,644 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:26:46,651 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#793179678], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#793179678], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:26:46,652 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: refreshing backend for shard 0
2026-02-08T02:26:46,653 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#975067533], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#975067533], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2026-02-08T02:26:46,653 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: refreshing backend for shard 1
2026-02-08T02:26:46,654 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: refreshing backend for shard 0
2026-02-08T02:26:46,654 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-1625142395], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-1625142395], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}}
2026-02-08T02:26:46,654 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: refreshing backend for shard 1
2026-02-08T02:26:46,655 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1161699537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1161699537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}}
2026-02-08T02:26:46,655 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: refreshing backend for shard 2
2026-02-08T02:26:46,747 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:26:46,747 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:26:46,938 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | YangLibraryWriter | 292 - org.opendaylight.netconf.yanglib-mdsal-writer - 10.0.2 | ietf-yang-library writer started with modules-state enabled
2026-02-08T02:26:46,938 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | MdsalRestconfStreamRegistry | 279 - org.opendaylight.netconf.restconf-server-mdsal - 10.0.2 | Cluster leadership acquired – will write OPERATIONAL view
2026-02-08T02:26:47,231 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.53:2550, Up)].
2026-02-08T02:26:49,388 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-84391018] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:26:49,388 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#2087164173] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:26:49,389 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#1633889788] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:26:49,389 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#2087164173] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:26:49,389 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-84391018] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:26:49,391 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-84391018] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:26:49,397 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:26:51,776 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:26:51,780 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate): Starting new election term 3
2026-02-08T02:26:51,781 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3
2026-02-08T02:26:51,781 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6cf742a1
2026-02-08T02:26:51,781 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , received role change from Follower to Candidate
2026-02-08T02:26:51,781 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-default-config from Follower to Candidate
2026-02-08T02:26:51,786 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:26:51,789 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate): Starting new election term 3
2026-02-08T02:26:51,790 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3
2026-02-08T02:26:51,790 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@1a5662e4
2026-02-08T02:26:51,790 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Follower to Candidate
2026-02-08T02:26:51,790 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Follower to Candidate
2026-02-08T02:26:51,800 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 3
2026-02-08T02:26:51,800 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 3
2026-02-08T02:26:51,801 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@243d16ed
2026-02-08T02:26:51,801 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@70352a61
2026-02-08T02:26:51,801 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Candidate to Leader
2026-02-08T02:26:51,801 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Candidate to Leader
2026-02-08T02:26:51,802 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , received role change from Candidate to Leader
2026-02-08T02:26:51,802 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-default-config from Candidate to Leader
2026-02-08T02:26:51,805 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#1781936825], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}
2026-02-08T02:26:51,805 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#793179678], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#1781936825], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}}
2026-02-08T02:26:51,806 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:26:51,808 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#793179678], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-config/member-1-shard-default-config#1781936825], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 2.580 ms
2026-02-08T02:26:51,810 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate): Starting new election term 3
2026-02-08T02:26:51,810 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 3
2026-02-08T02:26:51,810 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Follower to Candidate
2026-02-08T02:26:51,810 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@5a69156c
2026-02-08T02:26:51,810 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Follower to Candidate
2026-02-08T02:26:51,816 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:26:51,820 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate): Starting new election term 3
2026-02-08T02:26:51,820 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 3
2026-02-08T02:26:51,820 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Follower to Candidate
2026-02-08T02:26:51,820 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2716666f
2026-02-08T02:26:51,821 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Follower to Candidate
2026-02-08T02:26:51,821 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 3
2026-02-08T02:26:51,821 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6bd0961e
2026-02-08T02:26:51,822 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Candidate to Leader
2026-02-08T02:26:51,822 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Candidate to Leader
2026-02-08T02:26:51,822 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#-1973432609], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=present}
2026-02-08T02:26:51,822 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-1625142395], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#-1973432609], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=present}}
2026-02-08T02:26:51,823 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-1625142395], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-topology-operational#-1973432609], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=present}} in 417.6 μs
2026-02-08T02:26:51,829 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 3
2026-02-08T02:26:51,829 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@32efdf27
2026-02-08T02:26:51,829 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Candidate to Leader
2026-02-08T02:26:51,829 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Candidate to Leader
2026-02-08T02:26:51,830 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1744605415], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}
2026-02-08T02:26:51,830 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-976788989], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1744605415], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}}
2026-02-08T02:26:51,833 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Follower): Term 3 in "RequestVote{term=3, candidateId=member-3-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 2 - updating term
2026-02-08T02:26:51,840 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-976788989], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-default-operational#-1744605415], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=present}} in 9.277 ms
2026-02-08T02:26:51,845 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:26:51,849 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Starting new election term 3
2026-02-08T02:26:51,849 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3
2026-02-08T02:26:51,849 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4ee673ec
2026-02-08T02:26:51,849 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Follower to Candidate
2026-02-08T02:26:51,849 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from Follower to Candidate
2026-02-08T02:26:51,856 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-toaster-operational status sync done false
2026-02-08T02:26:51,856 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6a5e15ae
2026-02-08T02:26:51,896 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:26:51,900 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Candidate): Starting new election term 3
2026-02-08T02:26:51,900 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Follower) :- Switching from behavior Follower to Candidate, election term: 3
2026-02-08T02:26:51,900 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Follower to Candidate
2026-02-08T02:26:51,900 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@36028742
2026-02-08T02:26:51,901 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Follower to Candidate
2026-02-08T02:26:51,906 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:26:51,908 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate): Starting new election term 3
2026-02-08T02:26:51,909 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 3
2026-02-08T02:26:51,909 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4b757ec1
2026-02-08T02:26:51,909 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Follower to Candidate
2026-02-08T02:26:51,909 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Follower to Candidate
2026-02-08T02:26:51,918 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 3
2026-02-08T02:26:51,919 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Candidate to Leader
2026-02-08T02:26:51,919 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@46d733b
2026-02-08T02:26:51,920 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Candidate to Leader
2026-02-08T02:26:51,920 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: All Shards are ready - data store operational is ready
2026-02-08T02:26:51,921 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1349493913], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}
2026-02-08T02:26:51,921 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1161699537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1349493913], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}}
2026-02-08T02:26:51,922 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#1161699537], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=0}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#-1349493913], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}} in 533.7 μs
2026-02-08T02:26:52,369 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-toaster-operational status sync done true
2026-02-08T02:26:54,364 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR took decision DownUnreachable and is downing [pekko://opendaylight-cluster-data@10.30.170.53:2550], [1] unreachable of [3] members, all members in DC [Member(pekko://opendaylight-cluster-data@10.30.170.226:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.170.53:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.87:2550, Up)], full reachability status: [pekko://opendaylight-cluster-data@10.30.170.226:2550 -> pekko://opendaylight-cluster-data@10.30.170.53:2550: Unreachable [Unreachable] (1), pekko://opendaylight-cluster-data@10.30.171.87:2550 -> pekko://opendaylight-cluster-data@10.30.170.53:2550: Unreachable [Unreachable] (1)]
2026-02-08T02:26:54,365 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR is downing [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.53:2550,-1914198052395584601)]
2026-02-08T02:26:54,367 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking unreachable node [pekko://opendaylight-cluster-data@10.30.170.53:2550] as [Down]
2026-02-08T02:26:54,368 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2026-02-08T02:27:01.368679884Z.
2026-02-08T02:26:54,376 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_RETAINED_WITH_NO_CHANGE [wasOwner=true, isOwner=true, hasOwner=true]
2026-02-08T02:26:54,376 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_RETAINED_WITH_NO_CHANGE [wasOwner=true, isOwner=true, hasOwner=true]
2026-02-08T02:26:55,378 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is removing unreachable node [pekko://opendaylight-cluster-data@10.30.170.53:2550]
2026-02-08T02:26:55,381 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:26:55,381 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:26:55,383 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Association to [pekko://opendaylight-cluster-data@10.30.170.53:2550] with UID [-1914198052395584601] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down]
2026-02-08T02:26:57,005 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:26:59,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:27:01,950 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Starting new election term 4
2026-02-08T02:27:01,962 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 4
2026-02-08T02:27:01,962 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Candidate to Leader
2026-02-08T02:27:01,962 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@70892688
2026-02-08T02:27:01,963 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from Candidate to Leader
2026-02-08T02:27:01,964 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Candidate): Term 4 in "RequestVote{term=4, candidateId=member-3-shard-inventory-config, lastLogIndex=40024, lastLogTerm=2}" message is greater than Candidate's term 3 - switching to Follower
2026-02-08T02:27:01,970 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 4
2026-02-08T02:27:01,970 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from Candidate to Follower
2026-02-08T02:27:01,971 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-inventory-config from Candidate to Follower
2026-02-08T02:27:01,972 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done false
2026-02-08T02:27:01,972 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@2e0d884c
2026-02-08T02:27:01,973 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: All Shards are ready - data store config is ready
2026-02-08T02:27:01,975 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done true
2026-02-08T02:27:01,981 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config#-88502937], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}
2026-02-08T02:27:01,981 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#975067533], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config#-88502937], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}}
2026-02-08T02:27:01,982 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#975067533], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=0}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config#-88502937], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 353.2 μs
2026-02-08T02:27:02,727 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:27:03,766 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:27:04,284 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:27:04,388 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shard Status For Leader After PreLeader Shutdown" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shard Status For Leader After PreLeader Shutdown
2026-02-08T02:27:04,807 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:27:05,180 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node1
2026-02-08T02:27:05,849 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:27:07,768 | INFO | multiThreadIoEventLoopGroup-5-5 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:34040, NodeId:null
2026-02-08T02:27:07,827 | INFO | multiThreadIoEventLoopGroup-5-6 | ConnectionAdapterImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Hello received
2026-02-08T02:27:07,950 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower
2026-02-08T02:27:08,147 | INFO | qtp1021661463-560 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl
2026-02-08T02:27:08,147 | INFO | qtp1021661463-560 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl
2026-02-08T02:27:08,148 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Number of Txn for dpId: openflow:1 is: 1
2026-02-08T02:27:08,148 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@5e608b74 for dpid: openflow:1
2026-02-08T02:27:08,327 | INFO | multiThreadIoEventLoopGroup-5-6 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connected.
2026-02-08T02:27:08,327 | INFO | multiThreadIoEventLoopGroup-5-6 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | No context chain found for device: openflow:1, creating new.
2026-02-08T02:27:08,328 | INFO | multiThreadIoEventLoopGroup-5-6 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Device connected to controller, Device:/10.30.170.67:34048, NodeId:Uri{value=openflow:1}
2026-02-08T02:27:08,328 | INFO | multiThreadIoEventLoopGroup-5-6 | RoleContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s.
2026-02-08T02:27:08,376 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:27:08,456 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed FlowHandlerTask thread for dpid: openflow:1
2026-02-08T02:27:08,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:27:08,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting DeviceContextImpl[NEW]
service for node openflow:1 2026-02-08T02:27:08,467 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RpcContextImpl[NEW] service for node openflow:1 2026-02-08T02:27:08,468 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting StatisticsContextImpl[NEW] service for node openflow:1 2026-02-08T02:27:08,468 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RoleContextImpl[NEW] service for node openflow:1 2026-02-08T02:27:08,468 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... 
nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}} 2026-02-08T02:27:08,468 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Requesting state change to BECOMEMASTER 2026-02-08T02:27:08,468 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER 2026-02-08T02:27:08,468 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | getGenerationIdFromDevice called for device: openflow:1 2026-02-08T02:27:08,468 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started clustering services for node openflow:1 2026-02-08T02:27:08,470 | INFO | multiThreadIoEventLoopGroup-5-6 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER 2026-02-08T02:27:08,471 | INFO | multiThreadIoEventLoopGroup-5-6 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER 2026-02-08T02:27:08,476 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}} 2026-02-08T02:27:08,798 | INFO | pool-18-thread-3 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connection is enabled by reconciliation framework. 
2026-02-08T02:27:08,802 | INFO | multiThreadIoEventLoopGroup-5-6 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.170.67}} 2026-02-08T02:27:08,802 | INFO | multiThreadIoEventLoopGroup-5-6 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Port number of the node openflow:1 is: 34048 2026-02-08T02:27:08,810 | INFO | multiThreadIoEventLoopGroup-5-6 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPMETERFEATURES collected 2026-02-08T02:27:08,810 | INFO | multiThreadIoEventLoopGroup-5-6 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPGROUPFEATURES collected 2026-02-08T02:27:08,811 | INFO | multiThreadIoEventLoopGroup-5-6 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPPORTDESC collected 2026-02-08T02:27:08,811 | INFO | multiThreadIoEventLoopGroup-5-6 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 successfully finished collecting 2026-02-08T02:27:08,819 | INFO | pool-18-thread-3 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 is able to work as master 2026-02-08T02:27:08,822 | INFO | opendaylight-cluster-data-notification-dispatcher-45 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1 2026-02-08T02:27:08,820 | INFO | pool-18-thread-3 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role MASTER was granted to device openflow:1 2026-02-08T02:27:08,823 | INFO | pool-18-thread-3 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node added notification for Uri{value=openflow:1} 
2026-02-08T02:27:08,824 | INFO | pool-18-thread-3 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting statistics gathering for node openflow:1 2026-02-08T02:27:08,842 | INFO | CommitFutures-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed all flows installation for: dpid: openflow:1 in 694502944ns 2026-02-08T02:27:08,842 | INFO | CommitFutures-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@5e608b74 closed successfully. 2026-02-08T02:27:09,415 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader 2026-02-08T02:27:11,047 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:11,565 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 
2026-02-08T02:27:12,605 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:14,165 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:16,245 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:17,283 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:17,805 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 
12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:18,326 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:19,895 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:21,454 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:21,974 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command 
[Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:23,014 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:27,694 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:31,855 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:33,934 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:34,971 | WARN | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:36,014 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:36,533 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:37,051 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:38,275 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before 
Cluster Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Cluster Restart 2026-02-08T02:27:38,569 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=59663, lastAppliedTerm=4, lastIndex=60034, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=371, mandatoryTrim=false] 2026-02-08T02:27:38,570 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=59663, term=4]/EntryInfo[index=60034, term=4] 2026-02-08T02:27:38,570 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 59663 and term: 4 2026-02-08T02:27:38,605 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:27:38.570263250Z 2026-02-08T02:27:38,613 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Pre Leader From Cluster Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Pre Leader From Cluster Node 2026-02-08T02:27:38,837 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL2 10.30.170.53" | core | 113 - org.apache.karaf.log.core 
- 4.4.8 | ROBOT MESSAGE: Starting ODL2 10.30.170.53 2026-02-08T02:27:39,136 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:39,652 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:40,172 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:42,252 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:27:44,984 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - 
org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1353499461]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550] 2026-02-08T02:27:44,985 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1353499461]] (version [1.2.1]) 2026-02-08T02:27:45,524 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node [pekko://opendaylight-cluster-data@10.30.170.53:2550] is JOINING, roles [member-2, dc-default], version [0.0.0] 2026-02-08T02:27:46,377 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.53:2550] to [Up] 2026-02-08T02:27:46,378 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550 2026-02-08T02:27:46,378 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received 
MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550 2026-02-08T02:27:46,378 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config 2026-02-08T02:27:46,378 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational 2026-02-08T02:27:46,378 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config 2026-02-08T02:27:46,378 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational 2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config 2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 195 - 
org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config 2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config 2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config 2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: All Shards are ready - data store operational is ready 2026-02-08T02:27:46,379 | INFO | 
opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: All Shards are ready - data store config is ready
2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config
2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational
2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational
2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational
2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational
2026-02-08T02:27:46,379 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config
2026-02-08T02:27:49,641 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 38, snapshotTerm: 2, replicatedToAllIndex: -1
2026-02-08T02:27:49,642 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): follower member-2-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2026-02-08T02:27:49,642 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): Initiating install snapshot to follower member-2-shard-topology-operational: follower nextIndex: 0, leader snapshotIndex: 38, leader lastIndex: 42, leader log size: 4
2026-02-08T02:27:49,642 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=42, lastAppliedTerm=3, lastIndex=42, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-topology-operational
2026-02-08T02:27:49,645 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Persising snapshot at EntryInfo[index=42, term=3]/EntryInfo[index=42, term=3]
2026-02-08T02:27:49,645 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 38 and term: 2
2026-02-08T02:27:49,648 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 108, snapshotTerm: 2, replicatedToAllIndex: -1
2026-02-08T02:27:49,648 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): follower member-2-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2026-02-08T02:27:49,648 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): Initiating install snapshot to follower member-2-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 108, leader lastIndex: 168, leader log size: 60
2026-02-08T02:27:49,649 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=168, lastAppliedTerm=3, lastIndex=168, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-default-operational
2026-02-08T02:27:49,650 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Persising snapshot at EntryInfo[index=168, term=3]/EntryInfo[index=168, term=3]
2026-02-08T02:27:49,650 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 108 and term: 2
2026-02-08T02:27:49,652 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: snapshot is durable as of 2026-02-08T02:27:49.645330269Z
2026-02-08T02:27:49,653 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: snapshot is durable as of 2026-02-08T02:27:49.650757930Z
2026-02-08T02:27:49,720 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): Snapshot successfully installed on follower member-2-shard-topology-operational (last chunk 1) - matchIndex set to 42, nextIndex set to 43
2026-02-08T02:27:49,742 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): Snapshot successfully installed on follower member-2-shard-default-operational (last chunk 1) - matchIndex set to 168, nextIndex set to 169
2026-02-08T02:27:49,826 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 178, snapshotTerm: 2, replicatedToAllIndex: -1
2026-02-08T02:27:49,827 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): follower member-2-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2026-02-08T02:27:49,827 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): Initiating install snapshot to follower member-2-shard-inventory-operational: follower nextIndex: 0, leader snapshotIndex: 178, leader lastIndex: 290, leader log size: 112
2026-02-08T02:27:49,827 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=290, lastAppliedTerm=3, lastIndex=290, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-inventory-operational
2026-02-08T02:27:49,866 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Persising snapshot at EntryInfo[index=290, term=3]/EntryInfo[index=290, term=3]
2026-02-08T02:27:49,867 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 178 and term: 2
2026-02-08T02:27:49,870 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: snapshot is durable as of 2026-02-08T02:27:49.866893646Z
2026-02-08T02:27:50,051 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=0}, nanosAgo=58249673502, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}
2026-02-08T02:27:50,212 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, nanosAgo=58383047734, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}
2026-02-08T02:27:50,772 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): Snapshot successfully installed on follower member-2-shard-inventory-operational (last chunk 3) - matchIndex set to 290, nextIndex set to 291
2026-02-08T02:27:51,406 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-inventory-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=0}, nanosAgo=59485965902, purgedHistories=MutableUnsignedLongSet{span=[6..6], size=1}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}
2026-02-08T02:28:02,793 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Leader Restart
2026-02-08T02:28:07,669 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Leader Restart
2026-02-08T02:28:47,393 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Leader Restart
2026-02-08T02:28:47,863 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node1
2026-02-08T02:28:48,256 | INFO | multiThreadIoEventLoopGroup-5-6 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:34048, NodeId:openflow:1
2026-02-08T02:28:48,257 | INFO | multiThreadIoEventLoopGroup-5-6 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 disconnected.
2026-02-08T02:28:48,257 | INFO | multiThreadIoEventLoopGroup-5-6 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1}
2026-02-08T02:28:48,258 | INFO | multiThreadIoEventLoopGroup-5-6 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node removed notification for Uri{value=openflow:1}
2026-02-08T02:28:48,259 | INFO | multiThreadIoEventLoopGroup-5-6 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1}
2026-02-08T02:28:48,259 | INFO | multiThreadIoEventLoopGroup-5-6 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role SLAVE was granted to device openflow:1
2026-02-08T02:28:48,259 | INFO | multiThreadIoEventLoopGroup-5-6 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RoleContextImpl[RUNNING] service for node openflow:1
2026-02-08T02:28:48,260 | INFO | multiThreadIoEventLoopGroup-5-6 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1
2026-02-08T02:28:48,260 | INFO | multiThreadIoEventLoopGroup-5-6 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1
2026-02-08T02:28:48,260 | INFO | multiThreadIoEventLoopGroup-5-6 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RpcContextImpl[RUNNING] service for node openflow:1
2026-02-08T02:28:48,261 | INFO | multiThreadIoEventLoopGroup-5-6 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1
2026-02-08T02:28:48,262 | INFO | multiThreadIoEventLoopGroup-5-6 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services registration for node openflow:1
2026-02-08T02:28:48,262 | INFO | multiThreadIoEventLoopGroup-5-6 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1
2026-02-08T02:28:48,262 | INFO | multiThreadIoEventLoopGroup-5-6 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1
2026-02-08T02:28:48,262 | INFO | multiThreadIoEventLoopGroup-5-6 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1
2026-02-08T02:28:48,262 | INFO | multiThreadIoEventLoopGroup-5-6 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1
2026-02-08T02:28:48,263 | INFO | multiThreadIoEventLoopGroup-5-6 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1
2026-02-08T02:28:48,263 | INFO | ofppool-0 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services for node openflow:1
2026-02-08T02:28:48,317 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false]
2026-02-08T02:28:48,317 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false]
2026-02-08T02:28:48,824 | INFO | node-cleaner-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS
2026-02-08T02:28:50,559 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node1
2026-02-08T02:28:50,848 | INFO | qtp1021661463-508 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl
2026-02-08T02:28:50,848 | INFO | qtp1021661463-508 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl
2026-02-08T02:28:50,849 | INFO | ForkJoinPool-9-worker-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Number of Txn for dpId: openflow:1 is: 1
2026-02-08T02:28:50,849 | INFO | ForkJoinPool-9-worker-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@5331f15 for dpid: openflow:1
2026-02-08T02:28:50,885 | INFO | ForkJoinPool-9-worker-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed FlowHandlerTask thread for dpid: openflow:1
2026-02-08T02:28:51,301 | INFO | CommitFutures-4 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed all flows installation for: dpid: openflow:1 in 569396821719ns
2026-02-08T02:28:51,302 | INFO | CommitFutures-4 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@5331f15 closed successfully.
2026-02-08T02:28:51,497 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=79177, lastAppliedTerm=4, lastIndex=80034, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=857, mandatoryTrim=false]
2026-02-08T02:28:51,499 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=79177, term=4]/EntryInfo[index=80034, term=4]
2026-02-08T02:28:51,499 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 79176 and term: 4
2026-02-08T02:28:51,541 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:28:51.499416451Z
2026-02-08T02:28:52,089 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node
2026-02-08T02:29:09,409 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower And Leader Before Cluster Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Inventory Follower And Leader Before Cluster Restart
2026-02-08T02:29:11,207 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Follower From Cluster Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Shutdown Follower From Cluster Node
2026-02-08T02:29:11,461 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=99581, lastAppliedTerm=4, lastIndex=100039, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=458, mandatoryTrim=false]
2026-02-08T02:29:11,461 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=99581, term=4]/EntryInfo[index=100039, term=4]
2026-02-08T02:29:11,461 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 99580 and term: 4
2026-02-08T02:29:11,468 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:29:11.461897048Z
2026-02-08T02:29:11,512 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL2 10.30.170.53" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL2 10.30.170.53
2026-02-08T02:29:15,484 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Shutdown" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Shutdown
2026-02-08T02:29:17,157 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.53:2550, Up)].
2026-02-08T02:29:17,158 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR found unreachable members, waiting for stable-after = 7000 ms before taking downing decision. Now 1 unreachable members found. Downing decision will not be made before 2026-02-08T02:29:24.158416016Z.
2026-02-08T02:29:17,159 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:29:17,159 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:29:20,022 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:29:24,207 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR took decision DownUnreachable and is downing [pekko://opendaylight-cluster-data@10.30.170.53:2550], [1] unreachable of [3] members, all members in DC [Member(pekko://opendaylight-cluster-data@10.30.170.226:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.170.53:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.87:2550, Up)], full reachability status: [pekko://opendaylight-cluster-data@10.30.170.226:2550 -> pekko://opendaylight-cluster-data@10.30.170.53:2550: Unreachable [Unreachable] (2), pekko://opendaylight-cluster-data@10.30.171.87:2550 -> pekko://opendaylight-cluster-data@10.30.170.53:2550: Unreachable [Unreachable] (2)]
2026-02-08T02:29:24,208 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR is downing [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.53:2550,-7515012701594897351)]
2026-02-08T02:29:24,209 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking unreachable node [pekko://opendaylight-cluster-data@10.30.170.53:2550] as [Down]
2026-02-08T02:29:24,210 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-39 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2026-02-08T02:29:31.209909002Z.
2026-02-08T02:29:25,298 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is removing unreachable node [pekko://opendaylight-cluster-data@10.30.170.53:2550]
2026-02-08T02:29:25,298 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:29:25,299 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:29:25,299 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Association to [pekko://opendaylight-cluster-data@10.30.170.53:2550] with UID [-7515012701594897351] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down]
2026-02-08T02:29:27,584 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:29:28,195 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Start Mininet Connect To Follower Node
2026-02-08T02:29:28,814 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:29:30,783 | INFO | multiThreadIoEventLoopGroup-5-7 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:46412, NodeId:null
2026-02-08T02:29:30,848 | INFO | multiThreadIoEventLoopGroup-5-8 | ConnectionAdapterImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Hello received
2026-02-08T02:29:30,983 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Add Bulk Flow From Follower Node1
2026-02-08T02:29:31,177 | INFO | qtp1021661463-560 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl
2026-02-08T02:29:31,178 | INFO | qtp1021661463-560 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl
2026-02-08T02:29:31,178 | INFO | ForkJoinPool-9-worker-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Number of Txn for dpId: openflow:1 is: 1
2026-02-08T02:29:31,178 | INFO | ForkJoinPool-9-worker-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@73a87729 for dpid: openflow:1
2026-02-08T02:29:31,347 | INFO | multiThreadIoEventLoopGroup-5-8 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connected.
2026-02-08T02:29:31,348 | INFO | multiThreadIoEventLoopGroup-5-8 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | No context chain found for device: openflow:1, creating new.
2026-02-08T02:29:31,348 | INFO | multiThreadIoEventLoopGroup-5-8 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Device connected to controller, Device:/10.30.170.67:46416, NodeId:Uri{value=openflow:1}
2026-02-08T02:29:31,349 | INFO | multiThreadIoEventLoopGroup-5-8 | RoleContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s.
2026-02-08T02:29:31,402 | INFO | ForkJoinPool-9-worker-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed FlowHandlerTask thread for dpid: openflow:1
2026-02-08T02:29:31,407 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:29:31,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-9 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:29:31,486 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting DeviceContextImpl[NEW] service for node openflow:1
2026-02-08T02:29:31,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RpcContextImpl[NEW] service for node openflow:1
2026-02-08T02:29:31,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting StatisticsContextImpl[NEW] service for node openflow:1
2026-02-08T02:29:31,490 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RoleContextImpl[NEW] service for node openflow:1
2026-02-08T02:29:31,490 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}}
2026-02-08T02:29:31,490 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Requesting state change to BECOMEMASTER
2026-02-08T02:29:31,490 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER
2026-02-08T02:29:31,490 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | getGenerationIdFromDevice called for device: openflow:1
2026-02-08T02:29:31,491 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started clustering services for node openflow:1
2026-02-08T02:29:31,492 | INFO | multiThreadIoEventLoopGroup-5-8 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER
2026-02-08T02:29:31,494 | INFO | multiThreadIoEventLoopGroup-5-8 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER
2026-02-08T02:29:31,499 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}}
2026-02-08T02:29:31,632 | INFO | pool-18-thread-4 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connection is enabled by reconciliation framework.
2026-02-08T02:29:31,635 | INFO | multiThreadIoEventLoopGroup-5-8 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.170.67}}
2026-02-08T02:29:31,635 | INFO | multiThreadIoEventLoopGroup-5-8 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Port number of the node openflow:1 is: 46416
2026-02-08T02:29:31,639 | INFO | CommitFutures-2 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed all flows installation for: dpid: openflow:1 in 461574659ns
2026-02-08T02:29:31,640 | INFO | multiThreadIoEventLoopGroup-5-8 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPMETERFEATURES collected
2026-02-08T02:29:31,641 | INFO | multiThreadIoEventLoopGroup-5-8 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPGROUPFEATURES collected
2026-02-08T02:29:31,641 | INFO | multiThreadIoEventLoopGroup-5-8 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPPORTDESC collected
2026-02-08T02:29:31,641 | INFO | multiThreadIoEventLoopGroup-5-8 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 successfully finished collecting
2026-02-08T02:29:31,642 | INFO | CommitFutures-4 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@73a87729 closed successfully.
2026-02-08T02:29:31,649 | INFO | opendaylight-cluster-data-notification-dispatcher-46 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1
2026-02-08T02:29:31,648 | INFO | pool-18-thread-4 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 is able to work as master
2026-02-08T02:29:31,650 | INFO | pool-18-thread-4 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role MASTER was granted to device openflow:1
2026-02-08T02:29:31,650 | INFO | pool-18-thread-4 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node added notification for Uri{value=openflow:1}
2026-02-08T02:29:31,651 | INFO | pool-18-thread-4 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting statistics gathering for node openflow:1
2026-02-08T02:29:32,432 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader Before Follower Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Get Bulk Flows And Verify In Leader Before Follower Restart
2026-02-08T02:29:33,495 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:29:34,532 | WARN |
opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:37,653 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:38,716 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:39,233 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:40,793 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to 
[pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:41,314 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:43,392 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:48,624 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Follower Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch Before Follower Restart 2026-02-08T02:29:49,010 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Follower From Cluster Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Restart Follower From 
Cluster Node 2026-02-08T02:29:49,216 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL2 10.30.170.53" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting ODL2 10.30.170.53 2026-02-08T02:29:50,397 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=119230, lastAppliedTerm=4, lastIndex=120048, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=818, mandatoryTrim=false] 2026-02-08T02:29:50,398 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=119230, term=4]/EntryInfo[index=120048, term=4] 2026-02-08T02:29:50,399 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 119230 and term: 4 2026-02-08T02:29:50,428 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:29:50.399019393Z 2026-02-08T02:29:50,682 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:51,722 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | Materializer | 189 
- org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:53,802 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:29:54,880 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-1266973981]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550] 2026-02-08T02:29:54,881 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-1266973981]] (version [1.2.1]) 2026-02-08T02:29:54,913 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-37 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node 
[pekko://opendaylight-cluster-data@10.30.170.53:2550] is JOINING, roles [member-2, dc-default], version [0.0.0] 2026-02-08T02:29:55,897 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.53:2550] to [Up] 2026-02-08T02:29:55,897 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550 2026-02-08T02:29:55,897 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-operational 
with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config 2026-02-08T02:29:55,898 | INFO | 
opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: All Shards are ready - data store config is 
ready 2026-02-08T02:29:55,899 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2026-02-08T02:29:55,899 | INFO | opendaylight-cluster-data-shard-dispatcher-31 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config 2026-02-08T02:29:55,899 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config 2026-02-08T02:29:55,898 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: All Shards are ready - data store operational is ready 2026-02-08T02:29:59,214 | WARN | opendaylight-cluster-data-shard-dispatcher-34 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=3, success=false, followerId=member-2-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 46986, lastApplied : 52, commitIndex : 52 2026-02-08T02:29:59,214 | WARN | opendaylight-cluster-data-shard-dispatcher-29 | 
AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=3, success=false, followerId=member-2-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 46986, lastApplied : 243, commitIndex : 243 2026-02-08T02:29:59,215 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 50, snapshotTerm: 3, replicatedToAllIndex: 50 2026-02-08T02:29:59,215 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): follower member-2-shard-topology-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2026-02-08T02:29:59,215 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 219, snapshotTerm: 3, replicatedToAllIndex: 219 2026-02-08T02:29:59,215 | 
INFO | opendaylight-cluster-data-shard-dispatcher-34 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): Initiating install snapshot to follower member-2-shard-topology-operational: follower nextIndex: 0, leader snapshotIndex: 50, leader lastIndex: 52, leader log size: 2 2026-02-08T02:29:59,215 | WARN | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=3, success=false, followerId=member-2-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 47037, lastApplied : 527, commitIndex : 527 2026-02-08T02:29:59,215 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): follower member-2-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2026-02-08T02:29:59,215 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=52, lastAppliedTerm=3, lastIndex=52, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-topology-operational 2026-02-08T02:29:59,215 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received 
unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 450, snapshotTerm: 3, replicatedToAllIndex: 450 2026-02-08T02:29:59,215 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): Initiating install snapshot to follower member-2-shard-default-operational: follower nextIndex: 0, leader snapshotIndex: 219, leader lastIndex: 243, leader log size: 24 2026-02-08T02:29:59,215 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): follower member-2-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2026-02-08T02:29:59,216 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=243, lastAppliedTerm=3, lastIndex=243, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-default-operational 2026-02-08T02:29:59,216 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): Initiating install snapshot to follower member-2-shard-inventory-operational: follower nextIndex: 0, leader snapshotIndex: 450, leader lastIndex: 527, leader log size: 77 2026-02-08T02:29:59,216 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 
- org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=527, lastAppliedTerm=3, lastIndex=527, lastTerm=3, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-inventory-operational 2026-02-08T02:29:59,216 | WARN | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=3, success=true, followerId=member-2-shard-toaster-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 46988, lastApplied : -1, commitIndex : -1 2026-02-08T02:29:59,217 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Persising snapshot at EntryInfo[index=243, term=3]/EntryInfo[index=243, term=3] 2026-02-08T02:29:59,217 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 219 and term: 3 2026-02-08T02:29:59,217 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Persising snapshot at EntryInfo[index=52, term=3]/EntryInfo[index=52, term=3] 2026-02-08T02:29:59,218 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Removed in-memory snapshotted 
entries, adjusted snapshotIndex: 50 and term: 3 2026-02-08T02:29:59,222 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: snapshot is durable as of 2026-02-08T02:29:59.217791529Z 2026-02-08T02:29:59,222 | INFO | opendaylight-cluster-data-shard-dispatcher-29 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: snapshot is durable as of 2026-02-08T02:29:59.217617706Z 2026-02-08T02:29:59,227 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-topology-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 50, snapshotTerm: 3, replicatedToAllIndex: 50 2026-02-08T02:29:59,227 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 450, snapshotTerm: 3, replicatedToAllIndex: 450 2026-02-08T02:29:59,227 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): follower member-2-shard-topology-operational appears to be behind the leader from the last snapshot - updated: 
matchIndex: -1, nextIndex: 0 2026-02-08T02:29:59,227 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): follower member-2-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2026-02-08T02:29:59,227 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=3, success=false, followerId=member-2-shard-default-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 219, snapshotTerm: 3, replicatedToAllIndex: 219 2026-02-08T02:29:59,227 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): follower member-2-shard-default-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2026-02-08T02:29:59,257 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Persising snapshot at EntryInfo[index=527, term=3]/EntryInfo[index=527, term=3] 2026-02-08T02:29:59,258 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 450 and term: 3 2026-02-08T02:29:59,261 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | 
member-1-shard-inventory-operational: snapshot is durable as of 2026-02-08T02:29:59.257909079Z 2026-02-08T02:29:59,295 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Leader): Snapshot successfully installed on follower member-2-shard-topology-operational (last chunk 1) - matchIndex set to 52, nextIndex set to 53 2026-02-08T02:29:59,302 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=4, success=true, followerId=member-2-shard-topology-config, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 47313, lastApplied : -1, commitIndex : -1 2026-02-08T02:29:59,357 | WARN | opendaylight-cluster-data-shard-dispatcher-30 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=3, success=true, followerId=member-2-shard-default-config, logLastIndex=97, logLastTerm=3, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 47129, lastApplied : 97, commitIndex : 97 2026-02-08T02:29:59,357 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Leader): Snapshot successfully installed on follower member-2-shard-default-operational (last chunk 1) - matchIndex set to 243, nextIndex set to 244 2026-02-08T02:29:59,728 | INFO | 
opendaylight-cluster-data-shard-dispatcher-34 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-default-config: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=1}, nanosAgo=128637706556, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-config, generation=2} 2026-02-08T02:29:59,969 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-default-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, nanosAgo=49298794425, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=2} 2026-02-08T02:30:00,052 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): Snapshot successfully installed on follower member-2-shard-inventory-operational (last chunk 3) - matchIndex set to 527, nextIndex set to 528 2026-02-08T02:30:00,988 | INFO | opendaylight-cluster-data-shard-dispatcher-30 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-inventory-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=1}, nanosAgo=72068513778, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=2} 2026-02-08T02:30:12,291 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Restart" | core 
| 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Check Shards Status After Follower Restart 2026-02-08T02:30:17,248 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Follower Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Data Recovery After Follower Restart 2026-02-08T02:30:35,768 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Follower Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify Flows In Switch After Follower Restart 2026-02-08T02:30:36,245 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Stop Mininet Connected To Follower Node 2026-02-08T02:30:36,624 | INFO | multiThreadIoEventLoopGroup-5-8 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:46416, NodeId:openflow:1 2026-02-08T02:30:36,625 | INFO | multiThreadIoEventLoopGroup-5-8 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 disconnected. 
2026-02-08T02:30:36,626 | INFO | multiThreadIoEventLoopGroup-5-8 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1} 2026-02-08T02:30:36,627 | INFO | multiThreadIoEventLoopGroup-5-8 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node removed notification for Uri{value=openflow:1} 2026-02-08T02:30:36,628 | INFO | multiThreadIoEventLoopGroup-5-8 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1} 2026-02-08T02:30:36,628 | INFO | multiThreadIoEventLoopGroup-5-8 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role SLAVE was granted to device openflow:1 2026-02-08T02:30:36,628 | INFO | multiThreadIoEventLoopGroup-5-8 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RoleContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:30:36,628 | INFO | multiThreadIoEventLoopGroup-5-8 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:30:36,628 | INFO | multiThreadIoEventLoopGroup-5-8 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1 2026-02-08T02:30:36,629 | INFO | multiThreadIoEventLoopGroup-5-8 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RpcContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:30:36,629 | INFO | multiThreadIoEventLoopGroup-5-8 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1 2026-02-08T02:30:36,629 | INFO | multiThreadIoEventLoopGroup-5-8 | ContextChainImpl | 310 - 
org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services registration for node openflow:1 2026-02-08T02:30:36,629 | INFO | multiThreadIoEventLoopGroup-5-8 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:30:36,631 | INFO | ofppool-0 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services for node openflow:1 2026-02-08T02:30:36,632 | INFO | multiThreadIoEventLoopGroup-5-8 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:30:36,632 | INFO | multiThreadIoEventLoopGroup-5-8 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:30:36,632 | INFO | multiThreadIoEventLoopGroup-5-8 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1 2026-02-08T02:30:36,633 | INFO | multiThreadIoEventLoopGroup-5-8 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1 2026-02-08T02:30:36,696 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2026-02-08T02:30:36,696 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false] 2026-02-08T02:30:37,201 | INFO | 
node-cleaner-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS 2026-02-08T02:30:37,756 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=139454, lastAppliedTerm=4, lastIndex=140048, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=594, mandatoryTrim=false] 2026-02-08T02:30:37,758 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=139454, term=4]/EntryInfo[index=140048, term=4] 2026-02-08T02:30:37,758 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 139218 and term: 4 2026-02-08T02:30:37,791 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:30:37.758268714Z 2026-02-08T02:30:38,973 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Delete All Flows From Follower Node 2026-02-08T02:30:39,254 | INFO | qtp1021661463-508 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl 2026-02-08T02:30:39,254 | INFO 
| qtp1021661463-508 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl 2026-02-08T02:30:39,255 | INFO | ForkJoinPool-9-worker-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Number of Txn for dpId: openflow:1 is: 1 2026-02-08T02:30:39,255 | INFO | ForkJoinPool-9-worker-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@691e8586 for dpid: openflow:1 2026-02-08T02:30:39,280 | INFO | ForkJoinPool-9-worker-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed FlowHandlerTask thread for dpid: openflow:1 2026-02-08T02:30:39,314 | INFO | CommitFutures-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed all flows installation for: dpid: openflow:1 in 677409558298ns 2026-02-08T02:30:39,314 | INFO | CommitFutures-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@691e8586 closed successfully. 
2026-02-08T02:30:40,487 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node After Follower Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow 2Node Cluster.Verify No Flows In Leader Node After Follower Restart 2026-02-08T02:31:00,559 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=159261, lastAppliedTerm=4, lastIndex=160023, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=762, mandatoryTrim=false] 2026-02-08T02:31:00,560 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=159261, term=4]/EntryInfo[index=160023, term=4] 2026-02-08T02:31:00,560 | INFO | opendaylight-cluster-data-shard-dispatcher-28 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 159260 and term: 4 2026-02-08T02:31:00,564 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:31:00.560391915Z 2026-02-08T02:31:02,003 | INFO | sshd-SshServer[613547ef](port=8101)-nio2-thread-2 | ServerSessionImpl | 126 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.170:48990 authenticated 2026-02-08T02:31:02,778 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite 
/w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/030__Cluster_HA_Data_Recovery_BulkFlow_Single_Switch.robot" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/030__Cluster_HA_Data_Recovery_BulkFlow_Single_Switch.robot 2026-02-08T02:31:03,131 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Check Shards Status And Initialize Variables" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Check Shards Status And Initialize Variables 2026-02-08T02:31:07,739 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before Cluster Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before Cluster Restart 2026-02-08T02:31:08,806 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node1 2026-02-08T02:31:11,457 | INFO | multiThreadIoEventLoopGroup-5-1 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:60988, NodeId:null 2026-02-08T02:31:11,538 | INFO | 
multiThreadIoEventLoopGroup-5-2 | ConnectionAdapterImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Hello received 2026-02-08T02:31:11,540 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connected. 2026-02-08T02:31:11,540 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | No context chain found for device: openflow:1, creating new. 2026-02-08T02:31:11,540 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Device connected to controller, Device:/10.30.170.67:32770, NodeId:Uri{value=openflow:1} 2026-02-08T02:31:11,541 | INFO | multiThreadIoEventLoopGroup-5-2 | RoleContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s. 
2026-02-08T02:31:11,605 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-10 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2026-02-08T02:31:11,668 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower 2026-02-08T02:31:11,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-36 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true] 2026-02-08T02:31:11,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting DeviceContextImpl[NEW] service for node openflow:1 2026-02-08T02:31:11,677 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RpcContextImpl[NEW] service for node openflow:1 2026-02-08T02:31:11,680 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting StatisticsContextImpl[NEW] service for node openflow:1 2026-02-08T02:31:11,680 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RoleContextImpl[NEW] service for node openflow:1 2026-02-08T02:31:11,681 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}} 2026-02-08T02:31:11,681 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Requesting state change to BECOMEMASTER 2026-02-08T02:31:11,681 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER 2026-02-08T02:31:11,681 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | getGenerationIdFromDevice called for device: openflow:1 2026-02-08T02:31:11,681 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-38 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started clustering services for node openflow:1 2026-02-08T02:31:11,691 | INFO | multiThreadIoEventLoopGroup-5-2 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER 2026-02-08T02:31:11,693 | INFO | multiThreadIoEventLoopGroup-5-2 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER 2026-02-08T02:31:11,696 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}} 2026-02-08T02:31:11,704 | INFO | pool-18-thread-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device 
openflow:1 connection is enabled by reconciliation framework. 2026-02-08T02:31:11,707 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.170.67}} 2026-02-08T02:31:11,707 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Port number of the node openflow:1 is: 32770 2026-02-08T02:31:11,710 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPMETERFEATURES collected 2026-02-08T02:31:11,711 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPGROUPFEATURES collected 2026-02-08T02:31:11,711 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPPORTDESC collected 2026-02-08T02:31:11,712 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 successfully finished collecting 2026-02-08T02:31:11,718 | INFO | opendaylight-cluster-data-notification-dispatcher-43 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1 2026-02-08T02:31:11,718 | INFO | pool-18-thread-1 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 is able to work as master 2026-02-08T02:31:11,718 | INFO | pool-18-thread-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role MASTER was granted to device openflow:1 2026-02-08T02:31:11,718 | INFO | pool-18-thread-1 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | 
Publishing node added notification for Uri{value=openflow:1} 2026-02-08T02:31:11,718 | INFO | pool-18-thread-1 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting statistics gathering for node openflow:1 2026-02-08T02:31:11,998 | INFO | qtp1021661463-506 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl 2026-02-08T02:31:11,999 | INFO | qtp1021661463-506 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl 2026-02-08T02:31:11,999 | INFO | ForkJoinPool-9-worker-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Number of Txn for dpId: openflow:1 is: 1 2026-02-08T02:31:11,999 | INFO | ForkJoinPool-9-worker-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@b9ae5f4 for dpid: openflow:1 2026-02-08T02:31:12,019 | INFO | ForkJoinPool-9-worker-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed FlowHandlerTask thread for dpid: openflow:1 2026-02-08T02:31:12,048 | INFO | CommitFutures-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed all flows installation for: dpid: openflow:1 in 49570272ns 2026-02-08T02:31:12,049 | INFO | CommitFutures-3 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@b9ae5f4 closed successfully. 
2026-02-08T02:31:13,244 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster 2026-02-08T02:31:16,791 | INFO | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Total Flows read: 1000 2026-02-08T02:31:18,403 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Cluster Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Cluster Restart 2026-02-08T02:31:18,802 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill All Cluster Nodes" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill All Cluster Nodes Feb 08, 2026 2:31:58 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Trying to lock /tmp/karaf-0.23.1-SNAPSHOT/lock Feb 08, 2026 2:31:58 AM org.apache.karaf.main.lock.SimpleFileLock lock INFO: Lock acquired Feb 08, 2026 2:31:58 AM org.apache.karaf.main.Main$KarafLockCallback lockAcquired INFO: Lock acquired. Setting startlevel to 100 2026-02-08T02:31:59,236 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.logging]) | EventAdminConfigurationNotifier | 5 - org.ops4j.pax.logging.pax-logging-log4j2 - 2.3.0 | Logging configuration changed. 
(Event Admin service unavailable - no notification sent). 2026-02-08T02:31:59,264 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.aries.blueprint.core/1.10.3 has been started 2026-02-08T02:31:59,364 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Starting JMX OSGi agent 2026-02-08T02:31:59,372 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143] for service with service.id [15] 2026-02-08T02:31:59,374 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering MBean with ObjectName [osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143] for service with service.id [40] 2026-02-08T02:31:59,383 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | ROOT | 94 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (94) Starting with globalExtender setting: false 2026-02-08T02:31:59,386 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | ROOT | 94 - org.apache.felix.scr - 2.2.6 | bundle org.apache.felix.scr:2.2.6 (94) Version = 2.2.6 2026-02-08T02:31:59,485 | INFO | activator-1-thread-1 | Activator | 114 - org.apache.karaf.management.server - 4.4.8 | Setting java.rmi.server.hostname system property to 127.0.0.1 2026-02-08T02:31:59,604 | INFO | activator-1-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.cm.ConfigurationAdminMBean to MBeanServer 
org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@23285520 with name osgi.compendium:service=cm,version=1.3,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143 2026-02-08T02:31:59,605 | INFO | activator-1-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.wiring.BundleWiringStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@23285520 with name osgi.core:type=wiringState,version=1.1,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143 2026-02-08T02:31:59,605 | INFO | activator-1-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.ServiceStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@23285520 with name osgi.core:type=serviceState,version=1.7,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143 2026-02-08T02:31:59,606 | INFO | activator-1-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.BundleStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@23285520 with name osgi.core:type=bundleState,version=1.7,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143 2026-02-08T02:31:59,607 | INFO | activator-1-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.service.permissionadmin.PermissionAdminMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@23285520 with name osgi.core:service=permissionadmin,version=1.2,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143 2026-02-08T02:31:59,607 | INFO | activator-1-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.FrameworkMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@23285520 with name 
osgi.core:type=framework,version=1.7,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143 2026-02-08T02:31:59,607 | INFO | activator-1-thread-1 | core | 84 - org.apache.aries.jmx.core - 1.1.8 | Registering org.osgi.jmx.framework.PackageStateMBean to MBeanServer org.apache.karaf.management.internal.EventAdminMBeanServerWrapper@23285520 with name osgi.core:type=packageState,version=1.5,framework=org.eclipse.osgi,uuid=c3eff1d2-8adb-43c3-9ca6-f7b51316f143 2026-02-08T02:31:59,635 | INFO | activator-1-thread-1 | ServiceComponentRuntimeMBeanImpl | 116 - org.apache.karaf.scr.management - 4.4.8 | Activating the Apache Karaf ServiceComponentRuntime MBean 2026-02-08T02:31:59,641 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.bundle.core/4.4.8 2026-02-08T02:31:59,648 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.config.command/4.4.8 2026-02-08T02:31:59,779 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.deployer.kar/4.4.8 2026-02-08T02:31:59,785 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.diagnostic.core/4.4.8 2026-02-08T02:31:59,801 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.features.command/4.4.8. 
Missing service: [org.apache.karaf.features.FeaturesService] 2026-02-08T02:31:59,806 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.http.core/4.4.8. Missing service: [org.apache.karaf.http.core.ProxyService] 2026-02-08T02:31:59,812 | INFO | activator-1-thread-2 | Activator | 100 - org.apache.karaf.deployer.features - 4.4.8 | Deployment finished. Registering FeatureDeploymentListener 2026-02-08T02:31:59,822 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.instance.core/4.4.8 2026-02-08T02:31:59,828 | INFO | activator-1-thread-2 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.features.command/4.4.8 2026-02-08T02:31:59,842 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jaas.command/4.4.8 2026-02-08T02:31:59,845 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8 2026-02-08T02:31:59,846 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.jaas.command/4.4.8 2026-02-08T02:31:59,848 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.kar.core/4.4.8 2026-02-08T02:31:59,851 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | 
CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.log.core/4.4.8 2026-02-08T02:31:59,862 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.package.core/4.4.8 2026-02-08T02:31:59,865 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.service.core/4.4.8 2026-02-08T02:31:59,881 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.commands/4.4.8 2026-02-08T02:31:59,881 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Updating commands for bundle org.apache.karaf.shell.commands/4.4.8 2026-02-08T02:31:59,892 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | Activator | 121 - org.apache.karaf.shell.core - 4.4.8 | Not starting local console. To activate set karaf.startLocalConsole=true 2026-02-08T02:31:59,930 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.apache.karaf.shell.core/4.4.8 has been started 2026-02-08T02:31:59,983 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.shell.ssh/4.4.8. 
Missing service: [org.apache.sshd.server.SshServer] 2026-02-08T02:32:00,010 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.system.core/4.4.8 2026-02-08T02:32:00,026 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.apache.karaf.web.core/4.4.8. Missing service: [org.apache.karaf.web.WebContainerService] 2026-02-08T02:32:00,080 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | Activator | 393 - org.ops4j.pax.web.pax-web-extender-war - 8.0.33 | Configuring WAR extender thread pool. Pool size = 3 2026-02-08T02:32:00,130 | INFO | activator-1-thread-1 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.shell.ssh/4.4.8 2026-02-08T02:32:00,139 | INFO | activator-1-thread-1 | DefaultIoServiceFactoryFactory | 126 - org.apache.sshd.osgi - 2.15.0 | No detected/configured IoServiceFactoryFactory; using Nio2ServiceFactoryFactory 2026-02-08T02:32:00,206 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | Activator | 394 - org.ops4j.pax.web.pax-web-extender-whiteboard - 8.0.33 | Starting Pax Web Whiteboard Extender 2026-02-08T02:32:00,257 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | log | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Logging initialized @3514ms to org.eclipse.jetty.util.log.Slf4jLog 2026-02-08T02:32:00,275 | INFO | CM Configuration Updater (ManagedService Update: pid=[org.ops4j.pax.web]) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because configuration has changed 2026-02-08T02:32:00,276 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | Activator 
| 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | EventAdmin support enabled, WAB events will be posted to EventAdmin topics. 2026-02-08T02:32:00,276 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Pax Web Runtime started 2026-02-08T02:32:00,277 | INFO | paxweb-config-3-thread-1 (change config) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Scheduling Pax Web reconfiguration because ServerControllerFactory has been registered 2026-02-08T02:32:00,288 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Starting BlueprintBundleTracker 2026-02-08T02:32:00,307 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.apache.karaf.shell.core_4.4.8 [121] was successfully created 2026-02-08T02:32:00,309 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.apache.aries.blueprint.cm_1.3.2 [79] was successfully created 2026-02-08T02:32:00,310 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.apache.aries.blueprint.core_1.10.3 [80] was successfully created 2026-02-08T02:32:00,329 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Configuring server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2026-02-08T02:32:00,330 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Configuring JettyServerController{configuration=0603ebcd-67f2-42e0-9017-69bb8f5410d3,state=UNCONFIGURED} 
2026-02-08T02:32:00,330 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating Jetty server instance using configuration properties. 2026-02-08T02:32:00,351 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Processing Jetty configuration from files: [etc/jetty.xml] 2026-02-08T02:32:00,479 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Found configured connector "jetty-default": 0.0.0.0:8181 2026-02-08T02:32:00,480 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Using configured jetty-default@5c9823e1{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} as non secure connector for address: 0.0.0.0:8181 2026-02-08T02:32:00,481 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Eagerly starting Jetty thread pool QueuedThreadPool[qtp1455505446]@56c14026{STOPPED,0<=0<=200,i=0,r=-1,q=0}[NO_TRY] 2026-02-08T02:32:00,484 | INFO | paxweb-config-3-thread-1 (change controller) | JettyFactory | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding JMX support to Jetty server 2026-02-08T02:32:00,501 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Starting server controller org.ops4j.pax.web.service.jetty.internal.JettyServerController 2026-02-08T02:32:00,501 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting JettyServerController{configuration=0603ebcd-67f2-42e0-9017-69bb8f5410d3,state=STOPPED} 2026-02-08T02:32:00,502 | INFO | paxweb-config-3-thread-1 (change controller) | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting 
Server@35ee5f83{STOPPED}[9.4.57.v20241219] 2026-02-08T02:32:00,503 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | jetty-9.4.57.v20241219; built: 2025-01-08T21:24:30.412Z; git: df524e6b29271c2e09ba9aea83c18dc9db464a31; jvm 21.0.9+10-Ubuntu-122.04 2026-02-08T02:32:00,526 | INFO | paxweb-config-3-thread-1 (change controller) | session | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | DefaultSessionIdManager workerName=node0 2026-02-08T02:32:00,526 | INFO | paxweb-config-3-thread-1 (change controller) | session | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | No SessionScavenger set, using defaults 2026-02-08T02:32:00,528 | INFO | paxweb-config-3-thread-1 (change controller) | session | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | node0 Scavenging every 600000ms 2026-02-08T02:32:00,559 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.jdbc.core/4.4.8 2026-02-08T02:32:00,583 | INFO | paxweb-config-3-thread-1 (change controller) | AbstractConnector | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started jetty-default@5c9823e1{HTTP/1.1, (http/1.1)}{0.0.0.0:8181} 2026-02-08T02:32:00,584 | INFO | paxweb-config-3-thread-1 (change controller) | Server | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started @3849ms 2026-02-08T02:32:00,589 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpService factory 2026-02-08T02:32:00,591 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.http.core_4.4.8 [106]] 2026-02-08T02:32:00,609 | INFO | paxweb-config-3-thread-1 (change controller) | StoppableHttpServiceFactory | 397 - 
org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.apache.karaf.web.core_4.4.8 [125]] 2026-02-08T02:32:00,616 | INFO | activator-1-thread-2 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.web.core/4.4.8 2026-02-08T02:32:00,625 | INFO | paxweb-config-3-thread-1 (change controller) | Activator | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering HttpServiceRuntime 2026-02-08T02:32:00,625 | INFO | HttpService->WarExtender (add HttpService) | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-war_8.0.33 [393]] 2026-02-08T02:32:00,629 | INFO | HttpService->Whiteboard (add HttpService) | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.ops4j.pax.web.pax-web-extender-whiteboard_8.0.33 [394]] 2026-02-08T02:32:00,645 | INFO | paxweb-config-3-thread-1 | ServerModel | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-3,contextPath='/'} 2026-02-08T02:32:00,645 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)}", size=2} 2026-02-08T02:32:00,646 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-3,contextPath='/'} 2026-02-08T02:32:00,647 | INFO | activator-1-thread-2 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.apache.karaf.http.core/4.4.8 2026-02-08T02:32:00,684 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 
8.0.33 | Adding OsgiContextModel{WB,id=OCM-1,name='default',path='/',bundle=org.ops4j.pax.web.pax-web-extender-whiteboard,context=(supplier)} to o.o.p.w.s.j.i.PaxWebServletContextHandler@2c204b5a{/,null,STOPPED} 2026-02-08T02:32:00,686 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@2c204b5a{/,null,STOPPED} 2026-02-08T02:32:00,710 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.jolokia.osgi_1.7.2 [156]] 2026-02-08T02:32:00,719 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@3aea5995,contexts=[{HS,OCM-5,context:125105035,/}]} 2026-02-08T02:32:00,720 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@3aea5995,contexts=null}", size=3} 2026-02-08T02:32:00,720 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{HS,id=OCM-5,name='context:125105035',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [156],contextId='context:125105035',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@774f38b}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@2c204b5a{/,null,STOPPED} 2026-02-08T02:32:00,721 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - 
org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@2c204b5a{/,null,STOPPED} 2026-02-08T02:32:00,722 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-4,name='org.jolokia.osgi.servlet.JolokiaServlet',alias='/jolokia',urlPatterns=[/jolokia/*],servlet=org.jolokia.osgi.servlet.JolokiaServlet@3aea5995,contexts=[{HS,OCM-5,context:125105035,/}]} 2026-02-08T02:32:00,729 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/" with default Osgi Context OsgiContextModel{HS,id=OCM-5,name='context:125105035',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [156],contextId='context:125105035',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@774f38b}} 2026-02-08T02:32:00,749 | INFO | paxweb-config-3-thread-1 | osgi | 156 - org.jolokia.osgi - 1.7.2 | No access restrictor found, access to any MBean is allowed 2026-02-08T02:32:00,774 | INFO | paxweb-config-3-thread-1 | ContextHandler | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@2c204b5a{/,null,AVAILABLE} 2026-02-08T02:32:00,774 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{HS,id=OCM-5,name='context:125105035',path='/',bundle=org.jolokia.osgi,context=WebContainerContextWrapper{bundle=org.jolokia.osgi_1.7.2 [156],contextId='context:125105035',delegate=org.jolokia.osgi.security.ServiceAuthenticationHttpContext@774f38b}}} as OSGi service for "/" context path 2026-02-08T02:32:00,859 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle 
org.opendaylight.aaa.cert/0.22.3 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)] 2026-02-08T02:32:00,886 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CustomFilterAdapterConfigurationImpl | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Custom filter properties updated: {service.pid=org.opendaylight.aaa.filterchain, osgi.ds.satisfying.condition.target=(osgi.condition.id=true), customFilterList=, component.name=org.opendaylight.aaa.filterchain.configuration.impl.CustomFilterAdapterConfigurationImpl, felix.fileinstall.filename=file:/tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.aaa.filterchain.cfg, component.id=4, Filter.target=(org.opendaylight.aaa.filterchain.filter=true)} 2026-02-08T02:32:00,917 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.web.servlet.ServletSupport), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.api.AuthenticationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.IIDMStore)] 
2026-02-08T02:32:00,933 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.aaa.shiro_0.22.3 [173]] 2026-02-08T02:32:00,935 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2026-02-08T02:32:00,935 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]}", size=1} 2026-02-08T02:32:00,935 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-8,name='MoonTokenEndpoint',urlPatterns=[/moon],contexts=[{WB,OCM-1,default,/}]} 2026-02-08T02:32:00,942 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.web.servlet.ServletSupport), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.IIDMStore)] 2026-02-08T02:32:00,956 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - 
org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.aaa.api.IIDMStore)] 2026-02-08T02:32:00,979 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | FileAkkaConfigurationReader | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | File-based Pekko configuration reader enabled 2026-02-08T02:32:00,993 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiActorSystemProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Actor System provider starting 2026-02-08T02:32:01,163 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | ActorSystemProviderImpl | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Creating new ActorSystem 2026-02-08T02:32:01,504 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Slf4jLogger | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Slf4jLogger started 2026-02-08T02:32:01,739 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ArteryTransport | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Remoting started with transport [Artery tcp]; listening on address [pekko://opendaylight-cluster-data@10.30.170.226:2550] with UID [755719854801093453] 2026-02-08T02:32:01,749 | INFO | 
opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Starting up, Pekko version [1.2.1] ... 2026-02-08T02:32:01,795 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Registered cluster JMX MBean [pekko:type=Cluster] 2026-02-08T02:32:01,805 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Started up successfully 2026-02-08T02:32:01,840 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR started. Config: strategy [KeepMajority], stable-after [7 seconds], down-all-when-unstable [5250 milliseconds], selfUniqueAddress [pekko://opendaylight-cluster-data@10.30.170.226:2550#755719854801093453], selfDc [default]. 2026-02-08T02:32:02,027 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiActorSystemProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Actor System provider started 2026-02-08T02:32:02,045 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | FileModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Shard configuration provider started 2026-02-08T02:32:02,113 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.9. 
Missing service: [org.opendaylight.infrautils.diagstatus.DiagStatusServiceMBean] 2026-02-08T02:32:02,155 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.87:2550], control stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.87/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:32:02,155 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.87:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.87/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:32:02,222 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | KarafSystemReady | 242 - org.opendaylight.infrautils.ready-impl - 7.1.9 | ThreadFactory for SystemReadyService created 2026-02-08T02:32:02,224 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | KarafSystemReady | 242 - org.opendaylight.infrautils.ready-impl - 7.1.9 | Now starting to provide full system readiness status updates (see TestBundleDiag's logs)... 2026-02-08T02:32:02,230 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DiagStatusServiceImpl | 239 - org.opendaylight.infrautils.diagstatus-impl - 7.1.9 | Diagnostic Status Service started 2026-02-08T02:32:02,234 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | MBeanUtils | 238 - org.opendaylight.infrautils.diagstatus-api - 7.1.9 | MBean registration for org.opendaylight.infrautils.diagstatus:type=SvcStatus SUCCESSFUL. 
2026-02-08T02:32:02,235 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DiagStatusServiceMBeanImpl | 239 - org.opendaylight.infrautils.diagstatus-impl - 7.1.9 | Diagnostic Status Service management started 2026-02-08T02:32:02,235 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.infrautils.diagstatus-shell/7.1.9 2026-02-08T02:32:02,230 | INFO | SystemReadyService-0 | KarafSystemReady | 242 - org.opendaylight.infrautils.ready-impl - 7.1.9 | checkBundleDiagInfos() started... 2026-02-08T02:32:02,279 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.mastership.MastershipChangeServiceManager), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)] 2026-02-08T02:32:02,295 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | 
Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.RpcService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)] 2026-02-08T02:32:02,338 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.21.2. Missing service: [org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager] 2026-02-08T02:32:02,346 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#124400564]], but this node is not initialized yet 2026-02-08T02:32:02,358 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.NotificationService), (objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), 
(objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2026-02-08T02:32:02,366 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon#1739600931]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550]
2026-02-08T02:32:02,406 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationServiceFactory)]
2026-02-08T02:32:02,419 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer))]
2026-02-08T02:32:02,420 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (objectClass=org.opendaylight.openflowplugin.applications.reconciliation.ReconciliationManager), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)]
2026-02-08T02:32:02,421 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.FlowGroupCacheManager), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)]
2026-02-08T02:32:02,425 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | ReconciliationManager started
2026-02-08T02:32:02,426 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.applications.reconciliation-framework/0.21.2
2026-02-08T02:32:02,427 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.serviceutils.srm.ServiceRecoveryRegistry), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)]
2026-02-08T02:32:02,430 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | MessageIntelligenceAgencyImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Registered MBean org.opendaylight.openflowplugin.impl.statistics.ofpspecific:type=MessageIntelligenceAgencyMXBean
2026-02-08T02:32:02,432 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.impl/0.21.2
2026-02-08T02:32:02,455 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.frm.recovery.OpenflowServiceRecoveryHandler), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)]
2026-02-08T02:32:02,456 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.RpcService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)]
2026-02-08T02:32:02,457 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OpenflowServiceRecoveryHandlerImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | Registering openflowplugin service recovery handlers
2026-02-08T02:32:02,461 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Command registration delayed for bundle org.opendaylight.openflowplugin.srm-shell/0.21.2. Missing service: [org.opendaylight.serviceutils.srm.spi.RegistryControl, org.opendaylight.mdsal.binding.api.DataBroker]
2026-02-08T02:32:02,466 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | SimpleBindingDOMCodecFactory | 326 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.20 | Binding/DOM Codec enabled
2026-02-08T02:32:02,471 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiBindingDOMCodec | 327 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.20 | Binding/DOM Codec activated
2026-02-08T02:32:02,481 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DefaultBindingRuntimeGenerator | 329 - org.opendaylight.yangtools.binding-generator - 14.0.20 | Binding/YANG type support activated
2026-02-08T02:32:02,490 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiBindingRuntime | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Binding Runtime activated
2026-02-08T02:32:02,546 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiModelRuntime | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Model Runtime starting
2026-02-08T02:32:02,569 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | KarafFeaturesSupport | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Will attempt to integrate with Karaf FeaturesService
2026-02-08T02:32:03,041 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | NettyTransportSupport | 284 - org.opendaylight.netconf.transport-api - 10.0.2 | Netty transport backed by epoll(2)
2026-02-08T02:32:03,229 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoinNack message from [Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon#-113206415]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550]
2026-02-08T02:32:03,244 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] is JOINING itself (with roles [member-1, dc-default], version [0.0.0]) and forming new cluster
2026-02-08T02:32:03,246 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - is the new leader among reachable nodes (more leaders may exist)
2026-02-08T02:32:03,255 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Up]
2026-02-08T02:32:03,263 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | This node is now the leader responsible for taking SBR decisions among the reachable nodes (more leaders may exist).
2026-02-08T02:32:03,266 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-942546190]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550]
2026-02-08T02:32:03,266 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#-942546190]] (version [1.2.1])
2026-02-08T02:32:03,268 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | SharedEffectiveModelContextFactory | 380 - org.opendaylight.yangtools.yang-parser-impl - 14.0.20 | Using weak references
2026-02-08T02:32:03,368 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.87:2550] is JOINING, roles [member-3, dc-default], version [0.0.0]
2026-02-08T02:32:03,863 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.171.87:2550] to [Up]
2026-02-08T02:32:05,278 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiModuleInfoSnapshotImpl | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | EffectiveModelContext generation 1 activated
2026-02-08T02:32:05,279 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiDOMSchemaService | 252 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 15.0.2 | DOM Schema services activated
2026-02-08T02:32:05,279 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiDOMSchemaService | 252 - org.opendaylight.mdsal.mdsal-dom-schema-osgi - 15.0.2 | Updating context to generation 1
2026-02-08T02:32:05,282 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DOMRpcRouter | 251 - org.opendaylight.mdsal.mdsal-dom-broker - 15.0.2 | DOM RPC/Action router started
2026-02-08T02:32:05,288 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiRemoteOpsProvider | 196 - org.opendaylight.controller.sal-remoterpc-connector - 12.0.3 | Remote Operations service starting
2026-02-08T02:32:05,290 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiRemoteOpsProvider | 196 - org.opendaylight.controller.sal-remoterpc-connector - 12.0.3 | Remote Operations service started
2026-02-08T02:32:06,002 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiBindingRuntimeContextImpl | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | BindingRuntimeContext generation 1 activated
2026-02-08T02:32:06,019 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiBindingDOMCodecServicesImpl | 327 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.20 | Binding/DOM Codec generation 1 activated
2026-02-08T02:32:06,020 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiDatastoreContextIntrospectorFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Datastore Context Introspector activated
2026-02-08T02:32:06,022 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiDistributedDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Distributed Datastore type CONFIGURATION starting
2026-02-08T02:32:06,312 | WARN | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DatastoreContext | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Cannot find policy , will stick with normal
2026-02-08T02:32:06,317 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DistributedDataStoreFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Create data store instance of type : config
2026-02-08T02:32:06,318 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Config file exists - reading config from it
2026-02-08T02:32:06,319 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Config file exists - reading config from it
2026-02-08T02:32:06,323 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Creating ShardManager : shardmanager-config
2026-02-08T02:32:06,350 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Starting ShardManager shard-manager-config
2026-02-08T02:32:06,351 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Recovery complete
2026-02-08T02:32:06,367 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DistributedDataStoreFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Data store config is using tell-based protocol
2026-02-08T02:32:06,370 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Config file exists - reading config from it
2026-02-08T02:32:06,370 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractModuleShardConfigProvider | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Config file exists - reading config from it
2026-02-08T02:32:06,371 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiDistributedDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Distributed Datastore type OPERATIONAL starting
2026-02-08T02:32:06,372 | WARN | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DatastoreContext | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Cannot find policy , will stick with normal
2026-02-08T02:32:06,373 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DistributedDataStoreFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Create data store instance of type : operational
2026-02-08T02:32:06,373 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Creating ShardManager : shardmanager-operational
2026-02-08T02:32:06,378 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.226:2550
2026-02-08T02:32:06,379 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-default-config
2026-02-08T02:32:06,379 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-topology-config
2026-02-08T02:32:06,379 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-inventory-config
2026-02-08T02:32:06,379 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-toaster-config
2026-02-08T02:32:06,385 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Starting ShardManager shard-manager-operational
2026-02-08T02:32:06,386 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Recovery complete
2026-02-08T02:32:06,386 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-1}, address: pekko://opendaylight-cluster-data@10.30.170.226:2550
2026-02-08T02:32:06,387 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-default-operational
2026-02-08T02:32:06,387 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-topology-operational
2026-02-08T02:32:06,387 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-inventory-operational
2026-02-08T02:32:06,387 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-toaster-operational
2026-02-08T02:32:06,387 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DistributedDataStoreFactory | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Data store operational is using tell-based protocol
2026-02-08T02:32:06,388 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550
2026-02-08T02:32:06,388 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational
2026-02-08T02:32:06,388 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational
2026-02-08T02:32:06,388 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-inventory-operational
2026-02-08T02:32:06,388 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-toaster-operational
2026-02-08T02:32:06,391 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | GlobalBindingDOMCodecServices | 327 - org.opendaylight.yangtools.binding-data-codec-osgi - 14.0.20 | Global Binding/DOM Codec activated with generation 1
2026-02-08T02:32:06,399 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiBlockingBindingNormalizer | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter activated
2026-02-08T02:32:06,385 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550
2026-02-08T02:32:06,400 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config
2026-02-08T02:32:06,400 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-topology-config
2026-02-08T02:32:06,400 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config
2026-02-08T02:32:06,400 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-toaster-config
2026-02-08T02:32:06,402 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-default-config
2026-02-08T02:32:06,403 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config
2026-02-08T02:32:06,403 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-topology-config
2026-02-08T02:32:06,403 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-topology-config
2026-02-08T02:32:06,404 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-inventory-config
2026-02-08T02:32:06,404 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config
2026-02-08T02:32:06,405 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-toaster-config
2026-02-08T02:32:06,405 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-toaster-config
2026-02-08T02:32:06,413 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-topology-config: Shard created, persistent : true
2026-02-08T02:32:06,415 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-toaster-config: Shard created, persistent : true
2026-02-08T02:32:06,416 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-inventory-config: Shard created, persistent : true
2026-02-08T02:32:06,416 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-default-config: Shard created, persistent : true
2026-02-08T02:32:06,418 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-default-operational
2026-02-08T02:32:06,418 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational
2026-02-08T02:32:06,416 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-default-operational: Shard created, persistent : false
2026-02-08T02:32:06,420 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-topology-operational
2026-02-08T02:32:06,420 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational
2026-02-08T02:32:06,421 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-inventory-operational
2026-02-08T02:32:06,421 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-inventory-operational
2026-02-08T02:32:06,422 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-1-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-toaster-operational
2026-02-08T02:32:06,422 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-toaster-operational
2026-02-08T02:32:06,422 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-topology-operational: Shard created, persistent : false
2026-02-08T02:32:06,425 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for MountPointService activated
2026-02-08T02:32:06,433 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DOMNotificationRouter | 251 - org.opendaylight.mdsal.mdsal-dom-broker - 15.0.2 | DOM Notification Router started
2026-02-08T02:32:06,436 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.NotificationPublishService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2026-02-08T02:32:06,437 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-toaster-operational: Shard created, persistent : false
2026-02-08T02:32:06,437 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for NotificationService activated
2026-02-08T02:32:06,436 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-inventory-operational: Shard created, persistent : false
2026-02-08T02:32:06,440 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.eos.binding.api.EntityOwnershipService), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2026-02-08T02:32:06,441 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for NotificationPublishService activated
2026-02-08T02:32:06,442 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)]
2026-02-08T02:32:06,442 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.mdsal.binding.api.RpcProviderService)]
2026-02-08T02:32:06,444 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-default-config/member-1-shard-default-config-notifier#-1368059207 created and ready for shard:member-1-shard-default-config
2026-02-08T02:32:06,444 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-toaster-config/member-1-shard-toaster-config-notifier#448851135 created and ready for shard:member-1-shard-toaster-config
2026-02-08T02:32:06,444 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-inventory-config/member-1-shard-inventory-config-notifier#-2105868954 created and ready for shard:member-1-shard-inventory-config
2026-02-08T02:32:06,444 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-inventory-operational/member-1-shard-inventory-operational-notifier#-1439980103 created and ready for shard:member-1-shard-inventory-operational
2026-02-08T02:32:06,444 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-config/member-1-shard-topology-config/member-1-shard-topology-config-notifier#1739638938 created and ready for shard:member-1-shard-topology-config
2026-02-08T02:32:06,443 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for RpcService activated
2026-02-08T02:32:06,445 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-toaster-operational/member-1-shard-toaster-operational-notifier#1201277913 created and ready for shard:member-1-shard-toaster-operational
2026-02-08T02:32:06,445 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | 
RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-default-operational/member-1-shard-default-operational-notifier#-1427631419 created and ready for shard:member-1-shard-default-operational 2026-02-08T02:32:06,446 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.22.3 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)] 2026-02-08T02:32:06,446 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier:pekko://opendaylight-cluster-data@10.30.170.226:2550/user/shardmanager-operational/member-1-shard-topology-operational/member-1-shard-topology-operational-notifier#-491939401 created and ready for shard:member-1-shard-topology-operational 2026-02-08T02:32:06,448 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Starting recovery with journal batch size 1 2026-02-08T02:32:06,448 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Starting recovery with journal batch size 1 2026-02-08T02:32:06,448 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Starting recovery with journal batch size 1 2026-02-08T02:32:06,449 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: 
Starting recovery with journal batch size 1 2026-02-08T02:32:06,449 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Starting recovery with journal batch size 1 2026-02-08T02:32:06,448 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Starting recovery with journal batch size 1 2026-02-08T02:32:06,450 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Starting recovery with journal batch size 1 2026-02-08T02:32:06,451 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for RpcProviderService activated 2026-02-08T02:32:06,456 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActor | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Starting recovery with journal batch size 1 2026-02-08T02:32:06,510 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Singleton manager starting singleton actor [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2026-02-08T02:32:06,511 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClusterSingletonManager | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | ClusterSingletonManager state change [Start -> Oldest] 2026-02-08T02:32:06,522 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is 
waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2026-02-08T02:32:06,522 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.applications.deviceownershipservice.DeviceOwnershipService), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2026-02-08T02:32:06,524 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for ActionService activated 2026-02-08T02:32:06,526 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for ActionProviderService activated 2026-02-08T02:32:06,526 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | DynamicBindingAdapter | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | 8 DOMService trackers started 2026-02-08T02:32:06,527 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.mdsal.binding.api.DataBroker)] 2026-02-08T02:32:06,527 
| INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.mdsal.binding.api.DataBroker), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.yangtools.binding.data.codec.api.BindingNormalizedNodeSerializer)), (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)] 2026-02-08T02:32:06,529 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | GlobalBindingRuntimeContext | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Global BindingRuntimeContext generation 1 activated 2026-02-08T02:32:06,531 | INFO | Start Level: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | OSGiModelRuntime | 334 - org.opendaylight.yangtools.binding-runtime-osgi - 14.0.20 | Model Runtime started 2026-02-08T02:32:06,537 | INFO | Framework Event Dispatcher: Equinox Container: c3eff1d2-8adb-43c3-9ca6-f7b51316f143 | Main | 4 - org.ops4j.pax.logging.pax-logging-api - 2.3.0 | Karaf started in 8s. 
Bundle stats: 396 active, 397 total 2026-02-08T02:32:06,586 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: journal open: applyTo=0 2026-02-08T02:32:06,586 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: journal open: applyTo=0 2026-02-08T02:32:06,586 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: journal open: applyTo=0 2026-02-08T02:32:06,592 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: journal open: applyTo=0 2026-02-08T02:32:06,592 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: journal open: applyTo=0 2026-02-08T02:32:06,593 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: journal open: applyTo=0 2026-02-08T02:32:06,602 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Recovery completed in 1.765 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:32:06,604 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Recovery completed in 4.312 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 
2026-02-08T02:32:06,604 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Recovery completed in 4.242 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:32:06,611 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from null to Follower 2026-02-08T02:32:06,612 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from null to Follower 2026-02-08T02:32:06,607 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: journal open: applyTo=120 2026-02-08T02:32:06,615 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Recovery completed in 11.13 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:32:06,616 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Recovery completed in 16.03 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:32:06,616 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , received 
role change from null to Follower 2026-02-08T02:32:06,616 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , received role change from null to Follower 2026-02-08T02:32:06,616 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from null to Follower 2026-02-08T02:32:06,616 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Recovery completed in 17.12 ms: last log index = -1, last log term = -1, snapshot index = -1, snapshot term = -1, journal size = 0 2026-02-08T02:32:06,617 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-topology-config 2026-02-08T02:32:06,617 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2026-02-08T02:32:06,617 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to 
pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2026-02-08T02:32:06,617 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational 2026-02-08T02:32:06,618 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2026-02-08T02:32:06,618 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from null to Follower 2026-02-08T02:32:06,618 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2026-02-08T02:32:06,618 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2026-02-08T02:32:06,618 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational 
set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational 2026-02-08T02:32:06,618 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2026-02-08T02:32:06,619 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , registered listener pekko://opendaylight-cluster-data/user/shardmanager-operational 2026-02-08T02:32:06,619 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from null to Follower 2026-02-08T02:32:06,619 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from null to Follower 2026-02-08T02:32:06,619 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from null to Follower 2026-02-08T02:32:06,619 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-default-operational from null to Follower 2026-02-08T02:32:06,620 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - 
org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from null to Follower 2026-02-08T02:32:06,620 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-toaster-config 2026-02-08T02:32:06,620 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2026-02-08T02:32:06,621 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from null to Follower 2026-02-08T02:32:06,685 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Recovery completed in 72.27 ms: last log index = 119, last log term = 3, snapshot index = -1, snapshot term = -1, journal size = 120 2026-02-08T02:32:06,687 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , received role change from null to Follower 2026-02-08T02:32:06,687 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to 
pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config 2026-02-08T02:32:06,688 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2026-02-08T02:32:06,688 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-default-config from null to Follower 2026-02-08T02:32:06,809 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | EnabledRaftStorage | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: journal open: applyTo=166063 2026-02-08T02:32:07,098 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Recovery | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Recovery completed in 288.4 ms: last log index = 166062, last log term = 4, snapshot index = 159261, snapshot term = 4, journal size = 6801 2026-02-08T02:32:07,102 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-config , received role change from null to Follower 2026-02-08T02:32:07,102 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config 2026-02-08T02:32:07,103 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 193 
- org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-config , registered listener pekko://opendaylight-cluster-data/user/shardmanager-config 2026-02-08T02:32:07,104 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-inventory-config from null to Follower 2026-02-08T02:32:07,503 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClusterSingletonProxy | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Singleton identified at [pekko://opendaylight-cluster-data/system/singletonManagerOwnerSupervisor/OwnerSupervisor] 2026-02-08T02:32:15,410 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#124400564]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550] 2026-02-08T02:32:15,411 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#124400564]] (version [1.2.1]) 2026-02-08T02:32:15,463 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node [pekko://opendaylight-cluster-data@10.30.170.53:2550] is JOINING, roles 
[member-2, dc-default], version [0.0.0] 2026-02-08T02:32:15,465 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-2126599309] was unhandled. [1] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2026-02-08T02:32:15,466 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | LocalActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ClusterEvent$MemberJoined] to Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#2004727713] was unhandled. [2] dead letters encountered. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2026-02-08T02:32:16,102 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.53:2550] to [Up] 2026-02-08T02:32:16,103 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550 2026-02-08T02:32:16,103 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550 2026-02-08T02:32:16,103 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational 2026-02-08T02:32:16,103 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config 2026-02-08T02:32:16,103 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-operational with address 
pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational
2026-02-08T02:32:16,104 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config
2026-02-08T02:32:16,104 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational
2026-02-08T02:32:16,104 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational
2026-02-08T02:32:16,104 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config
2026-02-08T02:32:16,104 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config
2026-02-08T02:32:16,104 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational
2026-02-08T02:32:16,105 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config
2026-02-08T02:32:16,105 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational
2026-02-08T02:32:16,105 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config
2026-02-08T02:32:16,105 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational
2026-02-08T02:32:16,105 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational
2026-02-08T02:32:16,105 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config
2026-02-08T02:32:16,105 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config
2026-02-08T02:32:16,482 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Follower): Term 4 in "RequestVote{term=4, candidateId=member-2-shard-toaster-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 3 - updating term
2026-02-08T02:32:16,510 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@1aca8ab7
2026-02-08T02:32:16,512 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-toaster-operational status sync done false
2026-02-08T02:32:16,519 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Follower): Term 4 in "RequestVote{term=4, candidateId=member-2-shard-topology-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 3 - updating term
2026-02-08T02:32:16,527 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@31d2f263
2026-02-08T02:32:16,528 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done false
2026-02-08T02:32:16,550 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Follower): Term 5 in "RequestVote{term=5, candidateId=member-2-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 4 - updating term
2026-02-08T02:32:16,561 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Follower): Term 4 in "RequestVote{term=4, candidateId=member-2-shard-default-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 3 - updating term
2026-02-08T02:32:16,562 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Follower): Term 4 in "RequestVote{term=4, candidateId=member-2-shard-toaster-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 3 - updating term
2026-02-08T02:32:16,563 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@af90283
2026-02-08T02:32:16,564 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done false
2026-02-08T02:32:16,569 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Follower): Term 4 in "RequestVote{term=4, candidateId=member-2-shard-inventory-operational, lastLogIndex=-1, lastLogTerm=-1}" message is greater than follower's term 3 - updating term
2026-02-08T02:32:16,574 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@12366a61
2026-02-08T02:32:16,574 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@1dfdb50d
2026-02-08T02:32:16,574 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-toaster-config status sync done false
2026-02-08T02:32:16,575 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done false
2026-02-08T02:32:16,577 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6f849138
2026-02-08T02:32:16,578 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: All Shards are ready - data store operational is ready
2026-02-08T02:32:16,578 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-inventory-operational status sync done false
2026-02-08T02:32:16,581 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OSGiDOMStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Datastore service type OPERATIONAL activated
2026-02-08T02:32:16,581 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | OSGiDistributedDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Distributed Datastore type OPERATIONAL started
2026-02-08T02:32:16,639 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Follower): Term 4 in "RequestVote{term=4, candidateId=member-2-shard-default-config, lastLogIndex=119, lastLogTerm=3}" message is greater than follower's term 3 - updating term
2026-02-08T02:32:16,652 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3aef7c2
2026-02-08T02:32:16,653 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done false
2026-02-08T02:32:16,656 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done true
2026-02-08T02:32:16,989 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Follower): Term 5 in "RequestVote{term=5, candidateId=member-2-shard-inventory-config, lastLogIndex=166062, lastLogTerm=4}" message is greater than follower's term 4 - updating term
2026-02-08T02:32:17,001 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@418b1264
2026-02-08T02:32:17,002 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: All Shards are ready - data store config is ready
2026-02-08T02:32:17,002 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done false
2026-02-08T02:32:17,005 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiDOMStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Datastore service type CONFIGURATION activated
2026-02-08T02:32:17,006 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-inventory-config status sync done true
2026-02-08T02:32:17,024 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-toaster-operational status sync done true
2026-02-08T02:32:17,028 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiClusterAdmin | 192 - org.opendaylight.controller.sal-cluster-admin-impl - 12.0.3 | Cluster Admin services started
2026-02-08T02:32:17,039 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ConcurrentDOMDataBroker | 359 - org.opendaylight.yangtools.util - 14.0.20 | ThreadFactory created: CommitFutures
2026-02-08T02:32:17,041 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | DataBrokerCommitExecutor | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | DOM Data Broker commit exector started
2026-02-08T02:32:17,043 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ConcurrentDOMDataBroker | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | DOM Data Broker started
2026-02-08T02:32:17,047 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractAdaptedService | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding/DOM adapter for DataBroker activated
2026-02-08T02:32:17,046 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.22.3 is waiting for dependencies [(&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.encrypt.AAAEncryptionService)]
2026-02-08T02:32:17,049 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done true
2026-02-08T02:32:17,075 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done true
2026-02-08T02:32:17,087 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done true
2026-02-08T02:32:17,088 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-toaster-config status sync done true
2026-02-08T02:32:17,095 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-inventory-operational status sync done true
2026-02-08T02:32:17,096 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#-137892425], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2026-02-08T02:32:17,097 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#-137892425], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:32:17,106 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#-137892425], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 9.237 ms
2026-02-08T02:32:17,124 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiPasswordServiceConfigBootstrap | 171 - org.opendaylight.aaa.password-service-impl - 0.22.3 | Listening for password service configuration
2026-02-08T02:32:17,125 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker)), (objectClass=org.opendaylight.aaa.api.IIDMStore)]
2026-02-08T02:32:17,128 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.22.3 is waiting for dependencies [Initial app config AaaCertServiceConfig]
2026-02-08T02:32:17,129 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.password.service.PasswordHashService), (objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2026-02-08T02:32:17,131 | ERROR | opendaylight-cluster-data-notification-dispatcher-43 | H2Store | 168 - org.opendaylight.aaa.idm-store-h2 - 0.22.3 | bundle org.opendaylight.aaa.idm-store-h2:0.22.3 (168)[org.opendaylight.aaa.datastore.h2.H2Store(5)] : Constructor argument 0 in class class org.opendaylight.aaa.datastore.h2.H2Store has unsupported type org.opendaylight.aaa.datastore.h2.ConnectionProvider
2026-02-08T02:32:17,133 | INFO | opendaylight-cluster-data-notification-dispatcher-43 | DefaultPasswordHashService | 171 - org.opendaylight.aaa.password-service-impl - 0.22.3 | DefaultPasswordHashService will utilize default iteration count=20000
2026-02-08T02:32:17,134 | INFO | opendaylight-cluster-data-notification-dispatcher-43 | DefaultPasswordHashService | 171 - org.opendaylight.aaa.password-service-impl - 0.22.3 | DefaultPasswordHashService will utilize default algorithm=SHA-512
2026-02-08T02:32:17,134 | INFO | opendaylight-cluster-data-notification-dispatcher-43 | DefaultPasswordHashService | 171 - org.opendaylight.aaa.password-service-impl - 0.22.3 | DefaultPasswordHashService will not utilize a private salt, since none was configured
2026-02-08T02:32:17,145 | INFO | opendaylight-cluster-data-notification-dispatcher-43 | H2Store | 168 - org.opendaylight.aaa.idm-store-h2 - 0.22.3 | H2 IDMStore activated
2026-02-08T02:32:17,146 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.api.PasswordCredentialAuth), (objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2026-02-08T02:32:17,148 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config ShiroConfiguration, (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2026-02-08T02:32:17,197 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | EOSClusterSingletonServiceProvider | 258 - org.opendaylight.mdsal.mdsal-singleton-impl - 15.0.2 | Cluster Singleton Service started
2026-02-08T02:32:17,206 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | YangLibraryWriterSingleton | 292 - org.opendaylight.netconf.yanglib-mdsal-writer - 10.0.2 | ietf-yang-library writer registered
2026-02-08T02:32:17,276 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager), Initial app config DatastoreConfig]
2026-02-08T02:32:17,278 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 is waiting for dependencies [(objectClass=org.opendaylight.aaa.cert.api.ICertificateManager)]
2026-02-08T02:32:17,298 | INFO | Blueprint Extender: 3 | AaaCertMdsalProvider | 164 - org.opendaylight.aaa.cert - 0.22.3 | AaaCertMdsalProvider Initialized
2026-02-08T02:32:17,319 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ArbitratorReconciliationManagerImpl | 297 - org.opendaylight.openflowplugin.applications.arbitratorreconciliation-impl - 0.21.2 | ArbitratorReconciliationManager has started successfully.
2026-02-08T02:32:17,334 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService), (&(|(type=default)(!(type=*)))(objectClass=org.opendaylight.mdsal.binding.api.DataBroker))]
2026-02-08T02:32:17,351 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 is waiting for dependencies [Initial app config LldpSpeakerConfig]
2026-02-08T02:32:17,356 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#1213047828], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2026-02-08T02:32:17,357 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#1213047828], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:32:17,358 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#1213047828], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 1.742 ms
2026-02-08T02:32:17,362 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | DeviceOwnershipService started
2026-02-08T02:32:17,363 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | AAAEncryptionServiceImpl | 166 - org.opendaylight.aaa.encrypt-service-impl - 0.22.3 | AAAEncryptionService activated
2026-02-08T02:32:17,364 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OSGiEncryptionServiceConfigurator | 166 - org.opendaylight.aaa.encrypt-service-impl - 0.22.3 | Encryption Service enabled
2026-02-08T02:32:17,368 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2026-02-08T02:32:17,396 | INFO | Blueprint Extender: 3 | LazyBindingList | 326 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.20 | Using lazy population for lists larger than 16 element(s)
2026-02-08T02:32:17,419 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | DefaultConfigPusher | 302 - org.opendaylight.openflowplugin.applications.of-switch-config-pusher - 0.21.2 | DefaultConfigPusher has started.
2026-02-08T02:32:17,420 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [Initial app config TopologyLldpDiscoveryConfig, (objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2026-02-08T02:32:17,426 | INFO | Blueprint Extender: 1 | LLDPSpeaker | 301 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.21.2 | LLDPSpeaker started, it will send LLDP frames each 5 seconds
2026-02-08T02:32:17,441 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 is waiting for dependencies [(objectClass=org.opendaylight.openflowplugin.api.openflow.configuration.ConfigurationService)]
2026-02-08T02:32:17,447 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | FlowCapableTopologyProvider | 305 - org.opendaylight.openflowplugin.applications.topology-manager - 0.21.2 | Topology Manager service started.
2026-02-08T02:32:17,458 | INFO | Blueprint Extender: 3 | CertificateManagerService | 164 - org.opendaylight.aaa.cert - 0.22.3 | Certificate Manager service has been initialized
2026-02-08T02:32:17,476 | INFO | Blueprint Extender: 3 | CertificateManagerService | 164 - org.opendaylight.aaa.cert - 0.22.3 | AaaCert Rpc Service has been initialized
2026-02-08T02:32:17,490 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.cert/0.22.3 has been started
2026-02-08T02:32:17,496 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.aaa.cert_0.22.3 [164] was successfully created
2026-02-08T02:32:17,501 | INFO | Blueprint Extender: 1 | NodeConnectorInventoryEventTranslator | 301 - org.opendaylight.openflowplugin.applications.lldp-speaker - 0.21.2 | NodeConnectorInventoryEventTranslator has started.
2026-02-08T02:32:17,503 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.lldp-speaker/0.21.2 has been started
2026-02-08T02:32:17,505 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.lldp-speaker_0.21.2 [301] was successfully created
2026-02-08T02:32:17,518 | INFO | Blueprint Extender: 2 | StoreBuilder | 163 - org.opendaylight.aaa.authn-api - 0.22.3 | Checking if default entries must be created in IDM store
2026-02-08T02:32:17,539 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiSwitchConnectionProviders | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | MD-SAL configuration-based SwitchConnectionProviders started
2026-02-08T02:32:17,543 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OSGiSwitchConnectionProviders | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Starting instance of type 'openflow-switch-connection-provider-default-impl'
2026-02-08T02:32:17,566 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Loading properties from '(urn:opendaylight:params:xml:ns:yang:openflow:provider:config?revision=2016-05-10)openflow-provider-config' YANG file
2026-02-08T02:32:17,567 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | rpc-requests-quota configuration property was changed to '20000'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | global-notification-quota configuration property was changed to '64000'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | switch-features-mandatory configuration property was changed to 'false'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | enable-flow-removed-notification configuration property was changed to 'true'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-statistics-rpc-enabled configuration property was changed to 'false'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | barrier-count-limit configuration property was changed to '25600'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | barrier-interval-timeout-limit configuration property was changed to '500'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | echo-reply-timeout configuration property was changed to '2000'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-table-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-flow-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:32:17,568 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-group-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-meter-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-port-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | is-queue-statistics-polling-on configuration property was changed to 'true'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | skip-table-features configuration property was changed to 'true'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | basic-timer-delay configuration property was changed to '3000'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | maximum-timer-delay configuration property was changed to '900000'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | use-single-layer-serialization configuration property was changed to 'true'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | thread-pool-min-threads configuration property was changed to '1'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | thread-pool-max-threads configuration property was changed to '32000'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | thread-pool-timeout configuration property was changed to '60'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | device-connection-rate-limit-per-min configuration property was changed to '0'
2026-02-08T02:32:17,569 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | device-connection-hold-time-in-seconds configuration property was changed to '0'
2026-02-08T02:32:17,570 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | device-datastore-removal-delay configuration property was changed to '500'
2026-02-08T02:32:17,570 | INFO | Blueprint Extender: 3 | OSGiConfigurationServiceFactory | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Loading configuration from 'org.opendaylight.openflowplugin' configuration file
2026-02-08T02:32:17,574 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | felix.fileinstall.filename configuration property was changed to 'file:/tmp/karaf-0.23.1-SNAPSHOT/etc/org.opendaylight.openflowplugin.cfg'
2026-02-08T02:32:17,575 | INFO | Blueprint Extender: 3 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | service.pid configuration property was changed to 'org.opendaylight.openflowplugin'
2026-02-08T02:32:17,620 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | CommandExtension | 121 - org.apache.karaf.shell.core - 4.4.8 | Registering commands for bundle org.opendaylight.openflowplugin.srm-shell/0.21.2
2026-02-08T02:32:17,666 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | org.opendaylight.openflowplugin.applications.frm.impl.ForwardingRulesManagerImpl@6ba43fac was registered as configuration listener to OpenFlowPlugin configuration service
2026-02-08T02:32:17,672 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiFactorySwitchConnectionConfiguration | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}]
2026-02-08T02:32:17,676 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiFactorySwitchConnectionConfiguration | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Checking presence of configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}]
2026-02-08T02:32:17,676 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiDistributedDataStore | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Distributed Datastore type CONFIGURATION started
2026-02-08T02:32:17,685 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiFactorySwitchConnectionConfiguration | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Configuration for 
(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-default-impl}] already present 2026-02-08T02:32:17,686 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | OSGiFactorySwitchConnectionConfiguration | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Configuration for (urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)switch-connection-config[{(urn:opendaylight:params:xml:ns:yang:openflow:switch:connection:config?revision=2016-05-06)instance-name=openflow-switch-connection-provider-legacy-impl}] already present 2026-02-08T02:32:17,740 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | YangLibraryWriter | 292 - org.opendaylight.netconf.yanglib-mdsal-writer - 10.0.2 | ietf-yang-library writer started with modules-state enabled 2026-02-08T02:32:17,775 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | OSGiSwitchConnectionProviders | 317 - org.opendaylight.openflowplugin.openflowjava.blueprint-config - 0.21.2 | Starting instance of type 'openflow-switch-connection-provider-legacy-impl' 2026-02-08T02:32:17,779 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-570663648], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent} 2026-02-08T02:32:17,780 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | 
member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-570663648], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} 2026-02-08T02:32:17,781 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-570663648], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} in 1.362 ms 2026-02-08T02:32:17,856 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | FlowCapableTopologyProvider | 305 - org.opendaylight.openflowplugin.applications.topology-manager - 0.21.2 | Topology node flow:1 is successfully written to the operational datastore. 2026-02-08T02:32:17,857 | INFO | Blueprint Extender: 1 | ForwardingRulesManagerImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | ForwardingRulesManager has started successfully. 
2026-02-08T02:32:17,859 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager/0.21.2 has been started
2026-02-08T02:32:17,863 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.forwardingrules-manager_0.21.2 [300] was successfully created
2026-02-08T02:32:17,919 | INFO | Blueprint Extender: 1 | ConfigurationServiceFactoryImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | org.opendaylight.openflowplugin.applications.topology.lldp.LLDPLinkAger@2ed0a4f2 was registered as configuration listener to OpenFlowPlugin configuration service
2026-02-08T02:32:17,941 | INFO | Blueprint Extender: 2 | StoreBuilder | 163 - org.opendaylight.aaa.authn-api - 0.22.3 | Found default domain in IDM store, skipping insertion of default data
2026-02-08T02:32:17,943 | INFO | Blueprint Extender: 2 | AAAShiroProvider | 173 - org.opendaylight.aaa.shiro - 0.22.3 | AAAShiroProvider Session Initiated
2026-02-08T02:32:17,957 | INFO | Blueprint Extender: 1 | LLDPActivator | 304 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.21.2 | Starting LLDPActivator with lldpSecureKey: aa9251f8-c7c0-4322-b8d6-c3a84593bda3
2026-02-08T02:32:17,958 | INFO | Blueprint Extender: 1 | LLDPActivator | 304 - org.opendaylight.openflowplugin.applications.topology-lldp-discovery - 0.21.2 | LLDPDiscoveryListener started.
2026-02-08T02:32:17,960 | INFO | Blueprint Extender: 1 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery/0.21.2 has been started
2026-02-08T02:32:17,966 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.openflowplugin.applications.topology-lldp-discovery_0.21.2 [304] was successfully created
2026-02-08T02:32:17,986 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#2084222385], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}
2026-02-08T02:32:17,986 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#2084222385], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}}
2026-02-08T02:32:17,987 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#2084222385], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} in 933.4 μs
2026-02-08T02:32:18,028 | INFO | Blueprint Extender: 3 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | OpenFlowPluginProvider started, waiting for onSystemBootReady()
2026-02-08T02:32:18,029 | INFO | Blueprint Extender: 3 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@487a42ee
2026-02-08T02:32:18,029 | INFO | Blueprint Extender: 3 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Added connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@f85ef1f
2026-02-08T02:32:18,034 | INFO | Blueprint Extender: 3 | OnfExtensionProvider | 309 - org.opendaylight.openflowplugin.extension-onf - 0.21.2 | ONF Extension Provider started.
2026-02-08T02:32:18,036 | INFO | Blueprint Extender: 3 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.openflowplugin.impl/0.21.2 has been started
2026-02-08T02:32:18,036 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.openflowplugin.impl_0.21.2 [310] was successfully created
2026-02-08T02:32:18,043 | INFO | Blueprint Extender: 2 | IniSecurityManagerFactory | 172 - org.opendaylight.aaa.repackaged-shiro - 0.22.3 | Realms have been explicitly set on the SecurityManager instance - auto-setting of realms will not occur.
2026-02-08T02:32:18,065 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:32:18,065 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:32:18,070 | INFO | paxweb-config-3-thread-1 | ServerModel | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-11,contextPath='/auth'}
2026-02-08T02:32:18,070 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=173, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}", size=2}
2026-02-08T02:32:18,070 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-11,contextPath='/auth'}
2026-02-08T02:32:18,071 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=173, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@5fd20251{/auth,null,STOPPED}
2026-02-08T02:32:18,072 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@5fd20251{/auth,null,STOPPED}
2026-02-08T02:32:18,074 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2026-02-08T02:32:18,075 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-12,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2}
2026-02-08T02:32:18,075 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:32:18,075 | INFO | Blueprint Extender: 2 | WhiteboardWebServer | 177 - org.opendaylight.aaa.web.osgi-impl - 0.22.3 | Bundle org.opendaylight.aaa.shiro_0.22.3 [173] registered context path /auth with 4 service(s)
2026-02-08T02:32:18,076 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/auth" with default Osgi Context OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=173, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}
2026-02-08T02:32:18,078 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Initializing CustomFilterAdapter
2026-02-08T02:32:18,079 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Injecting a new filter chain with 0 Filters:
2026-02-08T02:32:18,079 | INFO | paxweb-config-3-thread-1 | ContextHandler | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@5fd20251{/auth,null,AVAILABLE}
2026-02-08T02:32:18,080 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-9,name='RealmManagement',path='/auth',bundle=org.opendaylight.aaa.shiro,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=312, osgi.http.whiteboard.context.name=RealmManagement, service.bundleid=173, service.scope=singleton, osgi.http.whiteboard.context.path=/auth}}} as OSGi service for "/auth" context path
2026-02-08T02:32:18,080 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:32:18,081 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2026-02-08T02:32:18,081 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-13,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*, /moon/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=2}
2026-02-08T02:32:18,081 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:32:18,084 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:32:18,084 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2026-02-08T02:32:18,084 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}", size=1}
2026-02-08T02:32:18,085 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-14,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-9,RealmManagement,/auth}]}
2026-02-08T02:32:18,087 | ERROR | Blueprint Extender: 2 | MdsalRestconfServer | 279 - org.opendaylight.netconf.restconf-server-mdsal - 10.0.2 | bundle org.opendaylight.netconf.restconf-server-mdsal:10.0.2 (279)[org.opendaylight.restconf.server.mdsal.MdsalRestconfServer(70)] : Constructor argument 5 in class class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer has unsupported type [Lorg.opendaylight.restconf.server.spi.RpcImplementation;
2026-02-08T02:32:18,234 | INFO | Blueprint Extender: 2 | StoppableHttpServiceFactory | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Binding HTTP Service for bundle: [org.opendaylight.netconf.restconf-server-jaxrs_10.0.2 [278]]
2026-02-08T02:32:18,235 | INFO | paxweb-config-3-thread-1 | ServerModel | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-18,contextPath='/rests'}
2026-02-08T02:32:18,236 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=324, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}", size=2}
2026-02-08T02:32:18,236 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-18,contextPath='/rests'}
2026-02-08T02:32:18,237 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=324, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@6b9bd95a{/rests,null,STOPPED}
2026-02-08T02:32:18,238 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@6b9bd95a{/rests,null,STOPPED}
2026-02-08T02:32:18,238 | INFO | Blueprint Extender: 2 | WhiteboardWebServer | 177 - org.opendaylight.aaa.web.osgi-impl - 0.22.3 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_10.0.2 [278] registered context path /rests with 4 service(s)
2026-02-08T02:32:18,238 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2026-02-08T02:32:18,239 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-19,name='org.opendaylight.aaa.filterchain.filters.CustomFilterAdapter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2}
2026-02-08T02:32:18,239 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:32:18,239 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2026-02-08T02:32:18,239 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/rests" with default Osgi Context OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=324, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}
2026-02-08T02:32:18,240 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Initializing CustomFilterAdapter
2026-02-08T02:32:18,240 | INFO | paxweb-config-3-thread-1 | CustomFilterAdapter | 167 - org.opendaylight.aaa.filterchain - 0.22.3 | Injecting a new filter chain with 0 Filters:
2026-02-08T02:32:18,240 | INFO | paxweb-config-3-thread-1 | ContextHandler | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@6b9bd95a{/rests,null,AVAILABLE}
2026-02-08T02:32:18,241 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-16,name='RESTCONF',path='/rests',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=324, osgi.http.whiteboard.context.name=RESTCONF, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/rests}}} as OSGi service for "/rests" context path
2026-02-08T02:32:18,241 | INFO | Blueprint Extender: 2 | WhiteboardWebServer | 177 - org.opendaylight.aaa.web.osgi-impl - 0.22.3 | Bundle org.opendaylight.netconf.restconf-server-jaxrs_10.0.2 [278] registered context path /.well-known with 3 service(s)
2026-02-08T02:32:18,242 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:32:18,242 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2026-02-08T02:32:18,242 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-20,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=2}
2026-02-08T02:32:18,242 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:32:18,243 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2026-02-08T02:32:18,243 | INFO | Blueprint Extender: 2 | YangLibraryWriterSingleton | 292 - org.opendaylight.netconf.yanglib-mdsal-writer - 10.0.2 | Binding URL provider org.opendaylight.restconf.server.jaxrs.JaxRsYangLibrary@3c6cb445
2026-02-08T02:32:18,243 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:32:18,243 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2026-02-08T02:32:18,243 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}", size=1}
2026-02-08T02:32:18,243 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-21,name='org.glassfish.jersey.servlet.ServletContainer',urlPatterns=[/*],contexts=[{WB,OCM-16,RESTCONF,/rests}]}
2026-02-08T02:32:18,244 | INFO | paxweb-config-3-thread-1 | ServerModel | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Created new ServletContextModel{id=ServletContextModel-27,contextPath='/.well-known'}
2026-02-08T02:32:18,244 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=328, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}", size=2}
2026-02-08T02:32:18,244 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Creating new Jetty context for ServletContextModel{id=ServletContextModel-27,contextPath='/.well-known'}
2026-02-08T02:32:18,245 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=328, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}} to o.o.p.w.s.j.i.PaxWebServletContextHandler@7ba597e{/.well-known,null,STOPPED}
2026-02-08T02:32:18,246 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing default OSGi context model for o.o.p.w.s.j.i.PaxWebServletContextHandler@7ba597e{/.well-known,null,STOPPED}
2026-02-08T02:32:18,246 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering FilterModel{id=FilterModel-25,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}
2026-02-08T02:32:18,246 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of FilterModel{id=FilterModel-25,name='org.opendaylight.aaa.shiro.filters.AAAShiroFilter',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}", size=2}
2026-02-08T02:32:18,246 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /auth
2026-02-08T02:32:18,246 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /rests
2026-02-08T02:32:18,246 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /.well-known
2026-02-08T02:32:18,247 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Starting Jetty context "/.well-known" with default Osgi Context OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=328, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}
2026-02-08T02:32:18,247 | INFO | paxweb-config-3-thread-1 | ContextHandler | 140 - org.eclipse.jetty.util - 9.4.57.v20241219 | Started o.o.p.w.s.j.i.PaxWebServletContextHandler@7ba597e{/.well-known,null,AVAILABLE}
2026-02-08T02:32:18,247 | INFO | paxweb-config-3-thread-1 | OsgiServletContext | 398 - org.ops4j.pax.web.pax-web-spi - 8.0.33 | Registering OsgiServletContext{model=OsgiContextModel{WB,id=OCM-22,name='WellKnownURIs',path='/.well-known',bundle=org.opendaylight.netconf.restconf-server-jaxrs,ref={org.osgi.service.http.context.ServletContextHelper}={service.id=328, osgi.http.whiteboard.context.name=WellKnownURIs, service.bundleid=278, service.scope=singleton, osgi.http.whiteboard.context.path=/.well-known}}} as OSGi service for "/.well-known" context path
2026-02-08T02:32:18,252 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Changing filter configuration for context /
2026-02-08T02:32:18,252 | INFO | paxweb-config-3-thread-1 | HttpServiceEnabled | 397 - org.ops4j.pax.web.pax-web-runtime - 8.0.33 | Registering ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}
2026-02-08T02:32:18,255 | INFO | paxweb-config-3-thread-1 | JettyServerController | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Receiving Batch{"Registration of ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}", size=1}
2026-02-08T02:32:18,255 | INFO | paxweb-config-3-thread-1 | JettyServerWrapper | 395 - org.ops4j.pax.web.pax-web-jetty - 8.0.33 | Adding servlet ServletModel{id=ServletModel-26,name='Rootfound',urlPatterns=[/*],contexts=[{WB,OCM-22,WellKnownURIs,/.well-known}]}
2026-02-08T02:32:18,345 | INFO | Blueprint Extender: 2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4AddressNoZone
2026-02-08T02:32:18,346 | INFO | Blueprint Extender: 2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv4Prefix
2026-02-08T02:32:18,346 | INFO | Blueprint Extender: 2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6AddressNoZone
2026-02-08T02:32:18,347 | INFO | Blueprint Extender: 2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.inet.types.rev130715.Ipv6Prefix
2026-02-08T02:32:18,382 | INFO | Blueprint Extender: 2 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 10.0.2 | Initialized with service class org.opendaylight.restconf.server.mdsal.MdsalRestconfServer
2026-02-08T02:32:18,382 | INFO | Blueprint Extender: 2 | RestconfTransportChannelListener | 276 - org.opendaylight.netconf.restconf-server - 10.0.2 | Initialized with base path: /restconf, default encoding: JSON, default pretty print: false
2026-02-08T02:32:18,435 | INFO | Blueprint Extender: 2 | OSGiNorthbound | 275 - org.opendaylight.netconf.restconf-nb - 10.0.2 | Global RESTCONF northbound pools started
2026-02-08T02:32:18,436 | INFO | Blueprint Extender: 2 | BlueprintContainerImpl | 80 - org.apache.aries.blueprint.core - 1.10.3 | Blueprint bundle org.opendaylight.aaa.shiro/0.22.3 has been started
2026-02-08T02:32:18,437 | INFO | Blueprint Event Dispatcher: 1 | BlueprintBundleTracker | 180 - org.opendaylight.controller.blueprint - 12.0.3 | Blueprint container for bundle org.opendaylight.aaa.shiro_0.22.3 [173] was successfully created
2026-02-08T02:32:18,451 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:32:18,451 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | MdsalRestconfStreamRegistry | 279 - org.opendaylight.netconf.restconf-server-mdsal - 10.0.2 | Cluster leadership acquired – will write OPERATIONAL view
2026-02-08T02:32:18,574 | INFO | SystemReadyService-0 | KarafSystemReady | 242 - org.opendaylight.infrautils.ready-impl - 7.1.9 | checkBundleDiagInfos: Elapsed
time 16s, remaining time 283s, diag: Active {INSTALLED=0, RESOLVED=10, UNKNOWN=0, GRACE_PERIOD=0, WAITING=0, STARTING=0, ACTIVE=396, STOPPING=0, FAILURE=0} 2026-02-08T02:32:18,575 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 241 - org.opendaylight.infrautils.ready-api - 7.1.9 | System ready; AKA: Aye captain, all warp coils are now operating at peak efficiency! [M.] 2026-02-08T02:32:18,575 | INFO | SystemReadyService-0 | SimpleSystemReadyMonitor | 241 - org.opendaylight.infrautils.ready-api - 7.1.9 | Now notifying all its registered SystemReadyListeners... 2026-02-08T02:32:18,575 | INFO | SystemReadyService-0 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | onSystemBootReady() received, starting the switch connections 2026-02-08T02:32:18,676 | INFO | multiThreadIoEventLoopGroup-2-1 | TcpServerFacade | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6653 2026-02-08T02:32:18,678 | INFO | multiThreadIoEventLoopGroup-2-1 | SwitchConnectionProviderImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Started TCP connection on /[0:0:0:0:0:0:0:0]:6653 2026-02-08T02:32:18,680 | INFO | multiThreadIoEventLoopGroup-2-1 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@487a42ee started 2026-02-08T02:32:18,681 | INFO | multiThreadIoEventLoopGroup-4-1 | TcpServerFacade | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Switch listener started and ready to accept incoming TCP/TLS connections on /[0:0:0:0:0:0:0:0]:6633 2026-02-08T02:32:18,681 | INFO | multiThreadIoEventLoopGroup-4-1 | SwitchConnectionProviderImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | 
Started TCP connection on /[0:0:0:0:0:0:0:0]:6633 2026-02-08T02:32:18,682 | INFO | multiThreadIoEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Connection provider org.opendaylight.openflowjava.protocol.impl.core.SwitchConnectionProviderImpl@f85ef1f started 2026-02-08T02:32:18,682 | INFO | multiThreadIoEventLoopGroup-4-1 | OpenFlowPluginProviderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | All switchConnectionProviders are up and running (2). 2026-02-08T02:32:28,029 | INFO | qtp1455505446-360 | AuthenticationManager | 175 - org.opendaylight.aaa.tokenauthrealm - 0.22.3 | Authentication is now enabled 2026-02-08T02:32:28,029 | INFO | qtp1455505446-360 | AuthenticationManager | 175 - org.opendaylight.aaa.tokenauthrealm - 0.22.3 | Authentication Manager activated 2026-02-08T02:32:29,443 | INFO | qtp1455505446-360 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 10.0.2 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2026-02-08T02:32:29,446 | INFO | qtp1455505446-360 | JaxRsRestconf | 278 - org.opendaylight.netconf.restconf-server-jaxrs - 10.0.2 | RESTCONF data-missing condition is reported as HTTP status 409 (RFC8040) 2026-02-08T02:32:29,658 | INFO | qtp1455505446-360 | ApiPathParser | 273 - org.opendaylight.netconf.restconf-api - 10.0.2 | Consecutive slashes in REST URLs will be rejected 2026-02-08T02:32:33,279 | INFO | sshd-SshServer[4d81e27f](port=8101)-nio2-thread-2 | ServerSessionImpl | 126 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.170:43384 authenticated 2026-02-08T02:32:33,932 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Cluster Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single 
Switch.Verify Data Recovery After Cluster Restart 2026-02-08T02:32:35,733 | INFO | qtp1455505446-362 | StaticConfiguration | 244 - org.opendaylight.mdsal.binding-dom-adapter - 15.0.2 | Binding-over-DOM codec shortcuts are enabled 2026-02-08T02:32:35,752 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#-1774931601], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent} 2026-02-08T02:32:35,753 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolving connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=1} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#-1774931601], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2026-02-08T02:32:35,753 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: replaced connection ConnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=1} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=1, 
backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#-1774931601], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} in 447.8 μs 2026-02-08T02:32:38,675 | INFO | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Total Flows read: 1000 2026-02-08T02:32:39,583 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node1 2026-02-08T02:32:42,367 | INFO | multiThreadIoEventLoopGroup-5-1 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:48060, NodeId:null 2026-02-08T02:32:42,461 | INFO | multiThreadIoEventLoopGroup-5-2 | ConnectionAdapterImpl | 319 - org.opendaylight.openflowplugin.openflowjava.openflow-protocol-impl - 0.21.2 | Hello received 2026-02-08T02:32:42,470 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connected. 2026-02-08T02:32:42,470 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | No context chain found for device: openflow:1, creating new. 
2026-02-08T02:32:42,471 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Device connected to controller, Device:/10.30.170.67:48066, NodeId:Uri{value=openflow:1}
2026-02-08T02:32:42,523 | INFO | multiThreadIoEventLoopGroup-5-2 | RoleContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started timer for setting SLAVE role on device openflow:1 if no role will be set in 20s.
2026-02-08T02:32:42,571 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:32:42,652 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_GRANTED [wasOwner=false, isOwner=true, hasOwner=true]
2026-02-08T02:32:42,654 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting DeviceContextImpl[NEW] service for node openflow:1
2026-02-08T02:32:42,665 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Cluster Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Cluster Restart
2026-02-08T02:32:42,678 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RpcContextImpl[NEW] service for node openflow:1
2026-02-08T02:32:42,729 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting StatisticsContextImpl[NEW] service for node openflow:1
2026-02-08T02:32:42,730 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting RoleContextImpl[NEW] service for node openflow:1
2026-02-08T02:32:42,732 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | SetRole called with input:SetRoleInput{controllerRole=BECOMEMASTER, node=NodeRef{value=DataObjectIdentifier[ @ urn.opendaylight.inventory.rev130819.Nodes ... nodes.Node[NodeKey{id=Uri{value=openflow:1}}] ]}}
2026-02-08T02:32:42,732 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Requesting state change to BECOMEMASTER
2026-02-08T02:32:42,732 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | SalRoleRpc | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | RoleChangeTask called on device:openflow:1 OFPRole:BECOMEMASTER
2026-02-08T02:32:42,732 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | getGenerationIdFromDevice called for device: openflow:1
2026-02-08T02:32:42,737 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Started clustering services for node openflow:1
2026-02-08T02:32:42,739 | INFO | multiThreadIoEventLoopGroup-5-2 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange called for device:Uri{value=openflow:1}, role:BECOMEMASTER
2026-02-08T02:32:42,741 | INFO | multiThreadIoEventLoopGroup-5-2 | RoleService | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | submitRoleChange onSuccess for device:Uri{value=openflow:1}, role:BECOMEMASTER
2026-02-08T02:32:42,746 | INFO | ofppool-0 | FlowNodeReconciliationImpl | 300 - org.opendaylight.openflowplugin.applications.forwardingrules-manager - 0.21.2 | Triggering reconciliation for device NodeKey{id=Uri{value=openflow:1}}
2026-02-08T02:32:42,784 | INFO | pool-15-thread-1 | LazyBindingMap | 326 - org.opendaylight.yangtools.binding-data-codec-dynamic - 14.0.20 | Using lazy population for maps larger than 1 element(s)
2026-02-08T02:32:42,954 | INFO | pool-15-thread-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 connection is enabled by reconciliation framework.
2026-02-08T02:32:42,986 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | IP address of the node openflow:1 is: IpAddress{ipv4Address=Ipv4Address{value=10.30.170.67}}
2026-02-08T02:32:42,986 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceInitializationUtil | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Port number of the node openflow:1 is: 48066
2026-02-08T02:32:43,239 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPMETERFEATURES collected
2026-02-08T02:32:43,243 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPGROUPFEATURES collected
2026-02-08T02:32:43,259 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.MacAddress
2026-02-08T02:32:43,260 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.PhysAddress
2026-02-08T02:32:43,260 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.HexString
2026-02-08T02:32:43,260 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.DottedQuad
2026-02-08T02:32:43,261 | INFO | multiThreadIoEventLoopGroup-5-2 | StringValueObjectFactory | 332 - org.opendaylight.yangtools.binding-reflect - 14.0.20 | Instantiated factory for class org.opendaylight.yang.gen.v1.urn.ietf.params.xml.ns.yang.ietf.yang.types.rev130715.Uuid
2026-02-08T02:32:43,265 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 info: OFPMPPORTDESC collected
2026-02-08T02:32:43,294 | INFO | multiThreadIoEventLoopGroup-5-2 | OF13DeviceInitializer | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Static node openflow:1 successfully finished collecting
2026-02-08T02:32:43,346 | INFO | pool-15-thread-1 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 is able to work as master
2026-02-08T02:32:43,346 | INFO | pool-15-thread-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role MASTER was granted to device openflow:1
2026-02-08T02:32:43,347 | INFO | pool-15-thread-1 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node added notification for Uri{value=openflow:1}
2026-02-08T02:32:43,349 | INFO | pool-15-thread-1 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Starting statistics gathering for node openflow:1
2026-02-08T02:32:43,353 | INFO | opendaylight-cluster-data-notification-dispatcher-44 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1
2026-02-08T02:32:45,279 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node1
2026-02-08T02:32:45,729 | INFO | multiThreadIoEventLoopGroup-5-2 | SystemNotificationsListenerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | ConnectionEvent: Connection closed by device, Device:/10.30.170.67:48066, NodeId:openflow:1
2026-02-08T02:32:45,730 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Device openflow:1 disconnected.
2026-02-08T02:32:45,730 | INFO | multiThreadIoEventLoopGroup-5-2 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1}
2026-02-08T02:32:45,737 | INFO | multiThreadIoEventLoopGroup-5-2 | DeviceManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Publishing node removed notification for Uri{value=openflow:1}
2026-02-08T02:32:45,741 | INFO | multiThreadIoEventLoopGroup-5-2 | ReconciliationManagerImpl | 303 - org.opendaylight.openflowplugin.applications.reconciliation-framework - 0.21.2 | Stopping reconciliation for node Uri{value=openflow:1}
2026-02-08T02:32:45,741 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Role SLAVE was granted to device openflow:1
2026-02-08T02:32:45,742 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RoleContextImpl[RUNNING] service for node openflow:1
2026-02-08T02:32:45,742 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping StatisticsContextImpl[RUNNING] service for node openflow:1
2026-02-08T02:32:45,742 | INFO | multiThreadIoEventLoopGroup-5-2 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1
2026-02-08T02:32:45,744 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping RpcContextImpl[RUNNING] service for node openflow:1
2026-02-08T02:32:45,747 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping DeviceContextImpl[RUNNING] service for node openflow:1
2026-02-08T02:32:45,751 | INFO | ofppool-0 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services for node openflow:1
2026-02-08T02:32:45,751 | INFO | multiThreadIoEventLoopGroup-5-2 | ContextChainImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Closed clustering services registration for node openflow:1
2026-02-08T02:32:45,752 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating DeviceContextImpl[TERMINATED] service for node openflow:1
2026-02-08T02:32:45,755 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RpcContextImpl[TERMINATED] service for node openflow:1
2026-02-08T02:32:45,755 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating StatisticsContextImpl[TERMINATED] service for node openflow:1
2026-02-08T02:32:45,755 | INFO | multiThreadIoEventLoopGroup-5-2 | StatisticsContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Stopping running statistics gathering for node openflow:1
2026-02-08T02:32:45,758 | INFO | multiThreadIoEventLoopGroup-5-2 | GuardedContextImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Terminating RoleContextImpl[TERMINATED] service for node openflow:1
2026-02-08T02:32:45,811 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false]
2026-02-08T02:32:45,811 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : LOCAL_OWNERSHIP_LOST_NO_OWNER [wasOwner=true, isOwner=false, hasOwner=false]
2026-02-08T02:32:46,328 | INFO | node-cleaner-0 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS
2026-02-08T02:32:47,766 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node1" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node1
2026-02-08T02:32:48,122 | INFO | qtp1455505446-362 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl
2026-02-08T02:32:48,123 | INFO | qtp1455505446-362 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl
2026-02-08T02:32:48,124 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Number of Txn for dpId: openflow:1 is: 1
2026-02-08T02:32:48,125 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Creating new txChain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@42807273 for dpid: openflow:1
2026-02-08T02:32:48,137 | INFO | ForkJoinPool-9-worker-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed FlowHandlerTask thread for dpid: openflow:1
2026-02-08T02:32:48,161 | INFO | CommitFutures-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Completed all flows installation for: dpid: openflow:1 in 806256510541ns
2026-02-08T02:32:48,161 | INFO | CommitFutures-1 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Transaction chain: org.opendaylight.mdsal.binding.dom.adapter.BindingDOMTransactionChainAdapter@42807273 closed successfully.
2026-02-08T02:32:48,328 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster
2026-02-08T02:32:51,760 | INFO | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Total Flows read: 0
2026-02-08T02:32:52,366 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Leader Before Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Leader Before Leader Restart
2026-02-08T02:32:53,794 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Leader
2026-02-08T02:32:56,872 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Leader
2026-02-08T02:32:56,893 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:32:57,131 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:32:57,494 | INFO | opendaylight-cluster-data-notification-dispatcher-46 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1
2026-02-08T02:32:58,452 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Leader Restart
2026-02-08T02:33:01,134 | INFO | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Total Flows read: 1000
2026-02-08T02:33:02,511 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Leader Restart
2026-02-08T02:33:02,982 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Leader From Cluster Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Leader From Cluster Node
2026-02-08T02:33:03,301 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL2 10.30.170.53" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL2 10.30.170.53
2026-02-08T02:33:06,519 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node
2026-02-08T02:33:08,136 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.170.53:2550, Up)].
2026-02-08T02:33:08,143 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:33:08,143 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Received UnreachableMember: memberName MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:33:08,143 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR found unreachable members, waiting for stable-after = 7000 ms before taking downing decision. Now 1 unreachable members found. Downing decision will not be made before 2026-02-08T02:33:15.143301388Z.
2026-02-08T02:33:08,149 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#-137892425], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#-137892425], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:33:08,149 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#1213047828], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#1213047828], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:33:08,150 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: refreshing backend for shard 0
2026-02-08T02:33:08,150 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: refreshing backend for shard 0
2026-02-08T02:33:08,151 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-570663648], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-570663648], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}}
2026-02-08T02:33:08,151 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 |
member-1-frontend-datastore-operational: refreshing backend for shard 1 2026-02-08T02:33:08,151 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#2084222385], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#2084222385], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} 2026-02-08T02:33:08,151 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: refreshing backend for shard 2 2026-02-08T02:33:08,152 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#-1774931601], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} reconnecting as 
ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config#-1774931601], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=inventory, dataTree=absent}} 2026-02-08T02:33:08,152 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: refreshing backend for shard 1 2026-02-08T02:33:11,373 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2026-02-08T02:33:11,373 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. 
This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2026-02-08T02:33:11,374 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-2126599309] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 2026-02-08T02:33:11,374 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'. 
2026-02-08T02:33:11,374 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:33:11,374 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:33:11,375 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-2126599309] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:33:11,375 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#2004727713] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:33:11,422 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:33:13,412 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:33:13,422 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:33:13,431 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:33:13,441 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:33:13,441 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:33:13,472 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:33:13,501 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.170.53:2550 is unreachable
2026-02-08T02:33:13,680 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate): Starting new election term 5
2026-02-08T02:33:13,680 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Starting new election term 6
2026-02-08T02:33:13,680 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2026-02-08T02:33:13,680 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate): Starting new election term 5
2026-02-08T02:33:13,681 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 6
2026-02-08T02:33:13,681 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2026-02-08T02:33:13,681 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Candidate): Starting new election term 5
2026-02-08T02:33:13,681 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Follower to Candidate
2026-02-08T02:33:13,681 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@78265932
2026-02-08T02:33:13,682 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Follower to Candidate
2026-02-08T02:33:13,682 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@3db67011
2026-02-08T02:33:13,682 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Follower to Candidate
2026-02-08T02:33:13,682 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate): Starting new election term 5
2026-02-08T02:33:13,682 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate): Starting new election term 5
2026-02-08T02:33:13,683 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2026-02-08T02:33:13,683 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2026-02-08T02:33:13,683 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@5df8a1f3
2026-02-08T02:33:13,683 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Follower to Candidate
2026-02-08T02:33:13,681 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2026-02-08T02:33:13,682 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Follower to Candidate
2026-02-08T02:33:13,683 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Follower to Candidate
2026-02-08T02:33:13,683 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , received role change from Follower to Candidate
2026-02-08T02:33:13,682 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Follower to Candidate
2026-02-08T02:33:13,684 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@5e5c32a7
2026-02-08T02:33:13,684 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@147c1fd7
2026-02-08T02:33:13,684 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@4c2f3905
2026-02-08T02:33:13,684 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from Follower to Candidate
2026-02-08T02:33:13,684 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from Follower to Candidate
2026-02-08T02:33:13,685 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from Follower to Candidate
2026-02-08T02:33:13,685 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-default-config from Follower to Candidate
2026-02-08T02:33:13,685 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate): Starting new election term 5
2026-02-08T02:33:13,685 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Follower) :- Switching from behavior Follower to Candidate, election term: 5
2026-02-08T02:33:13,685 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Follower to Candidate
2026-02-08T02:33:13,685 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@c3b35c9
2026-02-08T02:33:13,686 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Follower to Candidate
2026-02-08T02:33:15,288 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR took decision DownUnreachable and is downing [pekko://opendaylight-cluster-data@10.30.170.53:2550], [1] unreachable of [3] members, all members in DC [Member(pekko://opendaylight-cluster-data@10.30.170.226:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.170.53:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.87:2550, Up)], full reachability status: [pekko://opendaylight-cluster-data@10.30.170.226:2550 -> pekko://opendaylight-cluster-data@10.30.170.53:2550: Unreachable [Unreachable] (1), pekko://opendaylight-cluster-data@10.30.171.87:2550 -> pekko://opendaylight-cluster-data@10.30.170.53:2550: Unreachable [Unreachable] (1)]
2026-02-08T02:33:15,288 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR is downing [UniqueAddress(pekko://opendaylight-cluster-data@10.30.170.53:2550,-1600513275839146841)]
2026-02-08T02:33:15,290 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking unreachable node [pekko://opendaylight-cluster-data@10.30.170.53:2550] as [Down]
2026-02-08T02:33:15,291 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2026-02-08T02:33:22.291647410Z.
2026-02-08T02:33:16,293 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is removing unreachable node [pekko://opendaylight-cluster-data@10.30.170.53:2550]
2026-02-08T02:33:16,296 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:33:16,297 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:33:16,297 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Association to [pekko://opendaylight-cluster-data@10.30.170.53:2550] with UID [-1600513275839146841] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. Reason: Cluster member removed, previous status [Down]
2026-02-08T02:33:18,938 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Leader from Cluster Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Leader from Cluster Node
2026-02-08T02:33:19,187 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL2 10.30.170.53" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting ODL2 10.30.170.53
2026-02-08T02:33:23,819 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Candidate): Starting new election term 6
2026-02-08T02:33:23,837 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.170.53:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.170.53/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:33:23,941 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate): Starting new election term 6
2026-02-08T02:33:23,941 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate): Starting new election term 6
2026-02-08T02:33:23,941 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate): Starting new election term 6
2026-02-08T02:33:24,056 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Starting new election term 7
2026-02-08T02:33:24,056 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate): Starting new election term 6
2026-02-08T02:33:24,058 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate): Starting new election term 6
2026-02-08T02:33:25,415 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#511174323]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550]
2026-02-08T02:33:25,415 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to [Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#511174323]] (version [1.2.1])
2026-02-08T02:33:26,493 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.170.53:2550] to [Up]
2026-02-08T02:33:26,495 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:33:26,495 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-2}, address: pekko://opendaylight-cluster-data@10.30.170.53:2550
2026-02-08T02:33:26,497 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational
2026-02-08T02:33:26,498 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational
2026-02-08T02:33:26,498 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational
2026-02-08T02:33:26,498 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Peer address for peer member-2-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational
2026-02-08T02:33:26,499 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Peer address for peer member-2-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational
2026-02-08T02:33:26,496 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-default-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config
2026-02-08T02:33:26,498 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Peer address for peer member-2-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational
2026-02-08T02:33:26,498 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational
2026-02-08T02:33:26,499 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config
2026-02-08T02:33:26,499 | INFO | opendaylight-cluster-data-shard-dispatcher-38
| PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Peer address for peer member-2-shard-default-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config 2026-02-08T02:33:26,499 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-inventory-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config 2026-02-08T02:33:26,499 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Peer address for peer member-2-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-topology-config 2026-02-08T02:33:26,499 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-2-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config 2026-02-08T02:33:26,499 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Peer address for peer member-2-shard-toaster-operational set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-toaster-operational 2026-02-08T02:33:26,500 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Peer address for peer member-2-shard-toaster-config set to 
pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-toaster-config 2026-02-08T02:33:28,165 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] 
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-topology-operational currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:33:28,171 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-operational currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] 
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:33:28,165 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] 
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-default-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:33:28,171 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:33:28,165 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-default-operational currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:33:29,941 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2026-02-08T02:33:29,942 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2026-02-08T02:33:30,449 | INFO | node-cleaner-0 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS 2026-02-08T02:33:34,014 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Candidate): Starting new election 
term 7
2026-02-08T02:33:34,112 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate): Starting new election term 7
2026-02-08T02:33:34,299 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate): Starting new election term 7
2026-02-08T02:33:34,299 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate): Starting new election term 7
2026-02-08T02:33:34,302 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate): Starting new election term 7
2026-02-08T02:33:34,302 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Starting new election term 8
2026-02-08T02:33:34,303 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate): Starting new election term 7
2026-02-08T02:33:34,821 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 7
2026-02-08T02:33:34,821 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@6f09be75
2026-02-08T02:33:34,823 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-operational , received role change from Candidate to Leader
2026-02-08T02:33:34,823 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-toaster-operational from Candidate to Leader
2026-02-08T02:33:34,929 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Candidate) :- Switching from behavior Candidate to Leader, election term: 7
2026-02-08T02:33:34,930 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@716119cc
2026-02-08T02:33:34,933 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-inventory-operational , received role change from Candidate to Leader
2026-02-08T02:33:34,934 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-inventory-operational from Candidate to Leader
2026-02-08T02:33:34,935 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=7, success=false, followerId=member-2-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 58, snapshotTerm: 4, replicatedToAllIndex: -1
2026-02-08T02:33:34,935 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): follower member-2-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0
2026-02-08T02:33:34,935 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): Initiating install snapshot to follower member-2-shard-inventory-operational: follower nextIndex: 0, leader snapshotIndex: 58, leader lastIndex: 59, leader log size: 1
2026-02-08T02:33:34,942 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=59, lastAppliedTerm=4, lastIndex=59, lastTerm=4, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-2-shard-inventory-operational
2026-02-08T02:33:34,947 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 2 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1144303271], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}
2026-02-08T02:33:34,948 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#2084222385], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1144303271], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}}
2026-02-08T02:33:34,956 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Persising snapshot at EntryInfo[index=59, term=4]/EntryInfo[index=59, term=4]
2026-02-08T02:33:34,957 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 58 and term: 4
2026-02-08T02:33:34,961 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-inventory-operational#2084222385], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=2, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1144303271], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=2, shard=inventory, dataTree=present}} in 12.01 ms
2026-02-08T02:33:35,022 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: snapshot is durable as of 2026-02-08T02:33:34.957106271Z
2026-02-08T02:33:35,102 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate): New Leader member-3-shard-default-config sent an AppendEntries to Candidate for term 7 - will switch to Follower
2026-02-08T02:33:35,103 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 7
2026-02-08T02:33:35,103 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-config , received role change from Candidate to Follower
2026-02-08T02:33:35,104 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-default-config from Candidate to Follower
2026-02-08T02:33:35,138 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): New Leader member-3-shard-topology-config sent an AppendEntries to Candidate for term 8 - will switch to Follower
2026-02-08T02:33:35,138 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate): New Leader member-3-shard-topology-operational sent an AppendEntries to Candidate for term 7 - will switch to Follower
2026-02-08T02:33:35,138 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 8
2026-02-08T02:33:35,138 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 7
2026-02-08T02:33:35,138 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Candidate to Follower
2026-02-08T02:33:35,139 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from Candidate to Follower
2026-02-08T02:33:35,138 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-operational , received role change from Candidate to Follower
2026-02-08T02:33:35,139 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate): New Leader member-3-shard-default-operational sent an AppendEntries to Candidate for term 7 - will switch to Follower
2026-02-08T02:33:35,139 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-topology-operational from Candidate to Follower
2026-02-08T02:33:35,139 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Candidate) :- Switching from behavior Candidate to Follower, election term: 7
2026-02-08T02:33:35,139 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-default-operational , received role change from Candidate to Follower
2026-02-08T02:33:35,140 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received role changed for member-1-shard-default-operational from Candidate to Follower
2026-02-08T02:33:35,140 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate): New Leader member-3-shard-toaster-config sent an AppendEntries to Candidate for term 7 - will switch to Follower
2026-02-08T02:33:35,140 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 7
2026-02-08T02:33:35,140 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Candidate to Follower
2026-02-08T02:33:35,140 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Candidate to Follower
2026-02-08T02:33:35,144 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): Snapshot successfully installed on follower member-2-shard-inventory-operational (last chunk 1) - matchIndex set to 59, nextIndex set to 60
2026-02-08T02:33:35,612 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@688414d0
2026-02-08T02:33:35,612 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done false
2026-02-08T02:33:35,614 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done true
2026-02-08T02:33:35,619 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config#1899889342], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2026-02-08T02:33:35,620 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#-137892425], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config#1899889342], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:33:35,621 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#-137892425], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config#1899889342], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 605.2 μs
2026-02-08T02:33:35,651 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done false
2026-02-08T02:33:35,652 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done false
2026-02-08T02:33:35,652 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@1f83f2d6
2026-02-08T02:33:35,652 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-toaster-config status sync done false
2026-02-08T02:33:35,652 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@171c256f
2026-02-08T02:33:35,653 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: All Shards are ready - data store operational is ready
2026-02-08T02:33:35,652 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@34ea02b2
2026-02-08T02:33:35,654 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@7dbf1e22
2026-02-08T02:33:35,654 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done false
2026-02-08T02:33:35,654 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done true
2026-02-08T02:33:35,654 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done true
2026-02-08T02:33:35,655 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational#-1538773787], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2026-02-08T02:33:35,655 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#1213047828], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational#-1538773787], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:33:35,656 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#1213047828], sessionId=0, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational#-1538773787], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 463.4 μs
2026-02-08T02:33:35,656 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational#662716946], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}
2026-02-08T02:33:35,656 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-570663648], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational#662716946], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}}
2026-02-08T02:33:35,658 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#-570663648], sessionId=1, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational#662716946], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} in 1.459 ms
2026-02-08T02:33:36,172 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-toaster-config status sync done true
2026-02-08T02:33:36,173 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done true
2026-02-08T02:33:41,629 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=179056, lastAppliedTerm=5, lastIndex=180045, lastTerm=5, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=989, mandatoryTrim=false]
2026-02-08T02:33:41,634 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=179056, term=5]/EntryInfo[index=180045, term=5]
2026-02-08T02:33:41,634 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 178081 and term: 5
2026-02-08T02:33:41,654 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:33:41.634576199Z
2026-02-08T02:33:45,340 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config (Follower): Term 6 in "RequestVote{term=6, candidateId=member-2-shard-inventory-config, lastLogIndex=182708, lastLogTerm=5}" message is greater than follower's term 5 - updating term
2026-02-08T02:33:45,356 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Peer address for peer member-2-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-inventory-config
2026-02-08T02:33:46,318 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-inventory-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=3}, nanosAgo=11384419799, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-2-frontend-datastore-operational, generation=4}
2026-02-08T02:33:49,212 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:33:54,885 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Leader Restart
2026-02-08T02:34:10,251 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:34:31,291 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:34:52,331 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:35:13,372 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:35:34,411 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:35:55,452 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:35:55,993 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3013-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.026335975 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3013-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.026335975 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3013-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.026335975 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:36:16,491 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:36:37,531 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:36:58,571 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:37:19,611 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:37:38,592 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3014-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.029977419 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3014-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.029977419 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3014-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.029977419 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:37:40,651 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:37:56,022 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3015-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, 
path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.025266412 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3015-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.025266412 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3015-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.025266412 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:38:01,691 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:38:22,732 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] 
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:38:43,772 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:39:04,812 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:39:21,132 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3016-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.023066711 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3016-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.023066711 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3016-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.023066711 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:39:25,852 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:39:38,623 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3017-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.027389566 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3017-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.027389566 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3017-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.027389566 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:39:46,891 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:39:56,042 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3018-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.01698686 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3018-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.01698686 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3018-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.01698686 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:40:07,931 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:40:28,971 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:40:44,069 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Leader" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Leader 2026-02-08T02:40:47,287 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Leader Restart 2026-02-08T02:40:47,771 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | 
Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2026-02-08T02:40:47,891 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2026-02-08T02:40:48,697 | INFO | opendaylight-cluster-data-notification-dispatcher-51 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1 2026-02-08T02:40:49,813 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node After Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Leader Node After Leader Restart 2026-02-08T02:40:50,011 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] 
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] 
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:40:50,332 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2026-02-08T02:40:50,332 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2026-02-08T02:40:50,840 | INFO | node-cleaner-1 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS 2026-02-08T02:40:52,333 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Leader Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Leader Node 2026-02-08T02:40:53,895 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Leader Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Leader Restart 2026-02-08T02:41:03,502 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path 
/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3019-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.020642272 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] 
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3019-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.020642272 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3019-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.1}]} timed out after 120.020642272 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:41:11,051 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] 
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:41:21,161 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3020-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026463588 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3020-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026463588 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3020-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026463588 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:41:32,092 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:41:38,652 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3021-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026179274 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	...
5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3021-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026179274 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3021-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026179274 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:41:53,132 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	...
5 more
2026-02-08T02:41:56,062 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3022-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.015604616 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	...
5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3022-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.015604616 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3022-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.015604616 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:42:14,172 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	...
5 more
2026-02-08T02:42:35,211 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:42:35,454 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before follower Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Inventory Follower Before follower Restart 2026-02-08T02:42:36,625 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Connect To Follower Node2 2026-02-08T02:42:39,787 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower Node2" | 
core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Add Bulk Flow From Follower Node2
2026-02-08T02:42:40,312 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:42:40,671 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true]
2026-02-08T02:42:56,252 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
... 5 more
2026-02-08T02:43:03,532 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3023-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026060077 seconds. The backend for inventory is not available.]]}
at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3023-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026060077 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3023-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.2}]} timed out after 120.026060077 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
2026-02-08T02:43:17,291 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
... 5 more
2026-02-08T02:43:21,182 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3024-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.017908613 seconds. The backend for inventory is not available.]]}
at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3024-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.017908613 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3024-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.017908613 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
2026-02-08T02:43:38,331 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
... 5 more
2026-02-08T02:43:38,671 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3025-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.015579032 seconds. The backend for inventory is not available.]]}
at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3025-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.015579032 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3025-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.015579032 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
2026-02-08T02:43:56,084 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3026-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.018073415 seconds. The backend for inventory is not available.]]}
at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3026-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.018073415 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3026-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.018073415 seconds. The backend for inventory is not available.
at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
... 5 more
2026-02-08T02:43:59,371 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
... 5 more
2026-02-08T02:44:20,412 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:44:21,189 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Follower Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Get Bulk Flows and Verify In Cluster Before Follower Restart 2026-02-08T02:44:41,078 | INFO | opendaylight-cluster-data-notification-dispatcher-57 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1 2026-02-08T02:44:41,452 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at 
org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] 
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:45:02,492 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:45:03,562 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3027-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026052917 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3027-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026052917 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3027-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.3}]} timed out after 120.026052917 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more 2026-02-08T02:45:21,212 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3028-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.026716403 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3028-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.026716403 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3028-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.026716403 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:45:23,532 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:45:38,692 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3029-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.018041537 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3029-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.018041537 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3029-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.018041537 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:45:44,571 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:45:56,114 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3030-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.024790629 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3030-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.024790629 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3030-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.024790629 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:46:02,524 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Follower Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch Before Follower Restart
2026-02-08T02:46:05,602 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:46:26,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:46:47,682 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:47:03,592 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}], severity=ERROR, errorType=APPLICATION, 
tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3031-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.027394227 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3031-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.027394227 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3031-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.4}]} timed out after 120.027394227 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:47:08,722 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:47:21,241 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3032-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, 
path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.026959669 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3032-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.026959669 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3032-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.026959669 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:47:29,761 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:47:38,722 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3033-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.026538567 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3033-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.026538567 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3033-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.026538567 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:47:44,891 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Follower Node2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Kill Follower Node2
2026-02-08T02:47:45,186 | INFO | pipe-log:log "ROBOT MESSAGE: Killing ODL3 10.30.171.87" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Killing ODL3 10.30.171.87
2026-02-08T02:47:49,106 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 and Exit" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2 and Exit
2026-02-08T02:47:50,270 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550
2026-02-08T02:47:50,270 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Received UnreachableMember: memberName MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550
2026-02-08T02:47:50,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config#1899889342], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config#1899889342], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:47:50,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR found unreachable members, waiting for stable-after = 7000 ms before taking downing decision. Now 1 unreachable members found. Downing decision will not be made before 2026-02-08T02:47:57.271530923Z.
2026-02-08T02:47:50,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: refreshing backend for shard 0
2026-02-08T02:47:50,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational#-1538773787], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational#-1538773787], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:47:50,272 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: refreshing backend for shard 0
2026-02-08T02:47:50,273 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: connection ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational#662716946], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} reconnecting as ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational#662716946], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}}
2026-02-08T02:47:50,273 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: refreshing backend for shard 1
2026-02-08T02:47:50,552 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking node as UNREACHABLE [Member(pekko://opendaylight-cluster-data@10.30.171.87:2550, Up)].
2026-02-08T02:47:50,802 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:47:53,362 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [80] dead letters encountered, of which 69 were not logged. The counter will be reset now. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,363 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1144303271] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [1] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,363 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/action-registry/gossiper#-2126599309] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [2] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,363 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#2004727713] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [3] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,363 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-toaster-operational#928036672] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [4] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,364 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [5] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,364 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/clusterReceptionist/replicator#1000135650] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [6] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,364 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.cluster.raft.messages.AppendEntries] from Actor[pekko://opendaylight-cluster-data/user/shardmanager-operational/member-1-shard-inventory-operational#1144303271] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [7] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,364 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [8] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,364 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.opendaylight.controller.remote.rpc.registry.gossip.GossipStatus] from Actor[pekko://opendaylight-cluster-data/user/rpc/registry/gossiper#2004727713] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [9] dead letters encountered. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
2026-02-08T02:47:53,364 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RemoteActorRefProvider$RemoteDeadLetterActorRef | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Message [org.apache.pekko.cluster.ddata.Replicator$Internal$Status] from Actor[pekko://opendaylight-cluster-data/system/ddataReplicator#-821862124] to Actor[pekko://opendaylight-cluster-data/deadLetters] was not delivered. [10] dead letters encountered, no more dead letters will be logged in next [5.000 min]. If this is not an expected behavior then Actor[pekko://opendaylight-cluster-data/deadLetters] may have terminated unexpectedly. This logging can be turned off or adjusted with configuration settings 'pekko.log-dead-letters' and 'pekko.log-dead-letters-during-shutdown'.
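The dead-letter entries above are expected while the downed member is unreachable, and the log itself names the knobs for quieting them. A minimal sketch of those Pekko settings, assuming they go into the application's `pekko` configuration overlay (the values shown here are illustrative defaults, not taken from this log):

```hocon
pekko {
  # Log at most this many dead letters after each logging burst; "off" disables entirely
  log-dead-letters = 10
  # Dead letters during shutdown are routine; suppress them
  log-dead-letters-during-shutdown = off
}
```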
2026-02-08T02:47:53,459 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.87:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.87/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused
2026-02-08T02:47:55,459 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational (Follower): Term 8 in "RequestVote{term=8, candidateId=member-2-shard-topology-operational, lastLogIndex=36, lastLogTerm=7}" message is greater than follower's term 7 - updating term
2026-02-08T02:47:55,471 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.87:2550 is unreachable
2026-02-08T02:47:55,474 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@78aa9c07
2026-02-08T02:47:55,475 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done false
2026-02-08T02:47:55,475 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate): Starting new election term 8
2026-02-08T02:47:55,476 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Follower) :- Switching from behavior Follower to Candidate, election term: 8
2026-02-08T02:47:55,476 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-topology-operational status sync done true
2026-02-08T02:47:55,476 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@22ab9c0a
2026-02-08T02:47:55,477 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Follower to Candidate
2026-02-08T02:47:55,477 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Follower to Candidate
2026-02-08T02:47:55,483 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 1 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#798315389], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}
2026-02-08T02:47:55,484 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational#662716946], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#798315389], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}}
2026-02-08T02:47:55,484 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational#662716946], sessionId=5, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=1, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-topology-operational#798315389], sessionId=6, version=POTASSIUM, maxMessages=1000, cookie=1, shard=topology, dataTree=absent}} in 661.4 μs
2026-02-08T02:47:55,487 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config (Candidate) :- Switching from behavior Candidate to Leader, election term: 8
2026-02-08T02:47:55,487 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@771e360d
2026-02-08T02:47:55,487 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-toaster-config , received role change from Candidate to Leader
2026-02-08T02:47:55,487 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-toaster-config from Candidate to Leader
2026-02-08T02:47:55,495 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational (Follower): Term 8 in "RequestVote{term=8, candidateId=member-2-shard-default-operational, lastLogIndex=52, lastLogTerm=7}" message is greater than follower's term 7 - updating term
2026-02-08T02:47:55,503 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@327ce436
2026-02-08T02:47:55,504 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: All Shards are ready - data store operational is ready
2026-02-08T02:47:55,504 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done false
2026-02-08T02:47:55,505 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational Received follower initial sync status for member-1-shard-default-operational status sync done true
2026-02-08T02:47:55,507 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-1761065231], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2026-02-08T02:47:55,507 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational#-1538773787], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-1761065231], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:47:55,507 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-operational: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational#-1538773787], sessionId=4, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-operational, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-operational/member-2-shard-default-operational#-1761065231], sessionId=7, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 370.6 μs
2026-02-08T02:47:55,546 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config (Follower): Term 8 in "RequestVote{term=8, candidateId=member-2-shard-default-config, lastLogIndex=207, lastLogTerm=7}" message is greater than follower's term 7 - updating term
2026-02-08T02:47:55,553 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done false
2026-02-08T02:47:55,553 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@12595e59
2026-02-08T02:47:55,554 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-default-config status sync done true
2026-02-08T02:47:55,555 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolved shard 0 to ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#1619989626], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}
2026-02-08T02:47:55,555 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: resolving connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config#1899889342], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} to ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#1619989626], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}}
2026-02-08T02:47:55,556 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ClientActorBehavior | 182 - org.opendaylight.controller.cds-access-client - 12.0.3 | member-1-frontend-datastore-config: replaced connection ReconnectingClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config#1899889342], sessionId=2, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} with ConnectedClientConnection{client=ClientIdentifier{frontend=member-1-frontend-datastore-config, generation=1}, cookie=0, backend=ShardBackendInfo{actor=Actor[pekko://opendaylight-cluster-data@10.30.170.53:2550/user/shardmanager-config/member-2-shard-default-config#1619989626], sessionId=3, version=POTASSIUM, maxMessages=1000, cookie=0, shard=default, dataTree=absent}} in 349.8 μs
2026-02-08T02:47:55,651 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Follower | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Follower): Leader pekko://opendaylight-cluster-data@10.30.171.87:2550 is unreachable
2026-02-08T02:47:55,654 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Starting new election term 9
2026-02-08T02:47:55,654 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Follower) :- Switching from behavior Follower to Candidate, election term: 9
2026-02-08T02:47:55,654 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@786344b4
2026-02-08T02:47:55,654 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Follower to Candidate
2026-02-08T02:47:55,654 | INFO | opendaylight-cluster-data-shard-dispatcher-37 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from Follower to Candidate
2026-02-08T02:47:56,142 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3034-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.023967271 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3034-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.023967271 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3034-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.023967271 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:47:57,473 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR took decision DownUnreachable and is downing [pekko://opendaylight-cluster-data@10.30.171.87:2550], [1] unreachable of [3] members, all members in DC [Member(pekko://opendaylight-cluster-data@10.30.170.226:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.170.53:2550, Up), Member(pekko://opendaylight-cluster-data@10.30.171.87:2550, Up)], full reachability status: [pekko://opendaylight-cluster-data@10.30.170.226:2550 -> pekko://opendaylight-cluster-data@10.30.171.87:2550: Unreachable [Unreachable] (2), pekko://opendaylight-cluster-data@10.30.170.53:2550 -> pekko://opendaylight-cluster-data@10.30.171.87:2550: Unreachable [Unreachable] (1)]
2026-02-08T02:47:57,473 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR is downing [UniqueAddress(pekko://opendaylight-cluster-data@10.30.171.87:2550,2172312295193270143)]
2026-02-08T02:47:57,474 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Marking unreachable node [pekko://opendaylight-cluster-data@10.30.171.87:2550] as [Down]
2026-02-08T02:47:57,474 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | SplitBrainResolver | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | SBR found unreachable members changed during stable-after period. Resetting timer. Now 1 unreachable members found. Downing decision will not be made before 2026-02-08T02:48:04.474520779Z.
2026-02-08T02:47:57,603 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is removing unreachable node [pekko://opendaylight-cluster-data@10.30.171.87:2550] 2026-02-08T02:47:57,604 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550 2026-02-08T02:47:57,604 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberRemoved: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550 2026-02-08T02:47:57,606 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Association | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Association to [pekko://opendaylight-cluster-data@10.30.171.87:2550] with UID [2172312295193270143] is irrecoverably failed. UID is now quarantined and all messages to this UID will be delivered to dead letters. Remote ActorSystem must be restarted to recover from this situation. 
Reason: Cluster member removed, previous status [Down] 2026-02-08T02:48:01,049 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | Materializer | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | [outbound connection to [pekko://opendaylight-cluster-data@10.30.171.87:2550], message stream] Upstream failed, cause: StreamTcpException: Tcp command [Connect(10.30.171.87/:2550,None,List(),Some(5000 milliseconds),true)] failed because of java.net.ConnectException: Connection refused 2026-02-08T02:48:01,683 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Follower Node2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Restart Follower Node2 2026-02-08T02:48:01,902 | INFO | pipe-log:log "ROBOT MESSAGE: Starting ODL3 10.30.171.87" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting ODL3 10.30.171.87 2026-02-08T02:48:05,707 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | Candidate | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate): Term 10 in "RequestVote{term=10, candidateId=member-2-shard-topology-config, lastLogIndex=-1, lastLogTerm=-1}" message is greater than Candidate's term 9 - switching to Follower 2026-02-08T02:48:05,713 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | RaftActorBehavior | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config (Candidate) :- Switching from behavior Candidate to Follower, election term: 10 2026-02-08T02:48:05,713 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | RoleChangeNotifier | 193 - org.opendaylight.controller.sal-clustering-commons - 12.0.3 | RoleChangeNotifier for member-1-shard-topology-config , received role change from Candidate 
to Follower 2026-02-08T02:48:05,713 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received role changed for member-1-shard-topology-config from Candidate to Follower 2026-02-08T02:48:05,714 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done false 2026-02-08T02:48:05,715 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received LeaderStateChanged message: org.opendaylight.controller.cluster.datastore.messages.ShardLeaderStateChanged@55c915b3 2026-02-08T02:48:06,233 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config Received follower initial sync status for member-1-shard-topology-config status sync done true 2026-02-08T02:48:08,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Received InitJoin message from [Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1363231449]] to [pekko://opendaylight-cluster-data@10.30.170.226:2550] 2026-02-08T02:48:08,489 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Sending InitJoinAck message from node [pekko://opendaylight-cluster-data@10.30.170.226:2550] to 
[Actor[pekko://opendaylight-cluster-data@10.30.171.87:2550/system/cluster/core/daemon/joinSeedNodeProcess-1#1363231449]] (version [1.2.1]) 2026-02-08T02:48:08,538 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Node [pekko://opendaylight-cluster-data@10.30.171.87:2550] is JOINING, roles [member-3, dc-default], version [0.0.0] 2026-02-08T02:48:08,822 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | Cluster | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | Cluster Node [pekko://opendaylight-cluster-data@10.30.170.226:2550] - Leader is moving node [pekko://opendaylight-cluster-data@10.30.171.87:2550] to [Up] 2026-02-08T02:48:08,823 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550 2026-02-08T02:48:08,823 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-default-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational 2026-02-08T02:48:08,823 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-topology-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - 
org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-inventory-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-operational: Peer address for peer member-3-shard-topology-operational set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-topology-operational 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-toaster-operational with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-36 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-operational: Peer address for peer member-3-shard-default-operational set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-default-operational 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Peer address for peer member-3-shard-inventory-operational set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-inventory-operational 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational: Peer address for peer member-3-shard-toaster-operational set to 
pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-operational/member-3-shard-toaster-operational 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-operational: All Shards are ready - data store operational is ready 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardManager | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | shard-manager-config: Received MemberUp: memberName: MemberName{name=member-3}, address: pekko://opendaylight-cluster-data@10.30.171.87:2550 2026-02-08T02:48:08,824 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-default-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config 2026-02-08T02:48:08,825 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-topology-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-topology-config 2026-02-08T02:48:08,825 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-default-config: Peer address for peer member-3-shard-default-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-default-config 2026-02-08T02:48:08,825 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-inventory-config with address 
pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config 2026-02-08T02:48:08,825 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-topology-config: Peer address for peer member-3-shard-topology-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-topology-config 2026-02-08T02:48:08,825 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | ShardInformation | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | updatePeerAddress for peer member-3-shard-toaster-config with address pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-toaster-config 2026-02-08T02:48:08,825 | INFO | opendaylight-cluster-data-shard-dispatcher-39 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Peer address for peer member-3-shard-inventory-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-inventory-config 2026-02-08T02:48:08,825 | INFO | opendaylight-cluster-data-shard-dispatcher-33 | PeerInfos | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-config: Peer address for peer member-3-shard-toaster-config set to pekko://opendaylight-cluster-data@10.30.171.87:2550/user/shardmanager-config/member-3-shard-toaster-config 2026-02-08T02:48:11,832 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:48:12,351 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2026-02-08T02:48:12,351 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false] 2026-02-08T02:48:12,426 | WARN | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=7, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 26527, lastApplied : 605, commitIndex : 605 2026-02-08T02:48:12,427 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | 
member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=7, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 602, snapshotTerm: 7, replicatedToAllIndex: 602 2026-02-08T02:48:12,427 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): follower member-3-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2026-02-08T02:48:12,427 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): Initiating install snapshot to follower member-3-shard-inventory-operational: follower nextIndex: 0, leader snapshotIndex: 602, leader lastIndex: 605, leader log size: 3 2026-02-08T02:48:12,427 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=605, lastAppliedTerm=7, lastIndex=605, lastTerm=7, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=0, mandatoryTrim=false] to install on member-3-shard-inventory-operational 2026-02-08T02:48:12,429 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Persising snapshot at EntryInfo[index=605, term=7]/EntryInfo[index=605, term=7] 2026-02-08T02:48:12,429 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - 
org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: Removed in-memory snapshotted entries, adjusted snapshotIndex: 602 and term: 7 2026-02-08T02:48:12,435 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational: snapshot is durable as of 2026-02-08T02:48:12.429880641Z 2026-02-08T02:48:12,473 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): handleAppendEntriesReply - received unsuccessful reply: AppendEntriesReply{term=7, success=false, followerId=member-3-shard-inventory-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, leader snapshotIndex: 602, snapshotTerm: 7, replicatedToAllIndex: 602 2026-02-08T02:48:12,474 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): follower member-3-shard-inventory-operational appears to be behind the leader from the last snapshot - updated: matchIndex: -1, nextIndex: 0 2026-02-08T02:48:12,553 | WARN | opendaylight-cluster-data-shard-dispatcher-32 | AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-toaster-operational (Leader) : handleAppendEntriesReply delayed beyond election timeout, appendEntriesReply : AppendEntriesReply{term=7, success=true, followerId=member-3-shard-toaster-operational, logLastIndex=-1, logLastTerm=-1, forceInstallSnapshot=false, needsLeaderAddress=false, payloadVersion=13, raftVersion=5, recipientRaftVersion=5}, timeSinceLastActivity : 27110, lastApplied : -1, commitIndex : -1 2026-02-08T02:48:12,635 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | 
AbstractLeader | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-operational (Leader): Snapshot successfully installed on follower member-3-shard-inventory-operational (last chunk 1) - matchIndex set to 605, nextIndex set to 606 2026-02-08T02:48:12,856 | INFO | node-cleaner-0 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS 2026-02-08T02:48:14,429 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | Shard | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | member-1-shard-inventory-operational: retiring state Enabled{clientId=ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=1}, nanosAgo=28519178504, purgedHistories=MutableUnsignedLongSet{size=0}}, outdated by request from client ClientIdentifier{frontend=member-3-frontend-datastore-operational, generation=2} 2026-02-08T02:48:25,257 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Follower Node2 Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Data Recovery After Follower Node2 Restart 2026-02-08T02:48:32,871 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. 
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] 
at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:48:53,911 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] 
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:49:03,622 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3035-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.028268464 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3035-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.028268464 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3035-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.5}]} timed out after 120.028268464 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:49:14,951 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:49:21,262 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3036-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.017528089 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3036-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.017528089 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3036-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.017528089 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:49:35,992 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:49:38,752 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3037-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.026513994 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3037-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.026513994 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3037-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.026513994 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:49:56,162 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3038-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.01565147 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3038-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.01565147 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3038-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.01565147 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:49:57,021 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:50:10,410 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=199847, lastAppliedTerm=6, lastIndex=200590, lastTerm=6, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=743, mandatoryTrim=false]
2026-02-08T02:50:10,412 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persising snapshot at EntryInfo[index=199847, term=6]/EntryInfo[index=200590, term=6]
2026-02-08T02:50:10,413 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 199846 and term: 6
2026-02-08T02:50:10,418 | INFO | opendaylight-cluster-data-shard-dispatcher-34 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T02:50:10.413045258Z
2026-02-08T02:50:18,061 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:50:39,102 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] 
at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:51:00,141 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:51:03,652 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}], severity=ERROR, errorType=APPLICATION, 
tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3039-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.02643722 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3039-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.02643722 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3039-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.6}]} timed out after 120.02643722 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:51:21,181 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T02:51:21,282 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3040-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, 
path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.016651238 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3040-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.016651238 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3040-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.016651238 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more 2026-02-08T02:51:38,782 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3041-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.026736631 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3041-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.026736631 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3041-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.026736631 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:51:42,221 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:51:56,191 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3042-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.02562041 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3042-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.02562041 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3042-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.02562041 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:52:03,261 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:52:24,301 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:52:45,332 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:53:03,672 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3043-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.01553627 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3043-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.01553627 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3043-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.7}]} timed out after 120.01553627 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:53:06,372 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:53:21,303 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3044-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.016402009 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3044-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.016402009 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3044-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.016402009 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:53:27,411 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more
2026-02-08T02:53:38,792 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3045-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.0067987 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3045-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.0067987 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3045-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.0067987 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:53:48,452 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ...
5 more 2026-02-08T02:53:56,223 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3046-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027696957 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3046-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027696957 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3046-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027696957 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:54:09,491 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:54:30,521 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:54:51,561 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:55:03,693 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3047-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.017567082 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3047-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.017567082 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3047-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.8}]} timed out after 120.017567082 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:55:12,601 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:55:14,829 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Start Mininet Again Connect To Follower Node2 2026-02-08T02:55:18,213 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Follower Node2 Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify Flows In Switch After Follower Node2 Restart 2026-02-08T02:55:18,701 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2026-02-08T02:55:18,862 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_CHANGED [wasOwner=false, isOwner=false, hasOwner=true] 2026-02-08T02:55:19,346 | INFO | opendaylight-cluster-data-notification-dispatcher-124 | ConnectionManagerImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Clearing the device connection timer for the device 1 2026-02-08T02:55:21,332 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path 
/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3048-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.026673527 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] 
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3048-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.026673527 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3048-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.026673527 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:55:33,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:55:38,824 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3049-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027269537 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3049-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027269537 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3049-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027269537 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:55:54,681 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:55:56,232 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3050-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.005666366 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3050-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.005666366 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3050-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.005666366 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:56:15,721 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:56:36,762 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:56:57,801 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T02:56:58,755 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Stop Mininet Connected To Follower Node2
2026-02-08T02:56:59,291 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Entity ownership change received for node : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2026-02-08T02:56:59,291 | INFO | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | DeviceOwnershipServiceImpl | 299 - org.opendaylight.openflowplugin.applications.device-ownership-service - 0.21.2 | Entity ownership change received for node : openflow:1 : REMOTE_OWNERSHIP_LOST_NO_OWNER [wasOwner=false, isOwner=false, hasOwner=false]
2026-02-08T02:56:59,797 | INFO | node-cleaner-2 | ContextChainHolderImpl | 310 - org.opendaylight.openflowplugin.impl - 0.21.2 | Try to remove device openflow:1 from operational DS
2026-02-08T02:57:01,566 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node 2" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Delete All Flows From Follower Node 2
2026-02-08T02:57:02,100 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Follower Node2 Restart" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster HA Data Recovery BulkFlow Single Switch.Verify No Flows In Cluster After Follower Node2 Restart
2026-02-08T02:57:03,722 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3051-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.024712457 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3051-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.024712457 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3051-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.9}]} timed out after 120.024712457 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:57:18,841 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T02:57:21,362 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3052-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027875261 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3052-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027875261 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3052-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.027875261 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more 2026-02-08T02:57:38,853 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3053-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.024369078 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3053-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.024369078 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3053-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.024369078 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:57:39,881 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:57:56,261 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3054-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.026075518 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3054-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.026075518 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3054-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.026075518 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:58:00,921 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:58:21,951 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:58:42,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T02:58:45,479 | INFO | sshd-SshServer[4d81e27f](port=8101)-nio2-thread-2 | ServerSessionImpl | 126 - org.apache.sshd.osgi - 2.15.0 | Session karaf@/10.30.170.170:46200 authenticated
2026-02-08T02:58:46,286 | INFO | pipe-log:log "ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/040__Cluster_Current_Term_Verification_3Node_Cluster.robot" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting suite /w/workspace/openflowplugin-csit-3node-clustering-bulkomatic-only-vanadium/test/csit/suites/openflowplugin/Clustering_Bulkomatic/040__Cluster_Current_Term_Verification_3Node_Cluster.robot
2026-02-08T02:58:46,687 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Check Shard And Get Inventory" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Check Shard And Get Inventory
2026-02-08T02:58:51,677 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Initial Current Term Verification" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Initial Current Term Verification
2026-02-08T02:58:52,105 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Add Bulk Flow From Follower" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Add Bulk Flow From Follower
2026-02-08T02:58:52,389 | INFO | qtp1455505446-541 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl
2026-02-08T02:58:52,390 | INFO | qtp1455505446-541 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl
2026-02-08T02:59:03,752 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3055-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.026035906 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	...
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3055-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.026035906 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3055-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.10}]} timed out after 120.026035906 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T02:59:04,032 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more
2026-02-08T02:59:21,391 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3056-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026495397 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3056-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026495397 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3056-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026495397 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T02:59:25,072 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ...
5 more 2026-02-08T02:59:38,872 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3057-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.016414864 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	...
5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3057-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.016414864 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3057-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.016414864 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T02:59:46,112 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	...
5 more
2026-02-08T02:59:56,281 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3058-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.016990739 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	...
5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3058-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.016990739 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3058-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.016990739 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:00:07,151 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	...
5 more
2026-02-08T03:00:28,191 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:00:49,232 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:01:03,782 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3059-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026400003 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3059-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026400003 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3059-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.11}]} timed out after 120.026400003 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:01:10,271 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:01:21,413 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3060-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.017758737 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3060-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.017758737 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3060-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.017758737 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:01:31,311 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:01:38,902 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3061-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.027535362 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3061-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.027535362 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3061-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.027535362 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:01:52,342 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:01:56,302 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3062-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.01811453 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3062-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.01811453 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3062-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.01811453 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:02:13,381 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:02:34,421 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:02:55,461 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:03:03,801 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3063-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.01682428 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3063-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.01682428 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3063-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.12}]} timed out after 120.01682428 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:03:16,492 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:03:21,441 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3064-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.026010371 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3064-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.026010371 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3064-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.026010371 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:03:37,531 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:03:38,932 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3065-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.027370133 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3065-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.027370133 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3065-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.027370133 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:03:56,332 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3066-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.027328421 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3066-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.027328421 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3066-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.027328421 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:03:58,572 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:04:19,611 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:04:40,651 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:05:01,691 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:05:03,822 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}], severity=ERROR, errorType=APPLICATION, 
tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3067-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.017840758 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3067-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.017840758 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3067-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.13}]} timed out after 120.017840758 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more 2026-02-08T03:05:21,463 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3068-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.01765631 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3068-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.01765631 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3068-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.01765631 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:05:22,731 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:05:33,543 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Get Bulk Flows And Verify In Cluster" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Get Bulk Flows And Verify In Cluster 2026-02-08T03:05:37,916 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=219571, lastAppliedTerm=6, lastIndex=220571, lastTerm=6, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=1000, mandatoryTrim=false] 2026-02-08T03:05:37,919 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persisting snapshot at EntryInfo[index=219571, term=6]/EntryInfo[index=220571, term=6] 2026-02-08T03:05:37,920 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 218953 and term: 6 2026-02-08T03:05:37,934 | INFO | opendaylight-cluster-data-shard-dispatcher-32 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T03:05:37.920390282Z 2026-02-08T03:05:38,961 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path
/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3069-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.026222833 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] 
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3069-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.026222833 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3069-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.026222833 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:05:43,773 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] 
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:05:46,736 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Adding Bulk Flow" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Adding Bulk Flow 2026-02-08T03:05:47,233 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before And After Addition Of Flow" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before And After Addition Of Flow 2026-02-08T03:05:47,662 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete and Add ten percent of the flows for 5 iterations" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete and Add ten percent of the flows for 5 iterations 2026-02-08T03:05:48,043 | INFO | qtp1455505446-541 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl 2026-02-08T03:05:48,044 | INFO | qtp1455505446-541 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl 2026-02-08T03:05:56,362 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path 
/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3070-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.026549397 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] 
at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3070-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.026549397 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] 
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3070-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.026549397 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:06:04,811 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] 
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:06:25,852 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:06:46,891 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:07:03,842 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3071-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.016531355 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3071-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.016531355 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3071-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.14}]} timed out after 120.016531355 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:07:07,931 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:07:21,491 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3072-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.024339782 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3072-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.024339782 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3072-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.024339782 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:07:28,972 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:07:38,982 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3073-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.016507349 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3073-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.016507349 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3073-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.016507349 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:07:50,011 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:07:56,393 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3074-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.027253562 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3074-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.027253562 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3074-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.027253562 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:08:11,052 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:08:32,091 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:08:53,131 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:09:03,871 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3075-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.026171568 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3075-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.026171568 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3075-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.15}]} timed out after 120.026171568 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:09:14,172 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:09:21,522 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3076-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.028032073 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3076-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.028032073 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3076-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.028032073 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:09:35,212 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:09:38,994 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3077-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.007222202 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3077-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.007222202 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3077-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.007222202 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:09:56,252 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:09:56,412 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3078-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.016584333 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3078-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.016584333 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3078-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.016584333 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:10:17,291 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:10:38,332 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:10:59,372 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:11:03,892 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3079-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.017529658 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3079-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.017529658 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3079-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.16}]} timed out after 120.017529658 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:11:20,412 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:11:21,145 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | PhiAccrualFailureDetector | 189 - org.opendaylight.controller.repackaged-pekko - 12.0.3 | heartbeat interval is growing too large for address pekko://opendaylight-cluster-data@10.30.170.53:2550: 2021 millis
2026-02-08T03:11:21,552 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3080-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.026279387 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3080-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.026279387 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3080-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.026279387 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:11:39,022 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3081-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.022675674 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3081-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.022675674 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3081-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.022675674 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:11:41,451 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:11:56,432 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3082-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.016315228 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3082-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.016315228 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3082-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.016315228 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:12:02,491 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:12:23,531 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:12:28,580 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Continuous Deletion and Addition Of Flows for 5 iterations" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Verification After Continuous Deletion and Addition Of Flows for 5 iterations 2026-02-08T03:12:29,183 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before and After Continuous Deletion and Addition Of Flows for 5 iterations" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Current Term Comparison Before and After Continuous Deletion and Addition Of Flows for 5 
iterations 2026-02-08T03:12:29,680 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete All Flows From Follower Node" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Delete All Flows From Follower Node 2026-02-08T03:12:29,996 | INFO | qtp1455505446-360 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Ping Pong Flow Tester Impl 2026-02-08T03:12:29,996 | INFO | qtp1455505446-360 | FlowWriterTxChain | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Using Transaction Chain Flow Writer Impl 2026-02-08T03:12:44,573 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] 
at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:13:03,903 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3083-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.007668114 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3083-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.007668114 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3083-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.17}]} timed out after 120.007668114 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:13:05,611 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:13:21,573 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3084-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.017437666 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3084-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.017437666 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3084-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.017437666 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:13:26,651 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:13:39,052 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3085-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.026232191 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3085-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.026232191 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3085-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.026232191 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:13:47,692 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:13:56,462 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3086-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.026752826 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3086-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.026752826 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3086-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.026752826 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:14:08,732 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:14:29,772 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:14:50,812 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	...
5 more
2026-02-08T03:15:03,931 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3087-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.025107826 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	...
5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3087-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.025107826 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3087-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.18}]} timed out after 120.025107826 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:15:11,852 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	...
5 more
2026-02-08T03:15:21,601 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3088-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.025521838 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	...
5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3088-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.025521838 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3088-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.025521838 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:15:32,891 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:15:39,072 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3089-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.016428604 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3089-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.016428604 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3089-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.016428604 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:15:53,931 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:15:56,491 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3090-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.024781567 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3090-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.024781567 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3090-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.024781567 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:16:14,971 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:16:36,012 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:16:57,051 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:17:03,962 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3091-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.027370405 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3091-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.027370405 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3091-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.19}]} timed out after 120.027370405 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:17:18,082 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:17:21,622 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3092-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.017377305 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3092-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.017377305 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3092-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.017377305 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more 2026-02-08T03:17:39,102 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3093-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.027015676 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3093-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.027015676 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3093-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.027015676 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:17:39,122 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:17:56,512 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3094-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.017737161 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3094-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.017737161 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3094-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.017737161 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:18:00,162 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:18:21,201 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-5 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:18:42,242 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:19:03,281 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:19:03,992 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3095-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.027246248 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3095-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.027246248 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3095-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.20}]} timed out after 120.027246248 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:19:10,774 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Verify No Flows In Cluster After Flow Deletion" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.Verify No Flows In Cluster After Flow Deletion
2026-02-08T03:19:15,136 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Initiating snapshot capture CaptureSnapshot [lastAppliedIndex=239440, lastAppliedTerm=6, lastIndex=240425, lastTerm=6, installSnapshotInitiated=, replicatedToAllIndex=-1, replicatedToAllTerm=-1, unAppliedEntries size=985, mandatoryTrim=false]
2026-02-08T03:19:15,138 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Persisting snapshot at EntryInfo[index=239440, term=6]/EntryInfo[index=240425, term=6]
2026-02-08T03:19:15,139 | INFO | opendaylight-cluster-data-shard-dispatcher-38 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: Removed in-memory snapshotted entries, adjusted snapshotIndex: 239439 and term: 6
2026-02-08T03:19:15,149 | INFO | opendaylight-cluster-data-shard-dispatcher-35 | SnapshotManager | 190 - org.opendaylight.controller.sal-akka-raft - 12.0.3 | member-1-shard-inventory-config: snapshot is durable as of 2026-02-08T03:19:15.139147522Z
2026-02-08T03:19:21,652 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3096-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.026164858 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3096-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.026164858 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3096-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.026164858 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:19:24,321 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] 
at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] 
at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:19:27,158 | INFO | pipe-log:log "ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.PreLeader Verification" | core | 113 - org.apache.karaf.log.core - 4.4.8 | ROBOT MESSAGE: Starting test openflowplugin-clustering-bulkomatic.txt.Cluster Current Term Verification 3Node Cluster.PreLeader Verification 2026-02-08T03:19:27,548 | INFO | qtp1455505446-541 | TableWriter | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Starting to add tables: 0 to 9 on each of 2 2026-02-08T03:19:39,133 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3097-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, 
path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.026529432 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3097-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.026529432 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3097-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.026529432 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:19:45,362 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] 
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:19:56,532 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3098-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, 
path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.01688551 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3098-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.01688551 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3098-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.01688551 seconds.
The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:20:06,401 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:20:27,441 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ...
5 more
2026-02-08T03:20:48,482 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:21:04,021 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}], severity=ERROR, errorType=APPLICATION,
tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3099-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.027212727 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3099-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.027212727 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3099-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.21}]} timed out after 120.027212727 seconds.
The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:21:09,521 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:21:21,672 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3100-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true,
path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.016476784 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3100-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.016476784 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3100-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.016476784 seconds.
The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:21:30,561 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:21:39,161 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3101-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true,
path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.024795205 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3101-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.024795205 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3101-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.024795205 seconds.
The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:21:51,602 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:21:56,561 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3102-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, 
path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.026854247 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] 
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] 
at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3102-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.026854247 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3102-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.026854247 seconds. 
The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:22:12,641 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-31 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] 
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:22:33,681 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:22:54,721 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
	at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
	at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
	at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
	at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
	at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
	at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
	at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
	at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
	at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
	... 5 more
2026-02-08T03:23:04,051 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3103-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.0277477 seconds. The backend for inventory is not available.]]}
	at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
	at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
	at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
	at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
	at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
	at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
	at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
	at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3103-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.0277477 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
	at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3103-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.22}]} timed out after 120.0277477 seconds. The backend for inventory is not available.
	at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
	at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
	at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
	at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
	at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
	at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
	at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
	at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
	at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
	at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
	at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
	... 5 more
2026-02-08T03:23:15,761 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-14 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
	at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:23:21,702 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3104-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.027891174 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3104-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.027891174 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3104-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.027891174 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:23:36,791 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:23:39,182 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3105-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.017848086 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3105-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.017848086 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3105-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.017848086 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:23:56,582 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3106-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}]} timed out after 120.018011684 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3106-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}]} timed out after 120.018011684 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3106-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}]} timed out after 120.018011684 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:23:57,831 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:24:18,871 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-13 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more 2026-02-08T03:24:39,912 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:25:00,951 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-30 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:25:04,072 | ERROR | ForkJoinPool-9-worker-4 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3107-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.018096249 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3107-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.018096249 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3107-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.23}]} timed out after 120.018096249 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:25:21,732 | ERROR | ForkJoinPool-9-worker-3 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3108-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.027517136 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3108-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.027517136 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3108-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.24}]} timed out after 120.027517136 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
2026-02-08T03:25:21,991 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-40 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard
java.util.concurrent.TimeoutException: Shard has no current leader
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?]
    at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?]
    at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?]
    at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?]
    at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3]
    at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?]
    at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?]
    at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later.
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?]
    at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?]
    at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3]
    at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?]
    ... 5 more
2026-02-08T03:25:39,211 | ERROR | ForkJoinPool-9-worker-2 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error
java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3109-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}]} timed out after 120.027152089 seconds. The backend for inventory is not available.]]}
    at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?]
    at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?]
    at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?]
    at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?]
    at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?]
    at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?]
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?]
    at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?]
    at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?]
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?]
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?]
Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
    at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?]
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?]
    at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?]
    at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?]
    at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?]
    at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?]
    at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?]
    ... 5 more
Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3109-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}]} timed out after 120.027152089 seconds. The backend for inventory is not available.
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?]
    at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?]
    at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?]
    at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?]
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3109-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=0}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.0.25}]} timed out after 120.027152089 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:25:43,021 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:25:56,612 | ERROR | ForkJoinPool-9-worker-1 | FlowReader | 298 - org.opendaylight.openflowplugin.applications.bulk-o-matic - 0.21.2 | Error java.util.concurrent.ExecutionException: ReadFailedException{message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.26}], errorList=[RpcError [message=Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.26}], severity=ERROR, errorType=APPLICATION, tag=operation-failed, applicationTag=null, info=null, cause=org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3110-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.26}]} timed out after 120.027096217 seconds. The backend for inventory is not available.]]} at com.google.common.util.concurrent.AbstractFuture.getDoneValue(AbstractFuture.java:292) ~[bundleFile:?] 
at com.google.common.util.concurrent.AbstractFutureState.blockingGet(AbstractFutureState.java:255) ~[bundleFile:?] at com.google.common.util.concurrent.Platform.get(Platform.java:54) ~[bundleFile:?] at com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:253) ~[bundleFile:?] at com.google.common.util.concurrent.FluentFuture$TrustedFuture.get(FluentFuture.java:91) ~[bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.readFlowsX(FlowReader.java:89) [bundleFile:?] at org.opendaylight.openflowplugin.applications.bulk.o.matic.FlowReader.run(FlowReader.java:70) [bundleFile:?] at java.util.concurrent.ForkJoinTask$RunnableExecuteAction.exec(ForkJoinTask.java:1423) [?:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.mdsal.common.api.ReadFailedException: Error reading data for path /(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.26}] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] 
at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 
5 more Caused by: org.opendaylight.mdsal.common.api.DataStoreUnavailableException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3110-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.26}]} timed out after 120.027096217 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.recordFailedResponse(RemoteProxyTransaction.java:206) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.failReadFuture(RemoteProxyTransaction.java:221) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.completeRead(RemoteProxyTransaction.java:244) ~[?:?] at org.opendaylight.controller.cluster.databroker.actors.dds.RemoteProxyTransaction.lambda$doRead$1(RemoteProxyTransaction.java:138) ~[?:?] at org.opendaylight.controller.cluster.access.client.ConnectionEntry.complete(ConnectionEntry.java:58) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:443) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] 
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more Caused by: org.opendaylight.controller.cluster.access.client.RequestTimeoutException: ReadTransactionRequest{target=member-1-datastore-config-fe-1-txn-3110-1, sequence=0, replyTo=Actor[pekko://opendaylight-cluster-data/user/$a#1780294568], snapshotOnly=true, path=/(urn:opendaylight:inventory?revision=2013-08-19)nodes/node/node[{(urn:opendaylight:inventory?revision=2013-08-19)id=openflow:1}]/(urn:opendaylight:flow:inventory?revision=2013-08-19)table/table[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=1}]/flow/flow[{(urn:opendaylight:flow:inventory?revision=2013-08-19)id=Flow-openflow:1.1.26}]} timed out after 120.027096217 seconds. The backend for inventory is not available. at org.opendaylight.controller.cluster.access.client.AbstractClientConnection.lambda$timeoutEntry$0(AbstractClientConnection.java:444) ~[?:?] at org.opendaylight.controller.cluster.access.client.ClientActorBehavior.onReceiveCommand(ClientActorBehavior.java:195) ~[?:?] at org.opendaylight.controller.cluster.access.client.AbstractClientActor.onReceiveCommand(AbstractClientActor.java:134) ~[?:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[?:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[?:?] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[?:?] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[?:?] at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:269) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[?:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[?:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[?:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[?:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[?:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[?:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[?:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[?:?] ... 5 more 2026-02-08T03:26:04,061 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-41 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] 
at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] 
at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 
5 more 2026-02-08T03:26:25,101 | WARN | opendaylight-cluster-data-pekko.actor.default-dispatcher-42 | AbstractShardBackendResolver | 195 - org.opendaylight.controller.sal-distributed-datastore - 12.0.3 | Failed to resolve shard java.util.concurrent.TimeoutException: Shard has no current leader at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver.wrap(AbstractShardBackendResolver.java:167) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:145) ~[bundleFile:?] at org.opendaylight.controller.cluster.databroker.actors.dds.AbstractShardBackendResolver$1.onComplete(AbstractShardBackendResolver.java:138) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:337) ~[bundleFile:?] at org.apache.pekko.dispatch.OnComplete.internal(Future.scala:336) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:259) ~[bundleFile:?] at org.apache.pekko.dispatch.japi$CallbackBridge.apply(Future.scala:258) ~[bundleFile:?] at scala.concurrent.impl.Promise$Transformation.run(Promise.scala:484) ~[bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$AbstractBatch.processBatch(BatchingExecutor.scala:72) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$1(BatchingExecutor.scala:109) ~[bundleFile:?] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run$$anonfun$adapted$1(BatchingExecutor.scala:118) ~[bundleFile:?] at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:94) [bundleFile:12.0.3] at org.apache.pekko.dispatch.BatchingExecutor$BlockableBatch.run(BatchingExecutor.scala:118) [bundleFile:?] at org.apache.pekko.dispatch.TaskInvocation.run(AbstractDispatcher.scala:59) [bundleFile:?] at org.apache.pekko.dispatch.ForkJoinExecutorConfigurator$PekkoForkJoinTask.exec(ForkJoinExecutorConfigurator.scala:62) [bundleFile:?] 
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:387) [?:?] at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1312) [?:?] at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1843) [?:?] at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1808) [?:?] at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:188) [?:?] Caused by: org.opendaylight.controller.cluster.datastore.exceptions.NoShardLeaderException: Shard member-1-shard-inventory-config currently has no leader. Try again later. at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.onShardNotInitializedTimeout(ShardManager.java:629) ~[bundleFile:?] at org.opendaylight.controller.cluster.datastore.shardmanager.ShardManager.handleReceive(ShardManager.java:253) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at org.apache.pekko.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:37) ~[bundleFile:?] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at org.apache.pekko.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:33) ~[bundleFile:?] at scala.PartialFunction$OrElse.apply(PartialFunction.scala:266) ~[bundleFile:12.0.3] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:88) ~[bundleFile:?] at org.opendaylight.controller.cluster.common.actor.MeteringBehavior.apply(MeteringBehavior.java:27) ~[bundleFile:?] 
at scala.PartialFunction.applyOrElse(PartialFunction.scala:214) ~[bundleFile:12.0.3] at scala.PartialFunction.applyOrElse$(PartialFunction.scala:213) ~[bundleFile:12.0.3] at scala.runtime.AbstractPartialFunction.applyOrElse(AbstractPartialFunction.scala:27) ~[bundleFile:12.0.3] at org.apache.pekko.actor.Actor.aroundReceive(Actor.scala:547) ~[bundleFile:?] at org.apache.pekko.actor.Actor.aroundReceive$(Actor.scala:481) ~[bundleFile:?] at org.apache.pekko.actor.AbstractActor.aroundReceive(AbstractActor.scala:229) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.receiveMessage(ActorCell.scala:590) ~[bundleFile:?] at org.apache.pekko.actor.ActorCell.invoke(ActorCell.scala:557) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.processMailbox(Mailbox.scala:273) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.run(Mailbox.scala:234) ~[bundleFile:?] at org.apache.pekko.dispatch.Mailbox.exec(Mailbox.scala:246) ~[bundleFile:?] ... 5 more