Before I say anything: I think this community is great and I'm learning so much from this forum, but I may have bitten off more than I can chew this time.
So I have a 3-node Incus (v6.18) cluster with a separate 3-node LINSTOR cluster that I set up for a class at my university, and I just experienced a total power outage. Since the power came back I've been having huge issues getting my containers back up from their LINBIT/DRBD-backed storage.
I have attached the error reports at the bottom as well.
I keep running into an issue where it either can't mount the DRBD device or the mount is stuck read-only. I have reconnected all my nodes from the LINSTOR controller and they all connect fine and show online, but I'm afraid I can't recover from this, and there's important data on there that I was still waiting to back up. Can anyone shed some light on this? My professor is going to kill me if I can't get this back up; I'm kind of freaking out.
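For reference, this is roughly what I ran on the controller to reconnect and check the nodes (not 100% sure it was the right approach or the right commands):

linstor node list
linstor node reconnect gigabyte r620 supermicro r730xd-2 r730xd-3
linstor resource list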
If I go into the GUI I can only see the disks from the storage pool I set up for Incus on one of the nodes; the others look empty. This may be from the on-no-quorum io-error behavior that I read about earlier.
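If it helps, I can also post what DRBD actually has for the quorum options on a resource. I believe something like this shows them (using one of the resources from the dmesg output below as an example):

drbdsetup show incus-volume-353e79f3b8e64416bf18b235dbf92cfd | grep quorum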
I looked at dmesg and I see the following error for most of the mounts:
[ 5564.527381] drbd incus-volume-353e79f3b8e64416bf18b235dbf92cfd: no UpToDate peer with quorum
[ 5564.527384] drbd incus-volume-353e79f3b8e64416bf18b235dbf92cfd: Auto-promote failed: No quorum (-25)
[ 5564.527740] /dev/drbd1039: Can't open blockdev
And here is the output of linstor resource list:
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
┊ ResourceName ┊ Node ┊ Layers ┊ Usage ┊ Conns ┊ State ┊ CreatedOn ┊
╞═══════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════════╡
┊ incus-volume-0afe72c1ffbf47e882011fe28af797d9 ┊ gigabyte ┊ DRBD,STORAGE ┊ InUse ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-09-28 14:06:51 ┊
┊ incus-volume-0afe72c1ffbf47e882011fe28af797d9 ┊ r620 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-11-10 16:54:59 ┊
┊ incus-volume-0afe72c1ffbf47e882011fe28af797d9 ┊ r730xd-1 ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3,r620) ┊ UpToDate ┊ 2025-09-28 14:03:13 ┊
┊ incus-volume-0afe72c1ffbf47e882011fe28af797d9 ┊ r730xd-2 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-28 14:03:13 ┊
┊ incus-volume-0afe72c1ffbf47e882011fe28af797d9 ┊ r730xd-3 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-28 14:03:13 ┊
┊ incus-volume-0afe72c1ffbf47e882011fe28af797d9 ┊ supermicro ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-11-10 16:57:31 ┊
┊ incus-volume-0cb1c42e43534509a8b59c96b2d44b49 ┊ gigabyte ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-11-10 17:38:48 ┊
┊ incus-volume-0cb1c42e43534509a8b59c96b2d44b49 ┊ r620 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-11-10 17:38:39 ┊
┊ incus-volume-0cb1c42e43534509a8b59c96b2d44b49 ┊ r730xd-1 ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3,r620) ┊ UpToDate ┊ 2025-09-28 14:01:43 ┊
┊ incus-volume-0cb1c42e43534509a8b59c96b2d44b49 ┊ r730xd-2 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-28 14:01:43 ┊
┊ incus-volume-0cb1c42e43534509a8b59c96b2d44b49 ┊ r730xd-3 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-28 14:01:43 ┊
┊ incus-volume-0cb1c42e43534509a8b59c96b2d44b49 ┊ supermicro ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-10-01 20:05:03 ┊
┊ incus-volume-1cd2c1abd5db4fadbd24d3d7ede8c2ce ┊ gigabyte ┊ DRBD,STORAGE ┊ InUse ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-10-01 20:12:30 ┊
┊ incus-volume-1cd2c1abd5db4fadbd24d3d7ede8c2ce ┊ r620 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-11-10 16:54:59 ┊
┊ incus-volume-1cd2c1abd5db4fadbd24d3d7ede8c2ce ┊ r730xd-1 ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3,r620) ┊ UpToDate ┊ 2025-09-29 16:18:38 ┊
┊ incus-volume-1cd2c1abd5db4fadbd24d3d7ede8c2ce ┊ r730xd-2 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-29 16:18:38 ┊
┊ incus-volume-1cd2c1abd5db4fadbd24d3d7ede8c2ce ┊ r730xd-3 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-29 16:18:38 ┊
┊ incus-volume-1cd2c1abd5db4fadbd24d3d7ede8c2ce ┊ supermicro ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-11-10 16:57:32 ┊
┊ incus-volume-3eca117a33cf4a1784bcb5d7e40411a8 ┊ gigabyte ┊ DRBD,STORAGE ┊ InUse ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-10-19 15:28:51 ┊
┊ incus-volume-3eca117a33cf4a1784bcb5d7e40411a8 ┊ r620 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-11-10 16:57:32 ┊
┊ incus-volume-3eca117a33cf4a1784bcb5d7e40411a8 ┊ r730xd-1 ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3,r620) ┊ UpToDate ┊ 2025-10-19 15:28:41 ┊
┊ incus-volume-3eca117a33cf4a1784bcb5d7e40411a8 ┊ r730xd-2 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-10-19 15:28:41 ┊
┊ incus-volume-3eca117a33cf4a1784bcb5d7e40411a8 ┊ r730xd-3 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-10-19 15:28:41 ┊
┊ incus-volume-3eca117a33cf4a1784bcb5d7e40411a8 ┊ supermicro ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-11-10 16:57:41 ┊
┊ incus-volume-5bd43e17dd6b40d8a3a4bb1e5826c8d5 ┊ r620 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-11-10 17:37:47 ┊
┊ incus-volume-5bd43e17dd6b40d8a3a4bb1e5826c8d5 ┊ r730xd-1 ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3,r620) ┊ UpToDate ┊ 2025-09-28 14:01:16 ┊
┊ incus-volume-5bd43e17dd6b40d8a3a4bb1e5826c8d5 ┊ r730xd-2 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-28 14:01:15 ┊
┊ incus-volume-5bd43e17dd6b40d8a3a4bb1e5826c8d5 ┊ r730xd-3 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-28 14:01:16 ┊
┊ incus-volume-5bd43e17dd6b40d8a3a4bb1e5826c8d5 ┊ supermicro ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-10-01 20:05:00 ┊
┊ incus-volume-6ced1b2804c24652ad90f8c0cb45aec5 ┊ gigabyte ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-11-10 16:21:49 ┊
┊ incus-volume-6ced1b2804c24652ad90f8c0cb45aec5 ┊ r620 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-11-10 16:23:25 ┊
┊ incus-volume-6ced1b2804c24652ad90f8c0cb45aec5 ┊ r730xd-1 ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3,r620) ┊ UpToDate ┊ 2025-09-28 14:03:20 ┊
┊ incus-volume-6ced1b2804c24652ad90f8c0cb45aec5 ┊ r730xd-2 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-28 14:03:20 ┊
┊ incus-volume-6ced1b2804c24652ad90f8c0cb45aec5 ┊ r730xd-3 ┊ DRBD,STORAGE ┊ ┊ ┊ Unknown ┊ 2025-09-28 14:03:20 ┊
┊ incus-volume-6ced1b2804c24652ad90f8c0cb45aec5 ┊ supermicro ┊ DRBD,STORAGE ┊ Unused ┊ Connecting(r730xd-2,r730xd-3) ┊ Diskless ┊ 2025-11-10 16:12:03 ┊
r730xd-1 is my LINSTOR controller and the rest are satellites. r730xd-2 and r730xd-3 are the other storage nodes, and my Incus nodes are gigabyte, r620, and supermicro.
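I can post more of the cluster state if that would help, e.g. the output of:

linstor storage-pool list
linstor resource-group list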
And this is the output of drbdadm status:
incus-volume-08c3e1c02f764335a6a874325b549ac6 role:Secondary
disk:UpToDate open:no
gigabyte role:Primary
peer-disk:Diskless
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-0afe72c1ffbf47e882011fe28af797d9 role:Secondary
disk:UpToDate open:no
gigabyte role:Primary
peer-disk:Diskless
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-0cb1c42e43534509a8b59c96b2d44b49 role:Secondary suspended:quorum
disk:UpToDate quorum:no open:no blocked:upper
gigabyte role:Secondary
peer-disk:Diskless
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-1184bdc40c084bf7911779109aabd75a role:Secondary
disk:UpToDate open:no
gigabyte role:Primary
peer-disk:Diskless
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-13554e2ba2834d728327b411dbace40b role:Secondary suspended:quorum
disk:Outdated quorum:no open:no
gigabyte role:Secondary
peer-disk:Diskless
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-15ddb08d4e8b4efe80156ef4ce5ded95 role:Secondary
disk:UpToDate open:no
gigabyte role:Primary
peer-disk:Diskless
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-160bcab0d7e445aa89a05f267ad6eb3b role:Secondary suspended:quorum
disk:UpToDate quorum:no open:no blocked:upper
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-18eb10b18c1749079efb3e3872b13823 role:Secondary suspended:quorum
disk:UpToDate quorum:no open:no blocked:upper
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-1cd2c1abd5db4fadbd24d3d7ede8c2ce role:Secondary
disk:UpToDate open:no
gigabyte role:Primary
peer-disk:Diskless
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-26018aef7a47442c9e296d8cccfe173b role:Secondary
disk:UpToDate open:no
gigabyte connection:Connecting
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
supermicro role:Secondary
peer-disk:Diskless
incus-volume-27eb7afa83684c56ac8ae2929eb4f492 role:Secondary suspended:quorum
disk:UpToDate quorum:no open:no blocked:upper
gigabyte role:Secondary
peer-disk:Diskless
r620 connection:Connecting
r730xd-2 connection:Connecting
r730xd-3 connection:Connecting
Error report (from the controller):
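(I grabbed the full text of the report with the LINSTOR client, something along the lines of the following, if I'm remembering the commands right:)

linstor error-reports list
linstor error-reports show 69128364-00000-000110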
ERROR REPORT 69128364-00000-000110
============================================================
Application: LINBIT® LINSTOR
Module: Controller
Version: 1.32.3
Build ID: 6dac06aed233f2c89ac7cc6b1185d6dce9ec74c4
Build time: 2025-10-13T06:37:58+00:00
Error time: 2025-11-10 20:53:32
Node: r730xd-1
Thread: MainWorkerPool-7
Access context information
Identity: PUBLIC
Role: PUBLIC
Domain: PUBLIC
Peer: RestClient(192.0.0.1; 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/142.0.0.0 Safari/537.36')
============================================================
Reported error:
===============
Category: RuntimeException
Class name: DelayedApiRcException
Class canonical name: com.linbit.linstor.core.apicallhandler.response.CtrlResponseUtils.DelayedApiRcException
Generated at: Method 'lambda$mergeExtractingApiRcExceptions$6', Source file 'CtrlResponseUtils.java', Line #187
Error message: Exceptions have been converted to responses
Error context:
Registration of resource group 'micron_pool' failed due to an unhandled exception of type DelayedApiRcException. Exceptions have been converted to responses
Asynchronous stage backtrace:
(r730xd-2) No response generated by handler.
(r730xd-3) No response generated by handler.
Error has been observed at the following site(s):
*__checkpoint ⇢ Modify resource-group
Original Stack Trace:
Call backtrace:
Method Native Class:Line number
lambda$mergeExtractingApiRcExceptions$6 N com.linbit.linstor.core.apicallhandler.response.CtrlResponseUtils:187
Suppressed exception 1 of 3:
===============
Category: RuntimeException
Class name: ApiRcException
Class canonical name: com.linbit.linstor.core.apicallhandler.response.ApiRcException
Generated at: Method 'handleAnswer', Source file 'CommonMessageProcessor.java', Line #344
Error message: (r730xd-2) No response generated by handler.
Error context:
Registration of resource group 'micron_pool' failed due to an unhandled exception of type DelayedApiRcException. Exceptions have been converted to responses
Call backtrace:
Method Native Class:Line number
handleAnswer N com.linbit.linstor.proto.CommonMessageProcessor:344
handleDataMessage N com.linbit.linstor.proto.CommonMessageProcessor:297
doProcessInOrderMessage N com.linbit.linstor.proto.CommonMessageProcessor:245
lambda$doProcessMessage$4 N com.linbit.linstor.proto.CommonMessageProcessor:230
subscribe N reactor.core.publisher.FluxDefer:46
subscribe N reactor.core.publisher.Flux:8848
onNext N reactor.core.publisher.FluxFlatMap$FlatMapMain:430
drainAsync N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:453
drain N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:724
onNext N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:256
drainFused N reactor.core.publisher.SinkManyUnicast:321
drain N reactor.core.publisher.SinkManyUnicast:363
tryEmitNext N reactor.core.publisher.SinkManyUnicast:239
tryEmitNext N reactor.core.publisher.SinkManySerialized:100
processInOrder N com.linbit.linstor.netcom.TcpConnectorPeer:446
doProcessMessage N com.linbit.linstor.proto.CommonMessageProcessor:228
lambda$processMessage$2 N com.linbit.linstor.proto.CommonMessageProcessor:165
onNext N reactor.core.publisher.FluxPeek$PeekSubscriber:185
runAsync N reactor.core.publisher.FluxPublishOn$PublishOnSubscriber:446
run N reactor.core.publisher.FluxPublishOn$PublishOnSubscriber:533
call N reactor.core.scheduler.WorkerTask:84
call N reactor.core.scheduler.WorkerTask:37
run N java.util.concurrent.FutureTask:317
run N java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask:304
runWorker N java.util.concurrent.ThreadPoolExecutor:1144
run N java.util.concurrent.ThreadPoolExecutor$Worker:642
run N java.lang.Thread:1583
Suppressed exception 2 of 3:
===============
Category: RuntimeException
Class name: ApiRcException
Class canonical name: com.linbit.linstor.core.apicallhandler.response.ApiRcException
Generated at: Method 'handleAnswer', Source file 'CommonMessageProcessor.java', Line #344
Error message: (r730xd-3) No response generated by handler.
Error context:
Registration of resource group 'micron_pool' failed due to an unhandled exception of type DelayedApiRcException. Exceptions have been converted to responses
Call backtrace:
Method Native Class:Line number
handleAnswer N com.linbit.linstor.proto.CommonMessageProcessor:344
handleDataMessage N com.linbit.linstor.proto.CommonMessageProcessor:297
doProcessInOrderMessage N com.linbit.linstor.proto.CommonMessageProcessor:245
lambda$doProcessMessage$4 N com.linbit.linstor.proto.CommonMessageProcessor:230
subscribe N reactor.core.publisher.FluxDefer:46
subscribe N reactor.core.publisher.Flux:8848
onNext N reactor.core.publisher.FluxFlatMap$FlatMapMain:430
drainAsync N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:453
drain N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:724
onNext N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:256
drainFused N reactor.core.publisher.SinkManyUnicast:321
drain N reactor.core.publisher.SinkManyUnicast:363
tryEmitNext N reactor.core.publisher.SinkManyUnicast:239
tryEmitNext N reactor.core.publisher.SinkManySerialized:100
processInOrder N com.linbit.linstor.netcom.TcpConnectorPeer:446
doProcessMessage N com.linbit.linstor.proto.CommonMessageProcessor:228
lambda$processMessage$2 N com.linbit.linstor.proto.CommonMessageProcessor:165
onNext N reactor.core.publisher.FluxPeek$PeekSubscriber:185
runAsync N reactor.core.publisher.FluxPublishOn$PublishOnSubscriber:446
run N reactor.core.publisher.FluxPublishOn$PublishOnSubscriber:533
call N reactor.core.scheduler.WorkerTask:84
call N reactor.core.scheduler.WorkerTask:37
run N java.util.concurrent.FutureTask:317
run N java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask:304
runWorker N java.util.concurrent.ThreadPoolExecutor:1144
run N java.util.concurrent.ThreadPoolExecutor$Worker:642
run N java.lang.Thread:1583
Suppressed exception 3 of 3:
===============
Category: RuntimeException
Class name: OnAssemblyException
Class canonical name: reactor.core.publisher.FluxOnAssembly.OnAssemblyException
Generated at: Method 'lambda$mergeExtractingApiRcExceptions$6', Source file 'CtrlResponseUtils.java', Line #187
Error message:
Error has been observed at the following site(s):
*__checkpoint ⇢ Modify resource-group
Original Stack Trace:
Error context:
Registration of resource group 'micron_pool' failed due to an unhandled exception of type DelayedApiRcException. Exceptions have been converted to responses
Call backtrace:
Method Native Class:Line number
lambda$mergeExtractingApiRcExceptions$6 N com.linbit.linstor.core.apicallhandler.response.CtrlResponseUtils:187
subscribe N reactor.core.publisher.FluxDefer:46
subscribe N reactor.core.publisher.Flux:8848
onComplete N reactor.core.publisher.FluxConcatArray$ConcatArraySubscriber:238
onComplete N reactor.core.publisher.FluxMap$MapSubscriber:144
checkTerminated N reactor.core.publisher.FluxFlatMap$FlatMapMain:850
drainLoop N reactor.core.publisher.FluxFlatMap$FlatMapMain:612
innerComplete N reactor.core.publisher.FluxFlatMap$FlatMapMain:898
onComplete N reactor.core.publisher.FluxFlatMap$FlatMapInner:1001
onComplete N reactor.core.publisher.Operators$MultiSubscriptionSubscriber:2230
onComplete N reactor.core.publisher.FluxMap$MapSubscriber:144
onComplete N reactor.core.publisher.FluxConcatArray$ConcatArraySubscriber:209
onComplete N reactor.core.publisher.FluxPeek$PeekSubscriber:260
onComplete N reactor.core.publisher.Operators$MultiSubscriptionSubscriber:2230
onComplete N reactor.core.publisher.FluxMapFuseable$MapFuseableSubscriber:152
onComplete N reactor.core.publisher.FluxUsing$UsingSubscriber:236
onComplete N reactor.core.publisher.FluxSwitchIfEmpty$SwitchIfEmptySubscriber:85
complete N reactor.core.publisher.FluxCreate$BaseSink:465
drain N reactor.core.publisher.FluxCreate$BufferAsyncSink:871
complete N reactor.core.publisher.FluxCreate$BufferAsyncSink:819
drainLoop N reactor.core.publisher.FluxCreate$SerializedFluxSink:249
drain N reactor.core.publisher.FluxCreate$SerializedFluxSink:215
complete N reactor.core.publisher.FluxCreate$SerializedFluxSink:206
apiCallComplete N com.linbit.linstor.netcom.TcpConnectorPeer:540
handleComplete N com.linbit.linstor.proto.CommonMessageProcessor:370
handleDataMessage N com.linbit.linstor.proto.CommonMessageProcessor:300
doProcessInOrderMessage N com.linbit.linstor.proto.CommonMessageProcessor:245
lambda$doProcessMessage$4 N com.linbit.linstor.proto.CommonMessageProcessor:230
subscribe N reactor.core.publisher.FluxDefer:46
subscribe N reactor.core.publisher.Flux:8848
onNext N reactor.core.publisher.FluxFlatMap$FlatMapMain:430
drainAsync N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:453
drain N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:724
onNext N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:256
drainFused N reactor.core.publisher.SinkManyUnicast:321
drain N reactor.core.publisher.SinkManyUnicast:363
tryEmitNext N reactor.core.publisher.SinkManyUnicast:239
tryEmitNext N reactor.core.publisher.SinkManySerialized:100
processInOrder N com.linbit.linstor.netcom.TcpConnectorPeer:446
doProcessMessage N com.linbit.linstor.proto.CommonMessageProcessor:228
lambda$processMessage$2 N com.linbit.linstor.proto.CommonMessageProcessor:165
onNext N reactor.core.publisher.FluxPeek$PeekSubscriber:185
runAsync N reactor.core.publisher.FluxPublishOn$PublishOnSubscriber:446
run N reactor.core.publisher.FluxPublishOn$PublishOnSubscriber:533
call N reactor.core.scheduler.WorkerTask:84
call N reactor.core.scheduler.WorkerTask:37
run N java.util.concurrent.FutureTask:317
run N java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask:304
runWorker N java.util.concurrent.ThreadPoolExecutor:1144
run N java.util.concurrent.ThreadPoolExecutor$Worker:642
run N java.lang.Thread:1583
END OF ERROR REPORT.