After an abrupt server shutdown, DRBD shows no resources and LINSTOR sees no physical storage. Is there a way to recover from this state?
It’s tough to say without any additional information. Anything in the logs?
If LINSTOR doesn’t see the physical storage, my first guess would be that there is something wrong with the physical storage itself.
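To gather that information, something along these lines should work (just a sketch; node and storage pool names will differ on your cluster):
# from the LINSTOR controller (or the client pod)
linstor node list
linstor storage-pool list
linstor physical-storage list
linstor error-reports list
# on the affected satellite node itself
sudo vgs
sudo lvs -a
sudo drbdadm status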
Thanks for your reply, Devin.
Just for context, I’ve seen this happen on two separate occasions: once on a system without a UPS during a power outage, and a second time on a system that was shut down ungracefully from a lights-out manager.
Before the abrupt shutdown, lsblk showed output similar to this:
├─ubuntu--vg-drbdlv01_tmeta 253:8 0 76M 0 lvm
│ └─ubuntu--vg-drbdlv01-tpool 253:10 0 300G 0 lvm
│ ├─ubuntu--vg-drbdlv01 253:11 0 300G 1 lvm
│ ├─ubuntu--vg-pvc--c2f11b76--c176--4ec9--b0b7--acd78073067d_00000
│ │ 253:12 0 8G 0 lvm
│ │ └─drbd1000 147:1000 0 8G 0 disk /var/snap/microk8s/common/var/lib/kubelet/pods/762387fe-fd0d-4c43-9246-0b6b300b6f5d/volumes/kubernetes.io~csi/pvc-c2f11b76-c176-4ec9-b0b7-acd78073
│ ├─ubuntu--vg-pvc--8069500a--7c8c--45aa--8d4e--3ab04e9a94a2_00000
│ │ 253:13 0 8G 0 lvm
│ │ └─drbd1001 147:1001 0 8G 0 disk
│ ├─ubuntu--vg-pvc--d7b2a928--c3a0--446c--a048--adabf1cecdbb_00000
│ │ 253:14 0 2G 0 lvm
│ │ └─drbd1006 147:1006 0 2G 0 disk /var/snap/microk8s/common/var/lib/kubelet/pods/3e662704-4dbb-4fca-a508-c4d7e1c428d5/volumes/kubernetes.io~csi/pvc-d7b2a928-c3a0-446c-a048-adabf1ce
│ └─ubuntu--vg-pvc--b6d0b34b--9887--4792--a1c1--b9dc0169af49_00000
│ 253:15 0 20G 0 lvm
│ └─drbd1003 147:1003 0 20G 0 disk /var/snap/microk8s/common/var/lib/kubelet/pods/09997923-e64f-4e57-811d-d0f19b3f1ab2/volumes/kubernetes.io~csi/pvc-b6d0b34b-9887-4792-a1c1-b9dc0169
└─ubuntu--vg-drbdlv01_tdata 253:9 0 300G 0 lvm
└─ubuntu--vg-drbdlv01-tpool 253:10 0 300G 0 lvm
├─ubuntu--vg-drbdlv01 253:11 0 300G 1 lvm
├─ubuntu--vg-pvc--c2f11b76--c176--4ec9--b0b7--acd78073067d_00000
│ 253:12 0 8G 0 lvm
│ └─drbd1000 147:1000 0 8G 0 disk /var/snap/microk8s/common/var/lib/kubelet/pods/762387fe-fd0d-4c43-9246-0b6b300b6f5d/volumes/kubernetes.io~csi/pvc-c2f11b76-c176-4ec9-b0b7-acd78073
├─ubuntu--vg-pvc--8069500a--7c8c--45aa--8d4e--3ab04e9a94a2_00000
│ 253:13 0 8G 0 lvm
│ └─drbd1001 147:1001 0 8G 0 disk
├─ubuntu--vg-pvc--d7b2a928--c3a0--446c--a048--adabf1cecdbb_00000
│ 253:14 0 2G 0 lvm
│ └─drbd1006 147:1006 0 2G 0 disk /var/snap/microk8s/common/var/lib/kubelet/pods/3e662704-4dbb-4fca-a508-c4d7e1c428d5/volumes/kubernetes.io~csi/pvc-d7b2a928-c3a0-446c-a048-adabf1ce
└─ubuntu--vg-pvc--b6d0b34b--9887--4792--a1c1--b9dc0169af49_00000
253:15 0 20G 0 lvm
└─drbd1003 147:1003 0 20G 0 disk /var/snap/microk8s/common/var/lib/kubelet/pods/09997923-e64f-4e57-811d-d0f19b3f1ab2/volumes/kubernetes.io~csi/pvc-b6d0b34b-9887-4792-a1c1-b9dc0169
After the system was powered back up, lsblk no longer shows the drbdXXXX devices:
│ ├─ubuntu--vg-ubuntu--lv 253:0 0 300G 0 lvm /
│ ├─ubuntu--vg-drbdlv01_tmeta 253:1 0 344M 0 lvm
│ │ └─ubuntu--vg-drbdlv01-tpool 253:3 0 1.3T 0 lvm
│ │ ├─ubuntu--vg-drbdlv01 253:4 0 1.3T 1 lvm
│ │ ├─ubuntu--vg-pvc--48402671--3490--4204--a9db--b1ce1e9e28f3_00000 253:5 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--4eb97c36--0db2--4ec9--8cbb--c859d6ca61c1_00000 253:6 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--1e8ea97e--7bcc--4ce9--ad20--a9bb1b73ac33_00000 253:7 0 2G 0 lvm
│ │ ├─ubuntu--vg-pvc--6b64c702--2499--4dd1--9be0--76de0dc34f40_00000 253:8 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--561b15b2--cf1d--443e--8ad5--82ac39bda936_00000 253:9 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--da327e42--1c1f--49f0--b362--07688905577a_00000 253:10 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--ee9675b0--36bd--47f1--a265--d4ba7fff1343_00000 253:11 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--38d2ec2e--40c0--4397--99ce--5a99e3b01c59_00000 253:12 0 500.1G 0 lvm
│ │ ├─ubuntu--vg-pvc--25c2c3c6--c449--47d6--8005--e8a4a52d80b5_00000 253:13 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--a1e4b63b--9a2a--472c--811c--3eb4e2426d9d_00000 253:14 0 500.1G 0 lvm
│ │ ├─ubuntu--vg-pvc--f481ad78--0202--4349--8861--0fdd466db032_00000 253:15 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--d57e29df--3902--4e68--a519--43eafab2737e_00000 253:16 0 500.1G 0 lvm
│ │ ├─ubuntu--vg-pvc--27659e8e--e3eb--4230--83ef--43c0eeb63cc9_00000 253:17 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--df6ae000--5cdf--47ff--b951--9aa606068f1a_00000 253:18 0 100G 0 lvm
│ │ ├─ubuntu--vg-pvc--ac0213df--0c64--4546--ac56--725bfef7533b_00000 253:19 0 2G 0 lvm
│ │ ├─ubuntu--vg-pvc--52490850--defe--4019--b594--5759c9d053aa_00000 253:20 0 2G 0 lvm
│ │ ├─ubuntu--vg-pvc--acd26b2a--ab3d--457f--9ef3--c0f23e86fccc_00000 253:21 0 2G 0 lvm
│ │ ├─ubuntu--vg-pvc--f1f727d4--514f--4529--b39a--762bafa8c5e1_00000 253:22 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--52ca1165--750c--405e--a354--6c3c1ae57cbf_00000 253:23 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--c339697b--6069--4acc--bcd1--91647d1eb29b_00000 253:24 0 8G 0 lvm
│ │ ├─ubuntu--vg-pvc--f8afc5f7--2a83--4ddd--bda6--8158c3f57674_00000 253:25 0 8G 0 lvm
│ │ └─ubuntu--vg-pvc--0502be2c--6377--47dd--824b--6898ecfc79b2_00000 253:26 0 8G 0 lvm
│ └─ubuntu--vg-drbdlv01_tdata 253:2 0 1.3T 0 lvm
│ └─ubuntu--vg-drbdlv01-tpool 253:3 0 1.3T 0 lvm
│ ├─ubuntu--vg-drbdlv01 253:4 0 1.3T 1 lvm
│ ├─ubuntu--vg-pvc--48402671--3490--4204--a9db--b1ce1e9e28f3_00000 253:5 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--4eb97c36--0db2--4ec9--8cbb--c859d6ca61c1_00000 253:6 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--1e8ea97e--7bcc--4ce9--ad20--a9bb1b73ac33_00000 253:7 0 2G 0 lvm
│ ├─ubuntu--vg-pvc--6b64c702--2499--4dd1--9be0--76de0dc34f40_00000 253:8 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--561b15b2--cf1d--443e--8ad5--82ac39bda936_00000 253:9 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--da327e42--1c1f--49f0--b362--07688905577a_00000 253:10 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--ee9675b0--36bd--47f1--a265--d4ba7fff1343_00000 253:11 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--38d2ec2e--40c0--4397--99ce--5a99e3b01c59_00000 253:12 0 500.1G 0 lvm
│ ├─ubuntu--vg-pvc--25c2c3c6--c449--47d6--8005--e8a4a52d80b5_00000 253:13 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--a1e4b63b--9a2a--472c--811c--3eb4e2426d9d_00000 253:14 0 500.1G 0 lvm
│ ├─ubuntu--vg-pvc--f481ad78--0202--4349--8861--0fdd466db032_00000 253:15 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--d57e29df--3902--4e68--a519--43eafab2737e_00000 253:16 0 500.1G 0 lvm
│ ├─ubuntu--vg-pvc--27659e8e--e3eb--4230--83ef--43c0eeb63cc9_00000 253:17 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--df6ae000--5cdf--47ff--b951--9aa606068f1a_00000 253:18 0 100G 0 lvm
│ ├─ubuntu--vg-pvc--ac0213df--0c64--4546--ac56--725bfef7533b_00000 253:19 0 2G 0 lvm
│ ├─ubuntu--vg-pvc--52490850--defe--4019--b594--5759c9d053aa_00000 253:20 0 2G 0 lvm
│ ├─ubuntu--vg-pvc--acd26b2a--ab3d--457f--9ef3--c0f23e86fccc_00000 253:21 0 2G 0 lvm
│ ├─ubuntu--vg-pvc--f1f727d4--514f--4529--b39a--762bafa8c5e1_00000 253:22 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--52ca1165--750c--405e--a354--6c3c1ae57cbf_00000 253:23 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--c339697b--6069--4acc--bcd1--91647d1eb29b_00000 253:24 0 8G 0 lvm
│ ├─ubuntu--vg-pvc--f8afc5f7--2a83--4ddd--bda6--8158c3f57674_00000 253:25 0 8G 0 lvm
│ └─ubuntu--vg-pvc--0502be2c--6377--47dd--824b--6898ecfc79b2_00000 253:26 0 8G 0 lvm
└─sda4 8:4 0 138.4G 0 part
└─ubuntu--vg-drbdlv01_tdata 253:2 0 1.3T 0 lvm
└─ubuntu--vg-drbdlv01-tpool 253:3 0 1.3T 0 lvm
├─ubuntu--vg-drbdlv01 253:4 0 1.3T 1 lvm
├─ubuntu--vg-pvc--48402671--3490--4204--a9db--b1ce1e9e28f3_00000 253:5 0 8G 0 lvm
├─ubuntu--vg-pvc--4eb97c36--0db2--4ec9--8cbb--c859d6ca61c1_00000 253:6 0 8G 0 lvm
├─ubuntu--vg-pvc--1e8ea97e--7bcc--4ce9--ad20--a9bb1b73ac33_00000 253:7 0 2G 0 lvm
├─ubuntu--vg-pvc--6b64c702--2499--4dd1--9be0--76de0dc34f40_00000 253:8 0 8G 0 lvm
├─ubuntu--vg-pvc--561b15b2--cf1d--443e--8ad5--82ac39bda936_00000 253:9 0 8G 0 lvm
├─ubuntu--vg-pvc--da327e42--1c1f--49f0--b362--07688905577a_00000 253:10 0 8G 0 lvm
├─ubuntu--vg-pvc--ee9675b0--36bd--47f1--a265--d4ba7fff1343_00000 253:11 0 8G 0 lvm
├─ubuntu--vg-pvc--38d2ec2e--40c0--4397--99ce--5a99e3b01c59_00000 253:12 0 500.1G 0 lvm
├─ubuntu--vg-pvc--25c2c3c6--c449--47d6--8005--e8a4a52d80b5_00000 253:13 0 8G 0 lvm
├─ubuntu--vg-pvc--a1e4b63b--9a2a--472c--811c--3eb4e2426d9d_00000 253:14 0 500.1G 0 lvm
├─ubuntu--vg-pvc--f481ad78--0202--4349--8861--0fdd466db032_00000 253:15 0 8G 0 lvm
├─ubuntu--vg-pvc--d57e29df--3902--4e68--a519--43eafab2737e_00000 253:16 0 500.1G 0 lvm
├─ubuntu--vg-pvc--27659e8e--e3eb--4230--83ef--43c0eeb63cc9_00000 253:17 0 8G 0 lvm
├─ubuntu--vg-pvc--df6ae000--5cdf--47ff--b951--9aa606068f1a_00000 253:18 0 100G 0 lvm
├─ubuntu--vg-pvc--ac0213df--0c64--4546--ac56--725bfef7533b_00000 253:19 0 2G 0 lvm
├─ubuntu--vg-pvc--52490850--defe--4019--b594--5759c9d053aa_00000 253:20 0 2G 0 lvm
├─ubuntu--vg-pvc--acd26b2a--ab3d--457f--9ef3--c0f23e86fccc_00000 253:21 0 2G 0 lvm
├─ubuntu--vg-pvc--f1f727d4--514f--4529--b39a--762bafa8c5e1_00000 253:22 0 8G 0 lvm
├─ubuntu--vg-pvc--52ca1165--750c--405e--a354--6c3c1ae57cbf_00000 253:23 0 8G 0 lvm
├─ubuntu--vg-pvc--c339697b--6069--4acc--bcd1--91647d1eb29b_00000 253:24 0 8G 0 lvm
├─ubuntu--vg-pvc--f8afc5f7--2a83--4ddd--bda6--8158c3f57674_00000 253:25 0 8G 0 lvm
└─ubuntu--vg-pvc--0502be2c--6377--47dd--824b--6898ecfc79b2_00000 253:26 0 8G 0 lvm
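For reference, a few quick cross-checks at this point (a sketch, assuming the same ubuntu-vg volume group shown above):
sudo lvs -a ubuntu-vg        # are the pvc-* thin LVs still present and active?
ls /dev/drbd* 2>/dev/null    # do any DRBD block devices exist at all?
cat /proc/drbd               # is a DRBD kernel module loaded, and which version?
sudo drbdadm status          # DRBD's own view (needs DRBD 9 and matching drbd-utils)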
Using the linstor command in the piraeus-cs-controller pod, I can see the following:
linstor resource-definition list
╭───────────────────────────────────────────────────────────────────────────────────────────────────╮
┊ ResourceName ┊ Port ┊ ResourceGroup ┊ State ┊
╞═══════════════════════════════════════════════════════════════════════════════════════════════════╡
┊ pvc-1e8ea97e-7bcc-4ce9-ad20-a9bb1b73ac33 ┊ 7004 ┊ sc-b676293a-803a-5779-b7ae-ee4d5cbccbf9 ┊ ok ┊
┊ pvc-3c824d13-7cef-41aa-843f-179c98da108b ┊ 7027 ┊ sc-b676293a-803a-5779-b7ae-ee4d5cbccbf9 ┊ ok ┊
┊ pvc-4eb97c36-0db2-4ec9-8cbb-c859d6ca61c1 ┊ 7002 ┊ sc-b676293a-803a-5779-b7ae-ee4d5cbccbf9 ┊ ok ┊
┊ pvc-6b07fbdf-0486-4698-b0e0-1a0d148751b0 ┊ 7022 ┊ sc-b676293a-803a-5779-b7ae-ee4d5cbccbf9 ┊ ok ┊
┊ pvc-6b64c702-2499-4dd1-9be0-76de0dc34f40 ┊ 7007 ┊ sc-b676293a-803a-5779-b7ae-ee4d5cbccbf9 ┊ ok ┊
┊ pvc-8ffb5ad7-c945-4fb3-8e68-e7c6da77801d ┊ 7003 ┊ sc-b676293a-803a-5779-b7ae-ee4d5cbccbf9 ┊ ok ┊
┊ pvc-25c2c3c6-c449-47d6-8005-e8a4a52d80b5 ┊ 7010 ┊ sc-b676293a-803a-5779-b7ae-ee4d5cbccbf9 ┊ ok ┊
┊ pvc-83908dac-013d-4b5c-867b-d7b9ac7e0df5 ┊ 7024 ┊ sc-b676293a-803a-5779-b7ae-ee4d5cbccbf9 ┊ ok ┊
...
linstor resource list
╭───────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
┊ ResourceName ┊ Node ┊ Port ┊ Usage ┊ Conns ┊ State ┊ CreatedOn ┊
╞═══════════════════════════════════════════════════════════════════════════════════════════════════════════════╡
┊ pvc-1e8ea97e-7bcc-4ce9-ad20-a9bb1b73ac33 ┊ qqd-srv1 ┊ 7004 ┊ ┊ ┊ Unknown ┊ 2023-04-04 16:38:26 ┊
┊ pvc-1e8ea97e-7bcc-4ce9-ad20-a9bb1b73ac33 ┊ qqd-srv2 ┊ 7004 ┊ ┊ ┊ Unknown ┊ 2023-04-04 16:38:23 ┊
┊ pvc-1e8ea97e-7bcc-4ce9-ad20-a9bb1b73ac33 ┊ qqd-srv3 ┊ 7004 ┊ ┊ ┊ Unknown ┊ 2023-04-26 15:35:03 ┊
┊ pvc-3c824d13-7cef-41aa-843f-179c98da108b ┊ qqd-srv4 ┊ 7027 ┊ ┊ ┊ Unknown ┊ 2024-07-11 19:45:48 ┊
┊ pvc-3c824d13-7cef-41aa-843f-179c98da108b ┊ qqd-srv1 ┊ 7027 ┊ ┊ ┊ Unknown ┊ 2024-07-11 19:45:49 ┊
┊ pvc-3c824d13-7cef-41aa-843f-179c98da108b ┊ qqd-srv3 ┊ 7027 ┊ ┊ ┊ Unknown ┊ 2024-07-11 19:45:43 ┊
┊ pvc-4eb97c36-0db2-4ec9-8cbb-c859d6ca61c1 ┊ qqd-srv1 ┊ 7002 ┊ ┊ ┊ Unknown ┊ 2023-03-17 17:17:19 ┊
┊ pvc-4eb97c36-0db2-4ec9-8cbb-c859d6ca61c1 ┊ qqd-srv2 ┊ 7002 ┊ ┊ ┊ Unknown ┊ 2023-02-09 21:05:04 ┊
┊ pvc-83908dac-013d-4b5c-867b-d7b9ac7e0df5 ┊ qqd-srv4 ┊ 7024 ┊ ┊ ┊ Unknown ┊ 2024-07-11 19:45:45 ┊
┊ pvc-83908dac-013d-4b5c-867b-d7b9ac7e0df5 ┊ qqd-srv1 ┊ 7024 ┊ ┊ ┊ Unknown ┊ 2024-07-11 19:45:47 ┊
┊ pvc-83908dac-013d-4b5c-867b-d7b9ac7e0df5 ┊ qqd-srv3 ┊ 7024 ┊ ┊ ┊ Unknown ┊ 2024-07-11 19:45:42 ┊
...
There are two types of errors logged, StorageException and NullPointerException (both shown below):
...
┊ 669049AF-F218A-046607 ┊ 2024-07-22 14:35:11 ┊ S|qqd-srv1 ┊ NullPointerException ┊
┊ 669049ED-D9B03-020656 ┊ 2024-07-22 14:35:12 ┊ S|qqd-srv4 ┊ NullPointerException ┊
┊ 669049AF-F218A-046608 ┊ 2024-07-22 14:35:12 ┊ S|qqd-srv1 ┊ NullPointerException ┊
┊ 66914BED-9F848-034603 ┊ 2024-07-22 14:35:12 ┊ S|qqd-srv3 ┊ NullPointerException ┊
┊ 66914BED-9F848-034604 ┊ 2024-07-22 14:35:42 ┊ S|qqd-srv3 ┊ StorageException: Device does not exist. VlmData: Node: 'qqd-srv3', Rsc: 'p... ┊
┊ 669049AF-F218A-046609 ┊ 2024-07-22 14:35:42 ┊ S|qqd-srv1 ┊ StorageException: Device does not exist. VlmData: Node: 'qqd-srv1', Rsc: 'p... ┊
┊ 66914BED-9F848-034605 ┊ 2024-07-22 14:35:42 ┊ S|qqd-srv3 ┊ StorageException: Device does not exist. VlmData: Node: 'qqd-srv3', Rsc: 'p... ┊
┊ 669049AB-203B0-009609 ┊ 2024-07-22 14:35:42 ┊ S|qqd-srv2 ┊ StorageException: Device does not exist. VlmData: Node: 'qqd-srv2', Rsc: 'p... ┊
...
This is the StorageException:
root@piraeus-cs-controller:/# l err s 66914BED-9F848-025559
ERROR REPORT 66914BED-9F848-025559
============================================================
Application: LINBIT® LINSTOR
Module: Satellite
Version: 1.19.1
Build ID: a758bf07796c374fd2004465b0d8690209b74356
Build time: 2022-07-27T06:36:54+00:00
Error time: 2024-07-19 23:44:42
Node: qqd-srv3
============================================================
Reported error:
===============
Description:
Device does not exist.
Cause:
The volume could not be found on the system.
Category: LinStorException
Class name: StorageException
Class canonical name: com.linbit.linstor.storage.StorageException
Generated at: Method 'getAllocatedSize', Source file 'StltProviderUtils.java', Line #21
Error message: Device does not exist. VlmData: Node: 'qqd-srv3', Rsc: 'pvc-83908dac-013d-4b5c-867b-d7b9ac7e0df5', VlmNr: '0', suffix:
Call backtrace:
Method Native Class:Line number
getAllocatedSize N com.linbit.linstor.layer.storage.utils.StltProviderUtils:21
getAllocatedSize N com.linbit.linstor.layer.storage.AbsStorageProvider:1241
updateAllocatedSize N com.linbit.linstor.layer.storage.AbsStorageProvider:1513
getVlmAllocatedCapacities N com.linbit.linstor.core.apicallhandler.StltApiCallHandlerUtils:171
executeInScope N com.linbit.linstor.api.protobuf.ReqVlmAllocated:84
lambda$executeReactive$0 N com.linbit.linstor.api.protobuf.ReqVlmAllocated:69
doInScope N com.linbit.linstor.core.apicallhandler.ScopeRunner:150
lambda$fluxInScope$0 N com.linbit.linstor.core.apicallhandler.ScopeRunner:76
call N reactor.core.publisher.MonoCallable:91
trySubscribeScalarMap N reactor.core.publisher.FluxFlatMap:126
subscribeOrReturn N reactor.core.publisher.MonoFlatMapMany:49
subscribe N reactor.core.publisher.Flux:8343
onNext N reactor.core.publisher.MonoFlatMapMany$FlatMapManyMain:188
request N reactor.core.publisher.Operators$ScalarSubscription:2344
onSubscribe N reactor.core.publisher.MonoFlatMapMany$FlatMapManyMain:134
subscribe N reactor.core.publisher.MonoCurrentContext:35
subscribe N reactor.core.publisher.Flux:8357
trySubscribeScalarMap N reactor.core.publisher.FluxFlatMap:199
subscribeOrReturn N reactor.core.publisher.MonoFlatMapMany:49
subscribe N reactor.core.publisher.Flux:8343
onNext N reactor.core.publisher.FluxFlatMap$FlatMapMain:418
slowPath N reactor.core.publisher.FluxArray$ArraySubscription:126
request N reactor.core.publisher.FluxArray$ArraySubscription:99
onSubscribe N reactor.core.publisher.FluxFlatMap$FlatMapMain:363
subscribe N reactor.core.publisher.FluxMerge:69
subscribe N reactor.core.publisher.Flux:8357
onComplete N reactor.core.publisher.FluxConcatArray$ConcatArraySubscriber:207
subscribe N reactor.core.publisher.FluxConcatArray:80
subscribe N reactor.core.publisher.InternalFluxOperator:62
subscribe N reactor.core.publisher.FluxDefer:54
subscribe N reactor.core.publisher.Flux:8357
onNext N reactor.core.publisher.FluxFlatMap$FlatMapMain:418
drainAsync N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:414
drain N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:679
onNext N reactor.core.publisher.FluxFlattenIterable$FlattenIterableSubscriber:243
drainFused N reactor.core.publisher.UnicastProcessor:286
drain N reactor.core.publisher.UnicastProcessor:329
onNext N reactor.core.publisher.UnicastProcessor:408
next N reactor.core.publisher.FluxCreate$IgnoreSink:618
next N reactor.core.publisher.FluxCreate$SerializedSink:153
processInOrder N com.linbit.linstor.netcom.TcpConnectorPeer:383
doProcessMessage N com.linbit.linstor.proto.CommonMessageProcessor:218
lambda$processMessage$2 N com.linbit.linstor.proto.CommonMessageProcessor:164
onNext N reactor.core.publisher.FluxPeek$PeekSubscriber:177
runAsync N reactor.core.publisher.FluxPublishOn$PublishOnSubscriber:439
run N reactor.core.publisher.FluxPublishOn$PublishOnSubscriber:526
call N reactor.core.scheduler.WorkerTask:84
call N reactor.core.scheduler.WorkerTask:37
run N java.util.concurrent.FutureTask:264
run N java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask:304
runWorker N java.util.concurrent.ThreadPoolExecutor:1128
run N java.util.concurrent.ThreadPoolExecutor$Worker:628
run N java.lang.Thread:829
END OF ERROR REPORT.
This is the NullPointerException type of error. The PVC in this report is different; it’s hard to find the one that corresponds exactly to the error above, but this report is similar to it.
root@piraeus-cs-controller:/# l err s 669049AF-F218A-035304
ERROR REPORT 669049AF-F218A-035304
============================================================
Application: LINBIT® LINSTOR
Module: Satellite
Version: 1.19.1
Build ID: a758bf07796c374fd2004465b0d8690209b74356
Build time: 2022-07-27T06:36:54+00:00
Error time: 2024-07-19 23:45:50
Node: qqd-srv1
============================================================
Reported error:
===============
Category: RuntimeException
Class name: NullPointerException
Class canonical name: java.lang.NullPointerException
Generated at: <UNKNOWN>
Error context:
An error occurred while processing resource 'Node: 'qqd-srv1', Rsc: 'pvc-82e74425-2932-40c1-8100-6328b8e8e8c9''
Call backtrace:
Method Native Class:Line number
END OF ERROR REPORT.
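As a side note, the l err s <id> invocations above are shorthand; assuming l is an alias for the linstor client inside that container, the unabbreviated equivalents would be:
linstor error-reports list
linstor error-reports show 66914BED-9F848-025559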
Thanks again!
If only DRBD is not coming back after a reboot, my first guess would be that the systems rebooted into a new kernel that is incompatible with the installed DRBD kernel module.
Check whether a DRBD module is already loaded and whether it is a 9.x version. You can check this via cat /proc/drbd.
If nothing is loaded, try a modprobe drbd and see whether that returns any useful error.
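A minimal sequence for those checks might look like this (the dmesg and modinfo steps are extra suggestions beyond the two commands above):
cat /proc/drbd               # version of the currently loaded module, if any
sudo modprobe drbd           # try loading it; note any error printed
sudo dmesg | tail -n 20      # kernel messages from the load attempt
modinfo -F version drbd      # version of the module file modprobe would pick for this kernel
uname -r                     # the kernel currently running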
This particular system was built quite a while ago and its DRBD version is 8.4.11.
$ cat /proc/drbd
version: 8.4.11 (api:1/proto:86-101)
srcversion: C7B8F7076B8D6DB066D84D9
Is this a known issue with versions prior to 9.x?
LINSTOR requires DRBD v9 or later; it simply will not work with DRBD v8, so you must have been running a newer version of DRBD before the reboot.
As I mentioned, perhaps the reboot brought up a different kernel? I would investigate whether there are any drbd.ko files present for any of the older installed kernels. To fix this, you will need to boot the previous kernel or install a new DRBD kernel module.
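For example, something like this would show which DRBD modules exist for which kernels, and then one way to get a DRBD 9 module built for the running kernel. This is only a sketch, and it assumes an Ubuntu host where the LINBIT PPA (ppa:linbit/linbit-drbd9-stack) is an acceptable package source:
# which drbd.ko files exist, and which DRBD version is each one?
for m in $(find /usr/lib/modules -name 'drbd.ko*' 2>/dev/null); do
    echo "$m -> $(modinfo -F version "$m")"
done
# one possible fix on Ubuntu: DKMS rebuilds DRBD 9 for every installed kernel
sudo add-apt-repository ppa:linbit/linbit-drbd9-stack
sudo apt update
sudo apt install drbd-dkms drbd-utils
sudo modprobe -r drbd && sudo modprobe drbd
cat /proc/drbd               # should now report version 9.x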
Thank you, Devin!
I found a mixture of 8.x and 9.x modules in /usr/lib/modules/ under different kernel versions.
Appreciate your help!