[DRBD-user] Proxmox & DRBD9 GUI "drbdstorage size"

Tsirkas Georgios geortsir3 at gmail.com
Thu May 25 13:56:12 CEST 2017

Hello,

I have created a DRBD9 and Proxmox cluster. I have 2 nodes; on each of them I have
made a 50GB volume group "drbdpool" and a 13GB thin pool "drbdthinpool".
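
For context, the per-node LVM layout is roughly what the following commands would
create (just a sketch of the equivalent plain-LVM steps; in my setup drbdmanage
actually created the thin pool and the .drbdctrl volumes, so the exact options differed):

  # prepare the 50G partition (sdb1 in the lsblk output below) as the DRBD pool
  pvcreate /dev/sdb1
  vgcreate drbdpool /dev/sdb1
  # 13G thin pool that backs the DRBD-managed volumes
  lvcreate -L 13G -T drbdpool/drbdthinpool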

# lsblk
NAME                                 MAJ:MIN RM  SIZE RO TYPE MOUNTPOINT
sda                                    8:0    0   20G  0 disk
├─sda1                                 8:1    0    1M  0 part
├─sda2                                 8:2    0  256M  0 part
└─sda3                                 8:3    0 19.8G  0 part
  ├─pve-root                         251:0    0  4.8G  0 lvm  /
  ├─pve-swap                         251:1    0  2.4G  0 lvm  [SWAP]
  ├─pve-data_tmeta                   251:2    0   12M  0 lvm
  │ └─pve-data                       251:4    0 10.3G  0 lvm
  └─pve-data_tdata                   251:3    0 10.3G  0 lvm
    └─pve-data                       251:4    0 10.3G  0 lvm
sdb                                    8:16   0   50G  0 disk
└─sdb1                                 8:17   0   50G  0 part
  ├─drbdpool-.drbdctrl_0             251:5    0    4M  0 lvm
  │ └─drbd0                          147:0    0    4M  0 disk
  ├─drbdpool-.drbdctrl_1             251:6    0    4M  0 lvm
  │ └─drbd1                          147:1    0    4M  0 disk
  ├─drbdpool-drbdthinpool_tmeta      251:7    0   12M  0 lvm
  │ └─drbdpool-drbdthinpool-tpool    251:9    0   13G  0 lvm
  │   ├─drbdpool-drbdthinpool        251:10   0   10G  0 lvm
  │   └─drbdpool-vm--100--disk--1_00 251:11   0    9G  0 lvm
  │     └─drbd100                    147:100  0    9G  1 disk
  └─drbdpool-drbdthinpool_tdata      251:8    0   13G  0 lvm
    └─drbdpool-drbdthinpool-tpool    251:9    0   13G  0 lvm
      ├─drbdpool-drbdthinpool        251:10   0   10G  0 lvm
      └─drbdpool-vm--100--disk--1_00 251:11   0    9G  0 lvm
        └─drbd100                    147:100  0    9G  1 disk
sr0                                   11:0    1 1024M  0 rom

# vgs
  VG       #PV #LV #SN Attr   VSize  VFree
  drbdpool   1   4   0 wz--n- 50.00g 36.96g
  pve        1   3   0 wz--n- 19.75g  2.35g

# lvs
  LV               VG       Attr       LSize  Pool         Origin Data%  Meta%  Move Log Cpy%Sync Convert
  .drbdctrl_0      drbdpool -wi-ao----  4.00m
  .drbdctrl_1      drbdpool -wi-ao----  4.00m
  drbdthinpool     drbdpool twi-aotz-- 13.00g                     17.21  9.67
  vm-100-disk-1_00 drbdpool Vwi-aotz--  9.01g drbdthinpool        24.83
  data             pve      twi-a-tz-- 10.25g                      0.00  0.62
  root             pve      -wi-ao----  4.75g
  swap             pve      -wi-ao----  2.38g

# drbd-overview
  0:.drbdctrl/0      Connected(2*) Primar/Second UpToDate(2*)
  1:.drbdctrl/1      Connected(2*) Primar/Second UpToDate(2*)
100:vm-100-disk-1/0  Connected(2*) Second/Primar UpToDate(2*)

In this screenshot, Proxmox seems to display the following data:
http://i.imgur.com/Zf93WUX.png
Is that correct, or is something being calculated wrong?
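
For reference, my own rough check of the lvs figures above (assuming Data% is the
percentage of LSize that is actually allocated):

  echo "13.00 * 17.21 / 100" | bc -l   # thin pool data in use   -> ~2.24 GiB
  echo "9.01 * 24.83 / 100"  | bc -l   # vm-100-disk-1_00 in use -> ~2.24 GiB

So only about 2.24 GiB of the 13 GiB thin pool is actually in use by the single
9 GiB VM disk.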


Thank you!
