Note: "permalinks" may not be as permanent as we would like,
direct links of old sources may well be a few messages off.
On 02/14/2011 12:59 PM, Dennis Jacobfeuerborn wrote:
> I'm trying to wrap my head around storage setups that might work for
> virtualization, and I wonder if people here have experience with
> creating a DRBD setup for this purpose.
>
> What I am currently planning to implement is this:
> 2 storage nodes, each with 8 bays, 8 GB RAM, and a dual-core Xeon
> processor. Each system gets equipped with 8 1 TB SATA drives in a
> RAID-5 configuration.
> Networking will be either two dual-port cards or two quad-port cards,
> which I plan to set up as bonded interfaces (balance-xor).
>
> I'm particularly worried about the networking side being a bottleneck
> for the setup. I was looking into 10 GbE and InfiniBand equipment, but
> they drive the cost up quite a bit, and I'm not sure they are
> necessary if I can bond several 1 Gbit interfaces.
>
> Any thoughts?

Performance over bonded gig-e links has been discussed here a few
times in the past; the topic comes up regularly. Here are links to the
beginnings of two threads:

http://lists.linbit.com/pipermail/drbd-user/2010-May/014113.html
http://lists.linbit.com/pipermail/drbd-user/2010-September/014848.html

There are several more to read through. The gist I took from skimming
threads like these is that bonding does okay with 2 interfaces, but
the gains are not huge when going to 4. Also, considering that a
4-port Ethernet card costs about $200 on the cheap end and can go much
higher, using an older InfiniBand card or a 10 GbE card can make
sense. There has been talk on the list of InfiniBand cards that do
10 Gbps for under $200, but I haven't tried that myself; maybe someone
else can chime in. I've also seen 10 GbE gear for under $500.

Hope that provides some help.

mike
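P.S. One detail that may matter for the balance-xor plan: the bonding
driver picks the outgoing slave per flow, and its default layer2 hash
is, roughly, (last byte of source MAC) XOR (last byte of destination
MAC), modulo the number of slaves. Between exactly two hosts that hash
is a constant, and as far as I know DRBD pushes replication over a
single TCP connection per resource anyway, so all replication traffic
may land on one 1 Gbit link no matter how wide the bond is. Here is a
short Python sketch of that reasoning; the MAC addresses and the
throughput figures are made-up assumptions for illustration, not
measurements:

#!/usr/bin/env python
# Sketch: why balance-xor may not spread DRBD traffic across slaves.
# The default layer2 transmit hash in the Linux bonding driver is,
# roughly, (last byte of src MAC XOR last byte of dst MAC) modulo the
# slave count, so all frames between one pair of hosts pick the same
# slave.

def l2_xmit_hash(src_mac, dst_mac, slave_count):
    """Approximation of the bonding driver's default layer2 hash."""
    src = int(src_mac.split(":")[-1], 16)
    dst = int(dst_mac.split(":")[-1], 16)
    return (src ^ dst) % slave_count

# Hypothetical MACs for the two storage nodes (made up).
node_a = "00:16:3e:00:00:0a"
node_b = "00:16:3e:00:00:0b"

for slaves in (2, 4):
    slave = l2_xmit_hash(node_a, node_b, slaves)
    print("%d-slave bond: every frame A->B uses slave %d"
          % (slaves, slave))

# Back-of-envelope bandwidth; both figures are rough assumptions.
gig_e_link = 1.0        # usable Gbit/s per gig-e link, optimistic
raid5_seq  = 3.2        # Gbit/s, i.e. ~400 MB/s sequential on the
                        # 8-drive SATA RAID-5 (assumed, not measured)
print("single gig-e link:  %.1f Gbit/s" % gig_e_link)
print("array (assumed):    %.1f Gbit/s" % raid5_seq)
# => if the single flow stays on one slave, one gig-e link is the
#    replication bottleneck regardless of the bond width.

If bonding is pursued anyway, balance-rr (round-robin) is the mode
that does stripe a single connection across slaves, at the cost of
possible packet reordering; worth reading up on before buying the
quad-port cards.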