Jul 12 2011 Original Post: (see also updates, appended at the end)
Create your own fast and resilient VMware whitebox, for both "production" and "lab" use.
I'm happy to finally be publishing this summary on July 12, 2011...the day vSphere 5 was announced. Back in April, my goals for "vZilla" were straightforward, or so I thought. I naively assumed the preparatory research would be simple, but I quickly realized I was pretty much alone in what I was attempting: Googling Z68 AND ESX turned up proverbial crickets. So I began posting some thoughts over here, at one of my favorite sites, The Home Server Show Forums.
I set out to build a modern, Z68-motherboard-based VMware ESXi system: one that is both very fast and relatively efficient, since I plan to leave this box running 24x7. A roughly $2K budget was set, with the $8,000 works of art/masterpieces seen here admittedly far out of my range. I also wanted to avoid the complex hackery often required to get ESXi (4.1U1 or later) to install, with mass storage devices supported for storage pools either natively (best) or via the VMDirectPath / VT-d feature (a VM assigned a particular PCI device).

This server could also become a full-time NAS for me, perhaps using storage appliances with NFS or iSCSI. It would give me much higher capacity at a more reasonable price than most NAS devices I've looked at, with a lot more utility and speed. Ideally, I also wanted Intel RST (Rapid Storage Technology - SSD caching) to work in passthru mode, knowing the VMware HCL doesn't support any of the newer Intel ICH RAID stuff, and that RST is actually software caching by the host OS, even less likely to get VMware support (no software RAID solutions are supported by ESX/ESXi that I know of). If passthru mode didn't work out, given how complex the mobo's built-in RST is, I'd be happy with at least having the Z68's Marvell and/or Intel AHCI attached drives usable by VMware ESXi for the storage pool, perhaps using drive pooling of some variety someday.

But for the primary storage pool, I required a modern RAID controller: one that allows >2TB drives, supports dynamic volume growth to give me flexibility for online upgrades in the future, and offers SSD caching for potential read speed gains.
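For devices ESXi doesn't offer for VMDirectPath by default, there's a documented override file on the host. Here's a minimal sketch of what such an entry looks like; the vendor/device IDs shown are purely an example (an ASMedia USB 3.0 controller), not necessarily the ones on this board, so look yours up first with `lspci -n` or `esxcli hardware pci list` in the ESXi shell:

```
# /etc/vmware/passthru.map -- sketch of a forced passthrough entry.
# Columns: vendor-id  device-id  resetMethod  fptShareable
# Example IDs only (ASMedia ASM1042 USB 3.0); substitute your own.
1b21  1042  d3d0  default
```

A reboot of the host is needed before the device shows up as eligible for passthrough in the vSphere Client.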
With plans to also duplicate key data offsite automatically, my shopping emphasis leaned a bit more toward raw speed, having been bitten by slow, low-end NAS devices before. For my fastest storage needs, I'll go with an SSD anyway (for my planned 25-user VM running "Windows Storage Server 2008 R2 Essentials," for example), with the backup pool living on the redundant/resilient RAID array.

Well, the good news is that my vision has become reality. But not without a great deal of heartache (a bad motherboard), pain (a bricked RAID adapter, which I later resurrected), and suffering (for a week, I wasn't sure I'd get any combination of mobo/RAID working well, having encountered numerous random issues with nasty boot errors). So, after 4 motherboards, 2 RAID controllers, and numerous drive configurations, I've settled on the RAID controller/mobo combination that works, and works well. I went through ASUS, MSI, Gigabyte, and finally ASRock motherboards, not because I like restocking fees, but because the only board that really worked with the LSI controller and VMDirectPath also turned out to be the last board I tried.

I get to use all the special VMware features I wanted, ones that normally only Xeon-based server motherboards handle, like VMDirectPath (VT-d), which may allow me to pass USB 3.0 devices straight through to my storage server virtual machine; that could be handy for rapid copies to an external USB3 enclosure for offsite encrypted storage. And with vSphere, I'll get nested VMs, potentially even handling Hyper-V learning/testing!

Drum roll... Yes, the secret sauce for this "vZilla" beast of a personal pet project (aka "VMwareZilla" or "vSphereZilla"), to run VMware ESXi with a Home Server backup VM left running full-time, is the ASRock Z68...
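For the nested-VM part, the ESXi 5.0-era recipe (as I understand it; verify against current VMware guidance) is a global flag on the host plus a tweak in the guest hypervisor VM's .vmx file, roughly like this:

```
# /etc/vmware/config on the ESXi 5.0 host -- allow nested hardware
# virtualization (VHV), so a guest can itself run a hypervisor:
vhv.allow = "TRUE"

# In the nested hypervisor VM's .vmx file -- hide VMware's hypervisor
# from the guest so Hyper-V will agree to install:
hypervisor.cpuid.v0 = "FALSE"
```

Both changes take effect only after the host (for the first) and the VM (for the second) are restarted.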
Here are the major components:
USB 3 for passthru
7 x 2TB drives (I already owned)
1 Corsair Force Series GT CSSD-F120GBGT-BK 2.5" 120GB SATA III Internal SSD
Yes, this big case has 10 internal 3.5" drive bays with trays. The SSD for hardware RAID caching was chosen for speed, price, and built-in automatic garbage collection, since RAID and ESXi don't support TRIM.

So, now do I have your attention? I plan to beef up this new site with more build details, photos, RAID level test matrix spreadsheets, and a complete parts list in the very near future, so please come back soon. The long saga of how it all went down these past 2 months is quite drawn out, but it will lead to lots of content, and I learned a LOT along the way that I look forward to sharing, including how important the 8GB BIOS memory is (for RAID adapters to work with BIOS, since they're not UEFI yet), the importance of testing RAID levels and tweaking their settings (which dramatically affect throughput), and the importance of researching UPS types that work with the new efficient power supplies.

And now that VMware has announced that vSphere 5 will have >2TB volume support, I'm well equipped to create one large RAID array pool up front, rather than splitting it into individual small 2TB RAID arrays, so VMware can see the storage properly, ready to be divvied up among the many VMs I'm planning, starting with my PC backups by Windows Home Server 2011/Windows Storage Server 2008 R2 Essentials. More fun yet to come!
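Carving that single large volume into a VMFS-5 datastore can be done from the ESXi shell. This is only a sketch under stated assumptions: the device path and sector numbers below are placeholders (not from my actual build), and the long GUID is the standard VMFS partition type GUID; check your own device's numbers with `partedUtil getUsableSectors` before copying anything:

```
# ESXi 5 shell sketch -- one big VMFS-5 datastore on the RAID volume.
ls /vmfs/devices/disks/                       # find the naa.* device for the RAID volume
partedUtil mklabel /vmfs/devices/disks/naa.XXXX gpt
partedUtil getUsableSectors /vmfs/devices/disks/naa.XXXX
# Create one partition spanning the usable range (example sectors shown):
partedUtil setptbl /vmfs/devices/disks/naa.XXXX gpt \
  "1 2048 23437770718 AA31E02A400F11DB9590000C2911D1B8 0"
# Format it VMFS-5, which raised the extent limit well past the old 2TB ceiling:
vmkfstools -C vmfs5 -S vZilla-datastore /vmfs/devices/disks/naa.XXXX:1
```

The vSphere Client's Add Storage wizard does the same thing with fewer chances to typo a sector number, so the GUI is the safer route for most.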
Here's a new Fractal Design USB 3.0 upgrade kit to upgrade the built in ports on this tower case, nice!
And a really detailed Fractal Design XL review with many photos.
Here's how the install process looks.
Here's a more comprehensive parts list, hoping to get a better/prettier table built soon (any suggestions for tools to make that easier?)
Italics indicate a future planned purchase
Aug 25 2011 Update: Now running ESXi 5.0!
Motherboard chosen: ASRock Fatal1ty Z68 Professional Gen3
(eventual "Ivy Bridge" compatibility, PCI Express 3.0, VMDirectPath support, and lots of SATA ports)
Oct 01 2011 Update: Originally titled:
"vZilla is born!" Z68/Core i7/LSI9265-8i running VMware ESXi
This is a whitebox/all-in-one lab box, a 24×7 virtualization lab really, but it also does "production" backup duties at night. It handles all my system backups, and meets my VMware ESXi and Microsoft Hyper-V self-training needs. It inspired me to create this blog, to share things I've learned along the long and winding road to getting this thing built.
Nov 14 2011 Update: Finalized parts list completed, planning a "vZilla grows up" blog post soon! I also need to take a new set of photos, to reflect the slight changes in hardware, mostly pertaining to the front drive bays, which don't really affect the system build. Scroll up to see the detailed build list.
Dec 11 2011 Update: I should probably note that the below table doesn’t indicate the price of any 3.5″ drives for actual storage, which I already owned, re-used from previous projects. Otherwise, this list is a complete parts list, to build an identical or similar machine, should you wish to do so.
Jan 09 2012 Update: vZilla completed!
Feb 09 2012 Update: DIMM change
4 x 4GB = 16GB of Corsair DIMMs weren't on ASRock's memory table
4 x 8GB = 32GB of G.SKILL memory added, more appropriate for ESXi, currently $269 USD, listed for compatibility with ASRock on G.SKILL's site here, for sale on newegg.com here, and discussed at length here:
Aug 20 2012 Update:
Still going strong well over a year later, very pleased with the overall experience of owning this mobo/CPU combo, heavily abusing this system with ESXi 5.0 (now with Windows 8 backups and Windows Server 2012 betas). No compelling reason to jump to Ivy Bridge (too much cost for too little gain). See also:
With 10,000 unique visitors to this particular blog post, and over 24,000 page views, TinkerTry.com/vzilla is among my most popular posts ever. Daily views are actually growing even now, after well over a year. And >95% of the system components are still available.
Jul 24 2014 Update:
This system is still serving my needs very well. There is still nothing new and affordable out there that allows >32GB of RAM, so I'll stick with this system as-is for a while yet.