Thoughts about home labs and combining AMD Threadripper with VMware

In the last couple of weeks I have been busy building a new home lab environment. In this blog I want to go over the hardware and software I chose for the lab and why, and also why you should consider building your own lab. I hope this post helps you if you are looking for a new home lab environment.

First off, why is it important to have a home lab?

Well, for me it helps with writing articles. If I write about a new technology, I want to have tested it in my lab environment first. I also create a lot of scripts, and scripting in a customer's production environment can be difficult. When you use your own AD for scripts, you roughly know what the result of a script should be. But when I built my very first home lab, it was simply because I wanted to do everything myself. Back then I worked as a Citrix administrator, and someone else had already set up the AD, DNS, DHCP, file server, SQL and the complete Citrix environment. Doing all of this myself for the first time in my own lab, using Microsoft and Citrix course material, really helped me understand the environment and, as a bonus, helped me get certified. That last reason is why I think everyone who works as an IT administrator should have their own lab. Of course, a lab these days can be a high-spec laptop or some Azure credits. You don't need a complete server in your home anymore!

Why did I buy new hardware for the lab instead of using Azure?

For me, the lab is something that runs 24/7. I use the VDI system in the lab as my own workspace, and the DHCP and DNS servers also serve my whole house, Wi-Fi included. So I calculated what running 24/7 in Azure would cost, and the budget I had for the new lab, $2,000, would have been spent on Azure within 6 months, even with a minimal environment. So I decided to buy hardware again.
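As a quick sketch of that calculation (the monthly Azure figure below is just my own rough estimate for an always-on minimal lab, not an official price):

```powershell
# Back-of-envelope: one-off hardware budget vs. running a minimal lab 24/7 in Azure.
# The monthly Azure figure is my rough estimate (assumption); plug in your own numbers.
$hardwareBudget = 2000   # one-off hardware spend in dollars
$azurePerMonth  = 333    # estimated monthly cost of a minimal always-on lab (assumption)

$breakEvenMonths = [math]::Round($hardwareBudget / $azurePerMonth, 1)
Write-Host "The hardware budget equals roughly $breakEvenMonths months of Azure runtime."
```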

The hardware, and why this hardware?

So the hardware I bought is AMD workstation hardware, and it consists of the following:

AMD Ryzen Threadripper 1920X (12c/24t, 3.5 GHz) CPU

The CPU I bought is from AMD. This is the first time I have bought an AMD CPU for my lab environment (my previous labs were on Intel Xeon). So why AMD? Well, Threadripper delivers an incredible number of cores (up to 32) and threads (up to 64), and it does this at a high clock speed (3 GHz+). After reading on forums that VMware vSphere works with Ryzen and Threadripper CPUs from version 6.5 U1 onwards, I leaned towards it. What pulled me over the line was the fact that AMD had just launched their second-generation Threadripper CPUs, which slashed the prices of the first-generation CPUs in half. So I bought the first-gen 1920X with 12 cores and 24 threads at 3.5 GHz. My old lab environment had 2 x 6-core Xeons (24 threads total) at 2.2 GHz, so this is a nice speed bump. It also gives me a great upgrade path, because the second-generation Threadrippers use the same socket as the first gen and are supported by the motherboard. If needed, I can upgrade to the 2990WX, a 32-core, 64-thread beast clocked at 3 GHz!

ASRock X399 Taichi motherboard

I chose the ASRock X399 Taichi for a few reasons. The first is that it's the most workstation-like motherboard in the X399 line-up; most X399 boards are focused on gaming. The Taichi comes with two Intel NICs, which is great. It also has 8 SATA ports, 3 M.2 slots and a built-in RAID controller. Another reason was that I found out on forums (Overclockers.uk) that this motherboard works with vSphere 6.7 U1 without adding any drivers. So this board is plug and play for virtualization.

Corsair Vengeance LPX DDR4-2666 RAM, 2 x 64 GB kits (8 x 16 GB, 128 GB total)

For the RAM I chose the Vengeance LPX kit because this memory is competitively priced and is half-height, which gives enough clearance for the CPU cooler. I went with 8 sticks of 16 GB (two quad-channel kits). This gives me 128 GB of RAM, which is the maximum supported by the motherboard. That limit is important to keep in mind.

GeForce GT 710

Because AMD Threadripper doesn't come with a built-in GPU, I had to add a graphics card. For this I chose the cheapest one available at the moment, the GeForce GT 710.

Samsung 970 EVO Plus 1TB

NVMe drives are great for virtualization because they are super fast, and the 970 EVO Plus is one of the fastest PCIe 3.0 drives out there, with more than 3,000 MB/s of throughput. I chose to buy just one drive. I'm going to use it for caching disks and Citrix MCS. For the file server and so on I will be using my Synology NAS (with redundancy). If you do need redundancy on the local storage, buy two drives.

Cooler Master V750 Gold PSU

The Cooler Master V750 Gold is a solid, well-tested power supply that is compatible with Threadripper because it has an 8-pin CPU power connector. Its 750 watts is plenty of capacity for this system.

Cooler Master H500 case

The Cooler Master H500 is a big, roomy case with lots of room for cable management. There are two big 200 mm fans in the front and a 120 mm fan in the back, which gives it enough airflow. More importantly, all the fans sit behind either a dust filter or fine mesh, which keeps the dust out. Because the lab runs 24/7, you don't want it to clog up with dust.

Cooler Master Wraith Ripper cooler

The AMD Threadripper CPU doesn't come with a CPU cooler, but luckily there are a few options to choose from. My choice is the Wraith Ripper, for three reasons. One, it is an air cooler; in a workstation you generally want to skip water coolers because they are not as reliable as air coolers over the long term. Two, it was developed together with AMD and is the official cooler for the 2nd-gen Threadrippers. And three, it has the easiest installation of all the coolers: you just need to tighten four screws.

So that is the complete hardware list and my reasoning behind this setup. The total price was 1,850 euros, around 2,000 dollars, which was exactly my budget. Here you see the complete system running:

And a nice compliment on the setup from AMD.

The software

Well, the software was a lot easier to pick than the hardware. As a VMware vExpert and a fan of PowerCLI, I of course chose vSphere (ESXi) 6.7 U2 with vCenter 6.7. Both worked perfectly out of the box on this system; I didn't need to add any extra drivers. I created an installation stick with Rufus and installed vSphere on an extra USB drive plugged into the back. As soon as the system was installed and rebooted, I saw both network adapters, the NVMe drive, the SATA ports and so on. Mounting my Synology VM LUN on the new home lab was easy, and transferring the VMs was done in a day. I did take extra time to create new domain controllers and file servers, going from 2012 R2 to 2019 Core. I also decided to build a completely new Citrix environment and start using MCS instead of PVS.
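For reference, below is a rough PowerCLI sketch of the post-install checks and the storage steps described above. The host name, credentials, Synology address and datastore/VM paths are all placeholders for my environment, and the iSCSI part assumes the Synology LUN is presented over software iSCSI; adjust it to however your NAS exposes storage.

```powershell
# Connect to the freshly installed host (name and credentials are placeholders).
Connect-VIServer -Server "esxi-lab.local" -User "root" -Password "VMware1!"

$vmhost = Get-VMHost -Name "esxi-lab.local"

# Check that the onboard Intel NICs and the storage adapters were detected
# without any extra drivers.
Get-VMHostNetworkAdapter -VMHost $vmhost -Physical | Select-Object Name, Mac, BitRatePerSec
Get-VMHostHba -VMHost $vmhost | Select-Object Device, Type, Status

# Enable software iSCSI, point it at the Synology (address is a placeholder),
# then rescan so the existing VM LUN shows up.
Get-VMHostStorage -VMHost $vmhost | Set-VMHostStorage -SoftwareIScsiEnabled $true
$hba = Get-VMHostHba -VMHost $vmhost -Type iScsi
New-IScsiHbaTarget -IScsiHba $hba -Address "192.168.1.50" -Type Send
Get-VMHostStorage -VMHost $vmhost -RescanAllHba | Out-Null

# Register an existing VM from the mounted datastore (datastore and path are examples).
New-VM -VMHost $vmhost -VMFilePath "[SynologyLUN] dc01/dc01.vmx"
```

The first two Get- commands right after installation are an easy way to confirm that the board really is plug and play for vSphere.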

Power consumption

Power is not cheap, and in the Netherlands, where I live, the government taxes power heavily. So leaving a system on 24/7 can be expensive. How is the power consumption of the lab? Well, the system with all my VMs turned on uses around 140 watts. Together with my Synology and Sophos UTM firewall, the complete setup uses around 180 watts. This may seem like a lot, but my last lab environment used around 800 watts. That means a saving of 620 watts, and the old hardware also gave off a lot of heat, which I cooled with fans costing an extra 100 watts. So in the summer the saving is 720 watts. I can't wait for my next power bill, I will be smiling 😊
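To put that in euros, here is a rough yearly calculation. The tariff is just an assumed example; Dutch power prices vary per contract.

```powershell
# Rough yearly power cost of the old vs. new lab, both running 24/7.
$tariff      = 0.22   # euro per kWh (assumed example)
$newLabWatts = 180    # Threadripper host + Synology + Sophos UTM
$oldLabWatts = 900    # old servers (~800 W) plus ~100 W of cooling fans in summer

$newCost = [math]::Round($newLabWatts * 24 * 365 / 1000 * $tariff)
$oldCost = [math]::Round($oldLabWatts * 24 * 365 / 1000 * $tariff)
Write-Host "New lab: ~$newCost euro/year, old lab: ~$oldCost euro/year, saving ~$($oldCost - $newCost) euro/year."
```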

Why not just buy old server hardware?

Another way to go with home labs is to buy old server hardware. If you want 24 threads and 128 GB of RAM and you buy a second-hand DL380 G8, you only have to spend around 1,000 euros. Or maybe you can pick up an old server from work for cheap. But as someone who had server hardware in his house for almost 10 years, I cannot recommend it. Server hardware is stable and runs really well 24/7, but it's loud and, as you can see under Power consumption, it needs a lot more power, which can be really expensive in the long term. And the last thing is of course the speed difference. You can easily buy an AMD Threadripper or Intel i9 with clock speeds over 3 GHz and NVMe support, but finding a cheap Xeon above 3.0 GHz is difficult. All that being said, if you can get a server really cheap and you only turn it on when testing or developing, it can be a good option.

Alternatives

I have built an AMD Threadripper workstation as a vSphere server. But what are some good alternatives? Of course you can build the same kind of workstation with an Intel Core i9 or Xeon W CPU. But you could also check out the new Intel i7 NUC systems like the NUC8i7BEH2. It's around 500 dollars and it comes with a quad-core, 8-thread i7 at 2.7 GHz, and it uses even less power. One big drawback of the NUC is RAM support: it can only handle 32 GB, which is far too little for a complete AD, DHCP, DNS, FS and Citrix system. But they are cheap, so you could buy multiple and use a Synology as shared storage between them. Another option is the Supermicro SuperServer line, like the E200-8D. It's just a little bit bigger than the NUC and it's also power efficient, and it comes with a 6-core, 12-thread Xeon at 1.9 GHz with support for up to 128 GB of RAM.

Do you have any good home lab tips? Please leave them in the comments.

I hope this was informative. For questions or comments you can always leave a reaction in the comment section or contact me: