NSX Home Lab Hardware Setup

Hi Everyone, I know it has been a long time since I have posted (has it really been 6 months?!).

I promise I will do my best to post on a more regular basis, especially since I have good hardware to play around with now.

A little update before we get started: I am loving my new role as a Systems Engineer for NSX at VMware and couldn’t be happier where I am. Over the last 6 months I have had to transition from a deeply technical, daily-grind position in tech support to a more customer-focused, business-solution position. It has been a blast solving customers’ pain points around networking and security with our NSX platform; there is no other company that can do what we do with NSX ☺ Anyway, let’s get to the point of this post: my home lab buildout for NSX!

Over the last several months I was researching various hardware to use for my ESXi hosts. I looked at the Dell C6100, HP Proliant, Dell PowerEdge, and the Dell VRTX. After extensive research I finally settled on the following setup.

2x Dell PowerEdge R710

  • 2x Intel Xeon Quad Core E5540 2.53GHz
  • 64GB (8x8GB) 1333MHz 240-Pin ECC Memory DIMMs
  • Onboard Quad Gigabit 1000 Pro – 4 Ports Total
  • 2x1TB 7.2K SATA Hard Drive



1x Drobo 5n NAS

  • 2x 3TB WD Red
  • Crucial MX200 250GB mSATA Internal Solid State Drive (For SSD caching)



1x Alcatel L3 switch OS6855-P14

  • 12x 10/100/1000 ports (4 PoE) and 2 SFP uplinks
  • AOS

The main reason I went with the Dell R710s was their support for ESXi 6.0 and their quad-port NICs, and it doesn’t hurt that they are quiet; in fact, they aren’t much louder than my desktop. I passed on the HP servers for no particular reason, and on the C6100 because I have heard that thing is so loud you can hear it a couple of rooms away. The Drobo NAS was a recommendation from a colleague. I opted to try it over a Synology or QNAP based on that recommendation and on price: I wanted SSD caching, and a comparable Synology setup seemed to cost around $300 more than the Drobo. This thing is pretty slick, as you can mix and match drives and even upgrade an individual drive on the fly. As for the Alcatel switch, one of my colleagues, Tom Rumland, provided this L3 switch in exchange for a six-pack of beer ☺ how could I pass that up?

Design Considerations

I decided not to nest my ESXi hosts on the R710s and instead run them as bare-metal hypervisors. I went this route because I wanted good performance and a more real-world lab, and because of various VLAN considerations when nesting. As you can see, I only have one cluster that combines Management, Compute, and Edge workloads, so both hosts will be prepared for the DFW and VXLAN.

Since I had quad-port NICs, I had enough flexibility to split my traffic into 4 VLANs: Management, vMotion, storage, and VXLAN. Each VLAN has its own active vmnic, with the remaining vmnics available as standbys for redundancy, since I ended up trunking all of the switch interfaces.
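As a rough sketch, that per-VLAN port-group layout could be built on a standard vSwitch with esxcli. The VLAN IDs, port-group names, and vmnic assignments below are hypothetical placeholders, not my actual values:

```shell
# Create one port group per traffic type on vSwitch0 (VLAN IDs are examples)
esxcli network vswitch standard portgroup add --portgroup-name=Mgmt    --vswitch-name=vSwitch0
esxcli network vswitch standard portgroup set --portgroup-name=Mgmt    --vlan-id=10

esxcli network vswitch standard portgroup add --portgroup-name=vMotion --vswitch-name=vSwitch0
esxcli network vswitch standard portgroup set --portgroup-name=vMotion --vlan-id=20

esxcli network vswitch standard portgroup add --portgroup-name=Storage --vswitch-name=vSwitch0
esxcli network vswitch standard portgroup set --portgroup-name=Storage --vlan-id=30

# Pin each port group to its own active vmnic, leaving the others as standbys
esxcli network vswitch standard portgroup policy failover set \
  --portgroup-name=vMotion --active-uplinks=vmnic1 --standby-uplinks=vmnic0,vmnic2,vmnic3
```

Since all four physical switch ports are trunked, each port group just tags its own VLAN. Note that the VXLAN/VTEP vmknic is created by NSX host preparation on the distributed switch, not by hand like this.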
As far as the topology, I have started out very basic: I am running one Edge device and one DLR, peered with OSPF. I am going to write more posts in the future on configuring ECMP, multitenancy, multisite/cross-vCenter NSX, and even DR topologies, but I wanted to keep it simple for now.
The finished product is below! I know it’s not pretty right now; I still need to order a rack solution and haven’t decided what I want. So far everything seems to be running awesome! I have a ton of available RAM for future growth, which I am not used to since I was always running on a desktop or an internal cloud environment. If anyone has questions about my setup, wants more details, or needs assistance setting up their own lab, please feel free to comment below!

Posted by:

Sean Whitney


  1. Heath Cejka - August 7, 2016 - 9:51 pm

    Hey Sean, cool post! I’m really looking forward to your future posts, and congrats again on the new position! Just wondering, where did you purchase most of the gear? Amazon?

    • Sean Whitney - August 8, 2016 - 7:25 am

      Hi Heath, thank you very much! I ended up buying the servers off of eBay, the Drobo off Newegg, and the remaining stuff like the mSATA SSD, power cords, Ethernet cables, etc. off Amazon. Amazon and Newegg seem to have the same prices on most of it.

  2. Stephen Arogbonlo - September 12, 2016 - 10:45 pm

    I have a Dell Precision T7500 workstation with ESXi 6 installed as bare metal.
    Below is my desired setup:
    Compute cluster A: 2 ESXi hosts
    Compute cluster B: 2 ESXi hosts
    Compute cluster C: 2 ESXi hosts
    Edge-mgmt cluster: 3 ESXi hosts

    Compute cluster A subnet: –
    Compute cluster B subnet: –
    Compute cluster C subnet: –
    Edge-Mgmt cluster subnet: -
    Management subnet (ESXi, vCenter):

    When I set up my environment as above with 3 different subnets, I was unable to make ESXi, vCenter, and the ESXi hosts communicate with each other. I tried using a Cisco 1841 router and a 3560 with VLANs 100 and 200, and my environment still did not work. vSwitch0 did not work the way I read about in books.

    Could you kindly assist? I am trying to master VMware NSX for subsequent integration with Nuage and OpenStack.

    Kindly assist.
    I would also appreciate details of how you set up your vSwitches (VSS and VDS), and a copy of your ALU OmniSwitch configuration.

    Awaiting your timely response.

  3. Stephen Arogbonlo - September 12, 2016 - 11:26 pm

    Hello Sean,

    Please post a low-level design of your setup. I need to understand the details to enable me to replicate it in my home lab.

  4. Justin Gardner - November 7, 2016 - 11:45 am

    Thanks for the home lab posting! I am also an SE at VMW (you can find me on Socialcast) and am finally looking to build a dedicated home lab. I noticed that you are running a VM labeled “Plex”. Is that really your media server and, if so, how is it working out for you?


Leave A Comment

Your email address will not be published. Required fields are marked (required):

Back to Top