Artificial Intelligence Computing Leadership from NVIDIA
Installing A GPU Card Into One Node Of A Cluster
Wondering if this is a supported configuration, and/or if anyone has done this in a VMware environment. I have a 4-node ESXi cluster and I want to install GPU cards into two of the nodes, then set up rules to prevent GPU-enabled VMs from vMotioning to the hosts that don't have GPU cards.

Thoughts?

#1
Posted 05/08/2019 09:05 PM   
I assume you're thinking of DRS rather than vMotion. When manually vMotioning a VM, you'd obviously specify the destination first, since you'd know which hosts have GPUs installed. DRS, on the other hand, vMotions automatically based on metrics and policies.

DRS and vGPU do not currently work together (see Section 6.28: https://docs.nvidia.com/grid/latest/grid-vgpu-release-notes-vmware-vsphere/index.html ), but when they do it will be awesome!

Regards

Ben

#2
Posted 05/09/2019 08:55 AM   
Ben,

Thanks for the reply! I'm more concerned about having two nodes with GPU cards and two without in the same cluster. Does it pose problems? This is probably more of a VMware question, but I can't find anything about it on the VMware site or forums.

Rick.

#3
Posted 05/09/2019 03:12 PM   
Hi Rick

Nope, it won't cause any problems at all. The only real difference is that the NVIDIA .vib will be installed on those hosts that have GPUs (assuming you're using vGPU and not Passthrough). I run various platforms that have different GPUs in different hosts (P4s in some hosts and V100s in others within the same Resource Pool), and I just keep those specific GPU-enabled VMs running on those specific hosts. As long as you have at least 2 hosts with the same hardware spec, you're covered for resilience as well.
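If it helps, here's a quick sanity check (my own sketch, not something from the release notes) you can run over SSH on each host: GPU-equipped hosts should list the NVIDIA vGPU Manager VIB and see the physical GPU, while the non-GPU hosts will simply return nothing.

```shell
# On each ESXi host (via SSH): list installed VIBs and filter for
# NVIDIA -- hosts without the vGPU Manager VIB print no matching lines.
esxcli software vib list | grep -i nvidia

# On a GPU-equipped host, confirm the host driver can see the
# physical GPU(s); this fails on hosts without the VIB/GPU.
nvidia-smi
```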

Regards

Ben

#4
Posted 05/09/2019 03:28 PM   
Excellent. That's what I'm looking for. I simply want to GPU-enable a set of VMs that will be running ArcGIS on two of the four hosts. The rest of the VMs will not be GPU-enabled. Make sense?

#5
Posted 05/09/2019 03:52 PM   
Absolutely, and that'll work without issue.

Which GPU are you using?

Regards

Ben

#6
Posted 05/09/2019 05:39 PM   