vDGA GRID with VMware ESXi (no Horizon View)
I have a GRID K1 sitting in a Dell R720.

I have ESXi 5.5 installed and I'd like to configure passthrough to the VMs.
I've installed the .vib drivers.
I've configured passthrough in the vCenter web UI.
I've attached the PCI device to the VMs.

The VM can see the card and query it, but the display falls back to the native VMware driver.

Are there any more steps? The only deployment instructions I can find from NVIDIA all refer to Horizon View, but I just want to attach these GPUs to standalone, unbrokered VMs. Is that supported?
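
[Editor's note] A quick host-side sanity check for the steps above is to ask the vSphere API whether the K1's GPUs are actually flagged as passthrough-enabled and passthrough-active. The following is a minimal pyVmomi sketch, not from this thread: the host name, credentials, and the NVIDIA vendor-ID filter are assumptions you would replace for your environment.

```python
# Minimal pyVmomi sketch (assumed vCenter/ESXi host and credentials):
# list NVIDIA PCI devices on each host and their passthrough state.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

HOST = "vcenter.example.local"          # assumption: your vCenter or ESXi host
USER = "administrator@vsphere.local"    # assumption
PASS = "secret"                         # assumption
NVIDIA_VENDOR_ID = 0x10DE               # PCI vendor ID for NVIDIA

ctx = ssl._create_unverified_context()  # lab only; verify certificates in production
si = SmartConnect(host=HOST, user=USER, pwd=PASS, sslContext=ctx)
try:
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    for host in view.view:
        # Map PCI id -> passthrough flags reported by the host
        passthru = {p.id: p for p in (host.config.pciPassthruInfo or [])}
        for dev in host.hardware.pciDevice:
            if dev.vendorId & 0xFFFF != NVIDIA_VENDOR_ID:
                continue
            info = passthru.get(dev.id)
            print(host.name, dev.id, dev.deviceName,
                  "passthruEnabled=%s" % getattr(info, "passthruEnabled", None),
                  "passthruActive=%s" % getattr(info, "passthruActive", None))
    view.DestroyView()
finally:
    Disconnect(si)
```

If a GPU shows enabled but not active, the host usually still needs a reboot after toggling passthrough in the web UI.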

#1
Posted 05/27/2015 11:46 PM   
Hello notrobb,

Please follow the installation instructions at http://www.vmware.com/files/pdf/techpaper/vmware-horizon-view-graphics-acceleration-deployment.pdf and note that no VIB driver is required for vDGA (GPU passthrough). There is a VIB driver for vSGA and vGPU, but none is necessary for vDGA.

Let me know if that helps.

Thanks,
Erik Bohnhorst

NVIDIA GRID Senior Solution Architect
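
[Editor's note] To see which VIBs are actually installed on the host, you can query esxcli from the ESXi shell, which ships its own Python interpreter. This is a minimal sketch under that assumption (SSH/shell access to the host); the exact VIB name printed will vary by driver release.

```python
# Minimal sketch, run in the ESXi shell: list installed VIBs and flag NVIDIA ones.
# vDGA itself needs no NVIDIA VIB -- only vSGA and vGPU do.
import subprocess

out = subprocess.check_output(["esxcli", "software", "vib", "list"]).decode()
nvidia_vibs = [line.strip() for line in out.splitlines() if "nvidia" in line.lower()]
if nvidia_vibs:
    print("NVIDIA VIB(s) present (needed for vSGA/vGPU, not for vDGA):")
    for line in nvidia_vibs:
        print(" ", line)
else:
    print("No NVIDIA VIB installed -- fine for plain vDGA passthrough.")
```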

#2
Posted 05/28/2015 03:28 PM   
Hi Erik,

Thanks for your reply. We installed the .vib drivers because some testers want to switch back and forth between vSGA, vDGA and vGPU. Should I remove them for vDGA to work?

Also, just to confirm, the Horizon View docs should work if I'm not using Horizon View?

#3
Posted 05/28/2015 04:33 PM   
Hi notrobb,

What do you mean by "I'm not using Horizon View"?

Please uninstall all VIB files from ESXi. Once vDGA works, we can get those running as well.

Thanks,
Erik

NVIDIA GRID Senior Solution Architect
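
[Editor's note] Removal is also an esxcli operation from the ESXi shell. A minimal sketch follows; the VIB name shown here is a placeholder assumption, so substitute the exact name reported by "esxcli software vib list" on your host, and expect a host reboot afterwards.

```python
# Minimal sketch, run in the ESXi shell: remove the NVIDIA VIB by name.
# VIB_NAME is a placeholder -- use the name from "esxcli software vib list".
import subprocess

VIB_NAME = "NVIDIA-VMware_ESXi_5.5_Host_Driver"  # assumption: replace with your VIB name

# Dry run first so nothing changes until you are happy with the output.
print(subprocess.check_output(
    ["esxcli", "software", "vib", "remove", "--dry-run", "-n", VIB_NAME]).decode())

# Uncomment to actually remove the VIB, then reboot the host.
# print(subprocess.check_output(
#     ["esxcli", "software", "vib", "remove", "-n", VIB_NAME]).decode())
```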

#4
Posted 05/29/2015 10:42 AM   
Hi Erik,

I mean that I am not installing the VMware Horizon View Agent on the VMs and I am not connecting them to a Horizon View Connection Server. I simply have a VM running on ESXi and am hoping to use vDGA passthrough.

For now we're connecting via the console, and we'll move on to other means later. Is this supported? All the documentation about using vDGA and passthrough seems to explicitly assume VMware Horizon View (which we are not using).

I'll uninstall the .vibs and see if that helps.

#5
Posted 06/01/2015 04:13 PM   
You cannot use the console once a GPU is assigned to a VM. The vSphere console is currently not compatible with GPU-enabled VMs and has issues with mouse interaction and input. We don't know VMware's plans for updating the console.

RDP also won't give you an accelerated connection; in particular, DirectX is completely disabled. We recommend installing another remoting solution, such as Horizon View Direct Connection.

Jason Southern, Regional Lead for ProVis Sales - EMEA: NVIDIA Ltd.
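
[Editor's note] Because the console only ever shows the VMware SVGA adapter, a useful guest-side check is to ask the NVIDIA driver directly whether the passed-through GPU is present. This is a minimal sketch assuming the NVIDIA guest driver is installed and nvidia-smi is on the PATH (on Windows it typically lives under the NVIDIA Corporation\NVSMI folder).

```python
# Minimal guest-side sketch: confirm the passed-through GPU is active in the guest,
# independent of what the vSphere console shows. Assumes nvidia-smi is on the PATH.
import subprocess

try:
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name,driver_version,pci.bus_id",
         "--format=csv,noheader"]).decode()
    print("GPU visible to the guest driver:")
    print(out.strip())
except (OSError, subprocess.CalledProcessError) as exc:
    print("nvidia-smi failed -- the GPU/driver is not active in this guest:", exc)
```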

#6
Posted 06/01/2015 04:22 PM   
[quote=""]We installed the .vib drivers because some testers want to switch back and forth between vSGA, vDGA and vGPU. Should I remove them for vDGA to work? [/quote] You cannot use vGPU and vSGA on the same physical host. They are mutually exclusive at this time.
notrobb said: "We installed the .vib drivers because some testers want to switch back and forth between vSGA, vDGA and vGPU. Should I remove them for vDGA to work?"


You cannot use vGPU and vSGA on the same physical host. They are mutually exclusive at this time.

Jason Southern, Regional Lead for ProVis Sales - EMEA: NVIDIA Ltd.
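
[Editor's note] You can see which graphics mode each GPU on a host is actually in through the vSphere API. The sketch below is a minimal pyVmomi example with the same assumed connection details as earlier; "shared" corresponds to vSGA, "direct" to passthrough/vDGA, and on vSphere 6.0+ "sharedDirect" to vGPU. Field availability depends on the ESXi version.

```python
# Minimal pyVmomi sketch: report the graphics mode of each GPU on every host.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab only
si = SmartConnect(host="vcenter.example.local",            # assumption
                  user="administrator@vsphere.local",      # assumption
                  pwd="secret", sslContext=ctx)             # assumption
try:
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    for host in view.view:
        for gfx in (host.config.graphicsInfo or []):
            print(host.name, gfx.pciId, gfx.deviceName,
                  "mode=%s" % gfx.graphicsType,
                  "vms=%d" % len(gfx.vm or []))
    view.DestroyView()
finally:
    Disconnect(si)
```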

#7
Posted 06/01/2015 04:24 PM   
Do I need to uninstall the VIB from ESXi 5.5? I just need to enable GPU passthrough so the VM can use the GPU directly. What software do I need to view the VM?

Thanks.

#8
Posted 09/25/2015 09:23 AM   
The .vib doesn't affect passthrough, so you can leave it in place.

Jason Southern, Regional Lead for ProVis Sales - EMEA: NVIDIA Ltd.

#9
Posted 09/25/2015 10:21 AM   
This is frustrating. I think the real question is: how can we use vDGA or a shared GPU on an NVIDIA GRID K1 without having to use Horizon? Here's my take: I can put the K1 card (haven't tried yet, but I will shortly) into a Windows Server 2012 R2 Hyper-V host and start sharing the GPUs with virtual machines within minutes. I have already done this with an NVIDIA Titan Z, sharing its two GPUs with 24 Hyper-V VMs, and as I said, the installation took minutes, not the hours or days it takes with VMware Horizon. It's ridiculous that VMware creates such a huge stumbling block and server requirement (more than four servers and a full Active Directory deployment) just to share a GPU with virtual machines. That's a real pain, especially when all I'm trying to do is demonstrate live hardware to a client who is probably going to say "forget VMware / NVIDIA, we just want to play 720p training videos on our VMs" -- on a LAN, not a WAN. RemoteFX is so much easier to set up than Horizon View. Horizon View is a poor fit if you have fewer than 50 virtual machines and simply want a basic shared GPU experience. In short: with VMware it takes hours, even days, and with Windows Server 2012 it takes minutes. Granted, WAN performance is horrid with RemoteFX on standard bandwidth, but not everyone needs that. I have spent weeks looking for resources on how to share a GPU on VMware ESXi 6 without having to deploy NVIDIA GRID / Horizon and have come up short.

Is there no way to use GRID K1 profiles right off a vSphere server with the ESXi 6.0 hypervisor? Do we have to push VMware's Horizon client and use PCoIP even for a simple 32-station local area network? If so, then say it plainly: you have to spend thousands of dollars on servers, VMware Horizon View licenses, and a GRID K1 or K2 card, because PCoIP through Horizon is the only way to get a shared GPU on VMware. In that case, if you just want simple multimedia on a LAN, you are better off with Windows Server 2012 and an approved GPU shared via RemoteFX; it beats the cost and installation pain of Horizon by a great margin.

I hope I am wrong, NVIDIA. All I want to do is sell NVIDIA GRID to clients who may have 50 desktop computers they want to convert to VDI. They will have remote employees, but simple RDP is fine, or even NoMachine for mildly acceptable multimedia. Why can't VMware / NVIDIA enable someone like me, a solutions architect and VDI optimization specialist, to take a basic vSphere Essentials installation, pop in one or two GRID K1 or K2 cards, and use them on a LAN without having to build a whole datacenter of servers the client doesn't want or need? Not everyone wants to use Active Directory; it's overkill for small LANs with simple needs that may want to scale up later. Yes, I could set up a closed-circuit TV broadcast instead, but that's not the point. I have clients who want to do basic A/B-roll video editing for YouTube-style videos and share them with other hosts on the network rather than uploading them. They saw your demo and called me: "We want to use NVIDIA GRID!" And it really is fast, but then you discover you have to run Active Directory plus a Horizon View Connection Server gobbling up 10 GB of RAM, with packets routed through a software gateway, which is absurd on a LAN. Direct Connection is probably okay, but I still have to do a full Horizon deployment. Meanwhile, I've taken a Titan Z on Server 2012 R2 and produced 30 virtual machines, 24 of which could play full-screen HD video simultaneously over a LAN, and I accomplished that within a day.

The Horizon deployment? Days. Not good for a demo, not at all.

#10
Posted 12/18/2015 07:26 PM   
GRID vGPU is a feature of vSphere, not Horizon. You don't need Horizon, AD, or anything else to set up a vGPU profile on a VM; I'm not sure where you got the idea that Horizon was required for a vGPU profile.

Please refer to https://www.vmware.com/products/vsphere/compare to understand hypervisor capabilities by edition. As the matrix shows, vGPU is a feature of vSphere Enterprise Plus, so your plan to use Essentials would not be feasible.

You can use whatever protocol you want to remote the screen, but the remoting protocol needs to be able to do 3D - end of story.

I would also suggest doing a little more research on the network topology for Horizon. You are not forced to route through a software gateway; it is an option (and usually not used for internal LAN traffic). The link below shows some of the network paths available:

https://blogs.vmware.com/consulting/2014/06/vmware-horizon-6-view-firewall-network-ports-visualized.html

As you can see, both direct and tunneled communication from the clients to the desktops are available for configuration.

V/R,
Michael Bauer
Sr. Systems Engineer - EUC
VMware, Inc.
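
[Editor's note] Since vGPU is a vSphere feature, a profile can be attached straight through the vSphere API, without Horizon in the picture. The sketch below is a minimal pyVmomi example for vSphere 6.0+; the VM name and the "grid_k180q" profile string are illustrative assumptions (use a profile valid for your card), the VM must be powered off, and vGPU requires Enterprise Plus as noted above.

```python
# Minimal pyVmomi sketch (vSphere 6.0+): attach a GRID vGPU profile to a powered-off VM.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()  # lab only
si = SmartConnect(host="vcenter.example.local",            # assumption
                  user="administrator@vsphere.local",      # assumption
                  pwd="secret", sslContext=ctx)             # assumption
try:
    content = si.RetrieveContent()
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.VirtualMachine], True)
    vm = next(v for v in view.view if v.name == "gpu-test-vm")  # example VM name

    # Build a vGPU device backed by an example profile string.
    vgpu_dev = vim.vm.device.VirtualPCIPassthrough(
        backing=vim.vm.device.VirtualPCIPassthrough.VgpuBackingInfo(vgpu="grid_k180q"))
    spec = vim.vm.ConfigSpec(
        memoryReservationLockedToMax=True,  # vGPU/passthrough need full memory reservation
        deviceChange=[vim.vm.device.VirtualDeviceSpec(
            operation=vim.vm.device.VirtualDeviceSpec.Operation.add,
            device=vgpu_dev)])
    task = vm.ReconfigVM_Task(spec=spec)
    print("Reconfigure task submitted:", task.info.key)
    view.DestroyView()
finally:
    Disconnect(si)
```

Any remoting protocol that supports 3D can then be used to reach the desktop.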

#11
Posted 12/26/2015 11:26 PM   
Duplicate - removed

#12
Posted 12/26/2015 11:27 PM   
@HelpMe2000

To add to Michael's comments.

vGPU in vSphere can be used by any solution that runs on top of vSphere, so if you want to use a VDI solution other than Horizon, you can, as long as that remoting solution supports it. We have many customers using alternative solutions, so you are not forced to use Horizon if it is not required.

If you wish to use RemoteFX, you will need to comply with the requirements set down by Microsoft for hardware acceleration. Those are something that neither NVIDIA nor VMware can control.

As an aside, in my experience it takes a matter of hours to build a full Horizon deployment from scratch, less if AD, DNS, SQL, etc. are already in place. It's a matter of experience and knowledge, not a limitation in the product set.

There is a complete deployment guide written jointly by VMware and NVIDIA for deploying the K1/K2 with vSphere 6, and it takes you through every step of a complete deployment. Following it to the letter, most people can build a PoC in under a day.

https://www.vmware.com/files/pdf/products/horizon/grid-vgpu-deployment-guide.pdf

It is currently being updated for GRID 2.0 with the latest updates from VMware.

Windows Server uses a different approach from NVIDIA vGPU and doesn't give the VMs direct access to the GPU. It uses an API-interception mechanism that translates graphics calls from the VM to the host. This may be easier for you to implement, but it provides less control over resources, requires system RAM for graphics, delivers lower performance, and offers lower levels of API support (specifically, it lacks full OpenGL support) compared to a vGPU deployment on the same host.

Jason Southern, Regional Lead for ProVis Sales - EMEA: NVIDIA Ltd.

#13
Posted 12/30/2015 11:05 AM   
Hello,
in the newest version of ESXi we can no longer pass the GRID GPUs through to VMs, which means vDGA does not work anymore. Is this an NVIDIA or a VMware problem? On 5.5 everything was working fine. We have opened a case with VMware, but the engineers have not found a solution.

#14
Posted 02/10/2016 12:32 PM   
BUMP

#15
Posted 04/04/2016 10:36 AM   