NVIDIA
Proposed talks!
Want to ask a question of the developers, post here!

Regards,

Luke Wignall
Performance Engineering Manager
NVIDIA | Worldwide Sales ­ GRID Computing
http://www.linkedin.com/in/lukewignall/
https://twitter.com/lwignall

#1
Posted 10/05/2014 06:23 PM   
Hello,

I saw a video on YouTube where Jason Southern connected an Oculus Rift to a GRID server. We are trying to do something similar and would like to ask some questions about it.
We have a GRID K1 server here and are trying to realize this solution on a native, headless Linux server without a hypervisor solution such as XenServer.

So the first thing I want to ask is: how do you handle latency with the Rift? In a VR application, latency matters much more than in a usual application. In the best case you would have 30 ms of latency on the server, 30 ms for the network, and 16 ms on the client side, not counting the software overhead of getting the head position from the Rift device to the server.
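Summing those stages gives a rough end-to-end figure. A minimal sketch of that latency budget, using the numbers above (all values are illustrative assumptions, in milliseconds):

```python
# Rough motion-to-photon latency budget for remotely rendered VR,
# using the per-stage figures from the post above (illustrative only).
budget_ms = {
    "server (render + encode)": 30,
    "network (round trip)": 30,
    "client (decode + display)": 16,
}

for stage, ms in budget_ms.items():
    print(f"{stage}: {ms} ms")

total_ms = sum(budget_ms.values())
print(f"total: {total_ms} ms")  # 76 ms
```

Even before any software overhead, this totals 76 ms, which is why latency is the central concern for a remote Rift setup.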

I know the video from Jason Southern is not very long, but I am interested in whether he uses a shim-layer application for encoding, or whether he instruments the Oculus application to get the information it needs.
We want to try some optimizations for frame prediction, so where would you cut into the application to realize something like that?

Best regards

#2
Posted 03/19/2015 02:59 PM   
How to register in Ukraine?

#3
Posted 02/20/2016 02:45 PM   
yalcinK said:

I saw a video on YouTube where Jason Southern connected an Oculus Rift to a GRID server. We are trying to do something similar and would like to ask some questions about it.


I didn't build it; the Oculus setup was actually by Magnar Johnsen, and nothing was done to minimise latency beyond optimising the network.

Magnar has a blog here:


https://www.virtualexperience.no/


Latency is a huge issue with VR, and there's a lot of research going into how to enhance the experience using remote protocols.

Jason Southern, Regional Lead for ProVis Sales - EMEA: NVIDIA Ltd.

#4
Posted 02/21/2016 01:08 PM   
Maybe you mean this video by Magnar: https://www.citrix.com/blogs/2014/05/23/is-running-xendesktop-remotely-on-a-server-8000km-away-with-700ms-latency-on-an-airplane-with-an-oculus-rift-virtual-reality-application-insane/


This was done on the Oculus v1; later versions have been problematic to get working, as Oculus changed some things that made it less Citrix/VMware friendly. On v1, support came for free via multi-monitor support.

Latency per se isn't an issue, depending on what you are doing - in Magnar's case 700 ms was fine because he was essentially watching video, a rollercoaster simulation. Video is very tolerant and can be buffered. An interactive shoot-'em-up or CAD will be harder, as you can't fight the speed of light (although Citrix's new USB-over-WAN - see their Wacom support - has some capabilities that could be exploited in these scenarios to some extent).
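The speed-of-light point can be made concrete: light in optical fiber travels at roughly two thirds of c, so a long path has a hard latency floor no protocol can beat. A quick sketch for an 8000 km path like the one in the Citrix demo linked above (the fiber speed is a rough assumption, and real routes are longer than great-circle distance):

```python
# Physical lower bound on network latency over a long fiber path.
# Light in fiber propagates at roughly 200,000 km/s (~2/3 of c);
# the 8000 km distance matches the Citrix demo linked above.
distance_km = 8000
fiber_speed_km_per_ms = 200  # ~200,000 km/s expressed per millisecond

one_way_ms = distance_km / fiber_speed_km_per_ms
round_trip_ms = 2 * one_way_ms

print(f"one-way minimum: {one_way_ms:.0f} ms")    # 40 ms
print(f"round-trip minimum: {round_trip_ms:.0f} ms")  # 80 ms
```

An 80 ms round-trip floor already exceeds a comfortable interactive VR budget, which is why buffered video works over such links while head-tracked interaction does not.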

#5
Posted 02/21/2016 08:42 PM   