How is latency measured?
Hi there, curious developer here.

I was reading an article on Tech Radar about the Grid. It's from 2013 but I would imagine it's still relevant today.

http://www.techradar.com/news/computing/servers/how-nvidia-grid-is-set-to-revolutionise-cloud-gaming-1167986


In particular, I'm curious about this graph:

http://cdn1.mos.techradar.futurecdn.net//art/internet/Nvidia/Grid/nvidiagrid%20(6)-650-80.jpg


How exactly was the lag measured? I see the GRID solution had a latency of 161 ms while the GeForce PC had just 65 ms. Are you measuring how long it takes each frame to complete end to end, from the input device, through the update/render pipeline, and out to the display?
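For what it's worth, here's roughly how I'd picture such an end-to-end measurement in software terms. This is just a minimal sketch of the timestamping idea, not NVIDIA's actual methodology; the stage names and delay values are made up for illustration:

```python
import time

def measure_end_to_end_latency_ms(pipeline_stages):
    """Estimate input-to-display latency by timestamping around the whole pipeline.

    pipeline_stages: callables representing each step a frame passes through
    (input handling, game update, render, and for streaming also encode,
    network transit, and decode). Delays here are simulated, not measured
    from real hardware.
    """
    t_input = time.monotonic()      # moment the input event is injected
    for stage in pipeline_stages:
        stage()                     # run each stage in order
    t_display = time.monotonic()    # moment the frame would reach the display
    return (t_display - t_input) * 1000.0

# Hypothetical stage delays for a local PC pipeline (values are invented):
local_pc = [
    lambda: time.sleep(0.005),  # input sampling
    lambda: time.sleep(0.030),  # update + render
    lambda: time.sleep(0.015),  # scanout to display
]
latency_ms = measure_end_to_end_latency_ms(local_pc)
```

A streaming pipeline would add encode, network, and decode stages to the same list, which is presumably where the extra ~100 ms in the graph comes from.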

Cheers

#1
Posted 01/26/2015 08:26 PM   
