Realistic Remote Rendering for VR and Mobile Devices

Wired VR headsets provide high visual quality but restrict the user's movement because they are tethered to a PC, while untethered headsets rely on mobile GPUs with comparatively low performance. We focus on providing a smooth VR experience without restricting movement: a high refresh rate and an immediate effect of head movement on the rendered image, both of which help prevent motion sickness. Streaming the rendered output as video provides a high refresh rate and high quality, but can also introduce high latency. In our approach, the scene is rendered on the server into multiple layers using depth peeling, packed into a texture, and streamed to the client together with a potentially visible set of triangles. This method supports temporal frame up-sampling and provides low latency. Our results show that it is a viable alternative to existing image-based methods and atlas streaming approaches.
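The depth peeling mentioned above extracts successive depth layers of the scene, one per pass. As a minimal illustrative sketch (a CPU analogue, not the paper's GPU implementation; the per-pixel fragment representation is an assumption for clarity), each pass keeps, for every pixel, the nearest fragment strictly behind the previously peeled layer:

```python
def peel_layers(fragments, num_layers):
    """CPU sketch of depth peeling (illustrative only).

    fragments: dict mapping pixel -> list of (depth, color) fragments.
    Each pass keeps, per pixel, the nearest fragment strictly behind
    the depth peeled in the previous pass, mimicking the GPU depth test
    against the previous layer's depth buffer.
    """
    layers = []
    last_depth = {p: float("-inf") for p in fragments}
    for _ in range(num_layers):
        layer = {}
        for p, frags in fragments.items():
            behind = [f for f in frags if f[0] > last_depth[p]]
            if behind:
                nearest = min(behind, key=lambda f: f[0])
                layer[p] = nearest
                last_depth[p] = nearest[0]
        layers.append(layer)
    return layers
```

For example, a pixel covered by three fragments yields the closest one in the first layer and the second-closest in the second layer; on the server, such layers would then be packed into a texture for streaming.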