Overview
The VR industry is evolving rapidly. Mass production of VR devices now appears likely in the near future. Early deployments will primarily target high-end gaming, but applications are expected to expand quickly. Before VR becomes mainstream, it is important to examine the technical challenges it currently faces.
Motion-to-photon latency
Latency here means the time it takes for the system to convert actual head movement into the image you see on the VR headset. These two events must occur very close together for perception to feel natural. If latency is too large or inconsistent, the immersive experience becomes unnatural and can cause nausea or dizziness. Research indicates that motion-to-photon latency must remain below about 20 milliseconds (ms) for a smooth, natural VR experience. With a standard 60 Hz refresh rate, a single frame interval is roughly 16.7 ms (1000/60), which consumes most of the 20 ms budget before sensor reading, transport, and scanout are even counted. Although this target is challenging, appropriate techniques can make it achievable.
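The budget arithmetic above can be made concrete with a few lines. This is only an illustration of the numbers stated in the text (20 ms target, 60 Hz refresh); the variable names are ours, not from any VR SDK.

```python
# Numbers from the text: 20 ms motion-to-photon target, 60 Hz refresh.
TARGET_MS = 20.0
REFRESH_HZ = 60

frame_budget_ms = 1000.0 / REFRESH_HZ   # one refresh interval, ~16.7 ms

# Margin left for sensor reads, transport, and scanout once a full
# frame interval has been spent producing the image.
margin_ms = TARGET_MS - frame_budget_ms

print(f"frame interval: {frame_budget_ms:.1f} ms, remaining margin: {margin_ms:.1f} ms")
```

The small remaining margin is why the techniques below attack every stage of the pipeline rather than just rendering speed.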
Reducing latency: key approaches
Combining specific techniques can produce a low-latency VR system. First, consider front-buffer rendering. Graphics applications, including those on Android devices, typically use double or triple buffering, where the GPU renders to an off-screen buffer that is then swapped with the on-screen buffer at each display refresh. This smooths frame timing but increases latency, which is undesirable for VR. Front-buffer rendering bypasses the off-screen buffer and renders directly to the on-screen buffer to reduce latency. This approach requires precise synchronization with the display so GPU writes always complete before the display reads those pixels. Mali GPU priority extension features can accelerate GPU task scheduling so that front-buffer rendering receives higher priority than less urgent tasks, helping to improve the interactive experience.
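The synchronization constraint described above can be sketched as a timing model: if the frame is rendered in horizontal strips directly into the on-screen buffer, each strip must be finished before the display's scanout reaches it. This is a conceptual sketch, not driver code; the panel height, strip count, and function names are illustrative assumptions.

```python
# Conceptual model of front-buffer ("racing the beam") rendering:
# GPU writes to a strip must complete before scanout reads that strip.
# All numbers below are illustrative assumptions, not panel specs.

REFRESH_HZ = 60
PANEL_LINES = 1440                 # assumed panel height in lines
STRIPS = 4                         # render the frame as 4 strips

frame_ms = 1000.0 / REFRESH_HZ
scanout_ms_per_strip = frame_ms / STRIPS

def strip_deadline_ms(strip_index: int) -> float:
    """Time after vsync at which scanout begins reading this strip."""
    return strip_index * scanout_ms_per_strip

def is_safe(strip_index: int, gpu_finish_ms: float) -> bool:
    """True if the GPU finished writing the strip before scanout reads it."""
    return gpu_finish_ms <= strip_deadline_ms(strip_index)
```

In this model strip 0 must be ready at vsync, while later strips gain one scanout interval each; missing a deadline means the display reads a half-written strip, which is the tearing risk that GPU priority scheduling helps avoid.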
Eliminate extra buffered rendering to reduce latency
The second important consideration is selecting the right display type. OLED displays differ significantly from LCDs with LED backlights. In an OLED panel, a thin-film transistor (TFT) array behind the panel drives each pixel, and each pixel acts as its own light source, whereas an LCD filters light from a shared white LED backlight. OLED pixel brightness is determined by the current the TFT supplies, and color is controlled by individually driving red, green, and blue subpixels. This per-pixel control enables high brightness, contrast, and color saturation, and allows much deeper blacks because pixels can be turned off entirely.
For VR, this per-pixel control is critical because it enables low persistence. Full-persistence displays keep the screen lit continuously, so a correct image is only momentary and quickly becomes stale. Low-persistence displays illuminate the image only when the view is correct and then turn the pixels off. At very high refresh rates this is difficult to perceive and creates the illusion of a continuous image. Low persistence also reduces motion blur and allows the display to show multiple partial images within a single refresh, adjusting intermediate frames based on headset sensor data. When the user's view sweeps across the screen, the displayed image can be updated to reflect the changed head pose. LCD panels with full backlights cannot achieve this behavior. Therefore, a key to low-latency VR is using time-warp-like processes to render front-buffer updates in tiled or strip-based segments and drive an OLED screen, letting the image rapidly adapt to head rotation.
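One way to see why low persistence reduces motion blur is that perceived smear scales with how long the pixels stay lit, not with the refresh interval. The sketch below quantifies that relationship; the persistence time, refresh rate, and angular-resolution figures are illustrative assumptions, not real panel specifications.

```python
# Illustrative comparison of full- vs low-persistence motion smear.
# All numbers are assumptions for the sake of the example.

REFRESH_HZ = 90
frame_ms = 1000.0 / REFRESH_HZ     # full persistence: lit the whole interval
persistence_ms = 2.0               # low persistence: lit ~2 ms per refresh

duty_cycle = persistence_ms / frame_ms

def blur_pixels(head_deg_per_s: float, px_per_deg: float, lit_ms: float) -> float:
    """Smear (in pixels) swept across the retina while the pixels are lit."""
    return head_deg_per_s * px_per_deg * lit_ms / 1000.0

# Assumed: 120 deg/s head rotation, 12 pixels per degree of view.
full_persistence_blur = blur_pixels(120, 12, frame_ms)
low_persistence_blur = blur_pixels(120, 12, persistence_ms)
```

Under these assumed numbers the smear drops from roughly 16 pixels to under 3, which is why low persistence is paired with high refresh rates: the short lit window kills blur, and the rapid refresh hides the flicker.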
Asynchronous time-warp technique
Another essential technique is asynchronous time-warp. Because scene changes in immersive VR are often gradual, the image differences between consecutive frames are small and more predictable. Time-warping refers to shifting a previously rendered image to match a new head pose. This technique can partially decouple application frame rate from display refresh rate, reducing perceived latency for certain scenarios. The warping compensates for head rotation but does not handle head translation or dynamic scene animation. While time-warp is a pragmatic workaround, it effectively enables systems running at, for example, 30 FPS to provide head-tracking responsiveness comparable to 60 Hz or higher.
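The core of rotation-only time-warp is a pose-delta computation: the image rendered for an older head orientation is re-sampled along rays rotated by however far the head has turned since. The sketch below shows that computation for yaw only; a real implementation applies the equivalent homography to the whole image on the GPU. Function names and the sign convention here are our own simplifying assumptions.

```python
import math

# Minimal rotation-only time-warp sketch (yaw only, for clarity).
# Assumed convention: positive yaw rotates the view direction from +z toward +x.

def yaw_matrix(yaw_rad: float):
    """3x3 rotation about the vertical (y) axis."""
    c, s = math.cos(yaw_rad), math.sin(yaw_rad)
    return [[c, 0.0, s],
            [0.0, 1.0, 0.0],
            [-s, 0.0, c]]

def apply(m, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(m[r][k] * v[k] for k in range(3)) for r in range(3)]

def timewarp_direction(view_dir, render_yaw: float, display_yaw: float):
    """Rotate a display-time view ray by the pose delta accumulated while
    rendering, so it samples the correct pixel of the render-time image.
    Compensates rotation only; translation and animation are not handled."""
    delta = display_yaw - render_yaw
    return apply(yaw_matrix(delta), view_dir)
```

Because the warp is a cheap image-space operation, it can run every display refresh even when scene rendering runs at a lower rate, which is exactly the 30 FPS-app/60 Hz-tracking decoupling described above.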
Multimedia synchronization for VR
Integrating GPU and display deeply is only part of the problem. Playing video (including DRM-protected content) and integrating system notifications add complexity. High-quality VR requires strong multimedia synchronization and efficient use of bandwidth to provide a good user experience while optimizing power and performance. Technologies such as ARM Frame Buffer Compression (AFBC) and ARM TrustZone, together with the ARM Mali multimedia suite (MMS), support deep integration of GPU, video, and display processors and provide tools to address synchronization and bandwidth efficiency in VR systems.