Introduction
Virtual reality (VR) is an important branch of simulation technology. It integrates simulation, computer graphics, human-computer interface, multimedia, sensing technology, and networking, and is a challenging interdisciplinary research field.
VR mainly comprises simulated environments, perception, natural interaction, and sensing devices. A simulated environment is a computer-generated, real-time, dynamic, three-dimensional realistic image. Perception means that an ideal VR system should provide the full range of perceptions a human has. In addition to visual perception generated by computer graphics, there are auditory, tactile, force, and motion perceptions, and in some cases even smell and taste, collectively known as multisensory perception. Natural interaction covers head rotation, eye movement, gestures, and other human actions, which the computer processes and maps to corresponding data, providing real-time responses and feedback to the user's senses. Sensing devices refer to three-dimensional interaction hardware.
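As a concrete illustration of natural interaction, a tracked head pose must be mapped to data the renderer can use every frame. The sketch below is a minimal example under assumed conventions (axis choices and the yaw/pitch parameterization are illustrative; a real headset supplies a full orientation quaternion at high rate):

```python
import math

def head_pose_to_view_dir(yaw_deg, pitch_deg):
    """Map head yaw/pitch (degrees) to a unit view-direction vector.

    Assumed convention: yaw rotates about the vertical y-axis, pitch
    about the x-axis, and (yaw, pitch) = (0, 0) looks down the -z axis.
    """
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (-math.sin(yaw) * math.cos(pitch),   # x
            math.sin(pitch),                    # y
            -math.cos(yaw) * math.cos(pitch))   # z
```

The renderer re-samples this direction each frame, which is what makes the response feel immediate rather than like operating a separate input device.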
Augmented Reality
Augmented reality (AR) overlays virtual content onto the real world, integrating real-world information and virtual information in a seamless way. It applies virtual information to the physical world so that the combined result can be perceived by human senses, enabling enhanced sensory experiences. Real environments and virtual objects are superimposed in the same view or space in real time.
1. Motion Capture
To achieve full immersion and feel truly "inside" a virtual world, motion capture systems are often required. For VR-specific motion capture, products such as Perception Neuron are available. Other systems on the market tend to be either expensive commercial-grade devices or vaporware. Such motion capture devices are typically used only in highly specialized scenarios because they have usability barriers, requiring significant time for donning and calibration. By contrast, optical devices like Kinect may be used in scenarios where high precision is not required.
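Optical trackers of the Kinect class deliver per-frame joint positions that are noisy, and a common low-cost remedy is exponential smoothing. A minimal sketch follows; the joint names and smoothing factor are illustrative assumptions, not a real SDK API:

```python
def smooth_joints(prev, curr, alpha=0.3):
    """Blend the previous smoothed skeleton with the current noisy
    frame: result = alpha * current + (1 - alpha) * previous.
    Smaller alpha gives smoother but laggier output."""
    return {joint: tuple(alpha * c + (1.0 - alpha) * p
                         for p, c in zip(prev[joint], curr[joint]))
            for joint in curr}
```

The alpha trade-off connects directly to the feedback problem discussed below: the more aggressively a pose is filtered, the longer users must wait to see whether an action registered.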
Full-body motion capture is not necessary in many cases. Another issue is the lack of feedback: users have difficulty perceiving whether their actions are effective, which is a major interaction design challenge.
2. Haptic Feedback
This category mainly includes button and vibration feedback, typically provided by VR controllers. Major VR headset makers such as Oculus, Sony, and HTC/Valve have standardized on handheld controllers as a primary interaction mode: two separate controllers, full six degrees of freedom tracking (three rotational and three translational degrees), with buttons and vibration feedback. These devices are well suited to specialized gaming applications and light consumer use, which aligns with the early VR consumer base being largely gamers.
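The controller model described above, two tracked devices each with a full 6-DoF pose, buttons, and vibration, can be captured in a small data type. The field layout and names here are assumptions for illustration, not any vendor's actual API:

```python
from dataclasses import dataclass

# Button bit flags (illustrative values)
TRIGGER = 0x01
GRIP = 0x02

@dataclass
class ControllerState:
    """One sample from a hypothetical 6-DoF controller."""
    position: tuple      # (x, y, z) translation in metres
    orientation: tuple   # unit quaternion (w, x, y, z) for rotation
    buttons: int         # bitwise OR of pressed-button flags

def is_pressed(state: ControllerState, flag: int) -> bool:
    """Check whether a given button flag is set in this sample."""
    return bool(state.buttons & flag)
```

The position/orientation pair is exactly the "three translational plus three rotational" degrees of freedom the text describes; vibration would be driven through a separate output call on the device.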
While these simplified, specialized controllers are effective for game interactions, they do not adapt well to a broader range of application scenarios.
3. Eye Tracking
Eye tracking is one of the most important technologies in the VR field. Palmer Luckey has described it as "the heart of VR" because accurate eye position detection can provide the optimal 3D effect for the current viewpoint, make rendered images appear more natural, and reduce perceived latency, all of which increase playability. Eye tracking can also determine the actual gaze point on virtual objects, yielding depth information for the gaze location. Many VR practitioners consider eye tracking a potential breakthrough for addressing VR-related motion sickness. However, despite extensive research, no single company has yet delivered a fully satisfactory solution.
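Recovering depth at the gaze point amounts to intersecting the gaze ray with scene geometry. As a toy stand-in (real systems query the renderer's depth buffer or scene graph), the sketch below intersects a unit-length gaze ray with a sphere using the standard ray-sphere quadratic:

```python
import math

def gaze_depth_to_sphere(origin, direction, center, radius):
    """Distance along a unit gaze ray to the nearest sphere hit,
    or None if the gaze misses the sphere entirely."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = 2.0 * sum(d * v for d, v in zip(direction, oc))
    c = sum(v * v for v in oc) - radius * radius
    disc = b * b - 4.0 * c
    if disc < 0.0:
        return None                      # gaze ray misses the sphere
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t >= 0.0 else None       # hit must be in front of the eye
```

That recovered depth is what makes gaze-dependent effects possible, such as focusing rendering quality where the user is actually looking.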
Pei Yun, head of the graphics and imaging algorithm center at SuperD, believes VR eye tracking can be implemented using devices similar to Tobii eye trackers, provided that device size and power consumption issues are resolved. From a technical perspective, eye tracking in VR is feasible, for example by using external power or designing larger headset structures. A larger challenge lies in the image-adjustment algorithms that adapt rendered images to eye movements; these algorithms are currently underdeveloped. Two key metrics are natural realism of the image and low latency. Meeting both requirements would significantly improve VR usability.
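The low-latency requirement can be made concrete with a simple pipeline budget. A commonly cited comfort target for motion-to-photon latency in VR is on the order of 20 ms; the stage timings below are illustrative assumptions, not measurements:

```python
def motion_to_photon_ms(sensing_ms, render_ms, scanout_ms):
    """Rough motion-to-photon estimate: time from an eye or head
    movement being sensed to the adjusted image reaching the display."""
    return sensing_ms + render_ms + scanout_ms

def within_budget(total_ms, budget_ms=20.0):
    """True if the pipeline fits a given latency budget (ms)."""
    return total_ms <= budget_ms
```

An eye-tracking-driven image adjustment only helps if its added sensing and re-rendering time still fits inside this budget; otherwise it trades one source of discomfort for another.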
4. Electromyostimulation
An example is the VR boxing device Impacto, which combines haptic feedback and electrical muscle stimulation to simulate realistic sensations. The device consists of two parts. One part is vibration motors that produce vibration sensations similar to those in conventional game controllers. The other, more meaningful part is an electrical muscle stimulation system that applies current to induce muscle contraction. Together, these create the illusion of landing a punch on an opponent, generating a convincing "impact" sensation at the right moment.
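The two-part design can be sketched as a single impact handler that fires both actuators at the moment a virtual punch lands. The callback names and pulse durations below are hypothetical, not Impacto's actual interface:

```python
def on_impact(vibrate, stimulate, vib_ms=40, ems_ms=120):
    """Fire both feedback channels when a punch lands: a short
    vibration 'tap' plus a longer electrical-stimulation pulse
    that contracts the muscle for a recoil-like sensation.

    `vibrate` and `stimulate` are hypothetical driver callbacks,
    each taking a pulse duration in milliseconds."""
    vibrate(vib_ms)      # part 1: vibration motor
    stimulate(ems_ms)    # part 2: electrical muscle stimulation
```

Keeping both pulses synchronized to the same collision event is the point: the illusion depends on the tactile cue and the induced contraction arriving together.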
However, there is debate in the industry about this approach. Current biomedical technology cannot reproduce real sensations through muscle stimulation with high fidelity. Even with this method, the achievable sensations are relatively coarse and may not add much to immersion compared with vibration motors. A practitioner in therapeutic pain relief noted that simulating realistic sensations via electrical stimulation faces many challenges, because neural pathways are intricate and external skin stimulation is unlikely to replicate them. Still, using electrical stimulation simply to trigger muscle movement as a form of feedback is feasible.