With CES 2024 in the rear-view mirror, both AR glasses and AR Head-Up Displays (HUDs) were evident in many booths. So how are the two technologies similar, and how are they different? Let’s take a look.
As summarized in the table below, there are more similarities than differences between optical see-through AR glasses and AR HUDs. For example, many types of imager technology can be used for both products. LCD, DLP, LCOS and OLED devices create an image that is magnified by optics, while scanning lasers paint a series of light points to create an image. Spatial Light Modulators (SLMs) are different in that a diffraction pattern, not an image, is written to this device and illuminated by lasers or LEDs. This creates a true holographic image that is magnified and presented to the user.
OLEDs and microLED panels are self-emissive, but the other approaches need a light source, either RGB lasers or LEDs. Here, the required luminance range differs between the two classes of device. In an automotive HUD, the image must be visible in bright sunlight yet very dim for use on a dark night, without any image “glow”. AR glasses may be used in similar environments, but are generally designed for dim rather than dark environments, so they require a reduced luminance range.
The combiner consists of optical elements that enable a displaced virtual image to be overlaid onto the real world. While there are a range of configurations to achieve this, there are three basic classes of components.
Holographic optical elements, or HOEs, are nanometer-scale structures that can capture, expand, magnify, and manipulate light. In AR glasses, they help couple light into and out of a waveguide, for example, while in automotive applications they may be placed on the windshield to diffuse or magnify an image. Waveguides are popular in AR glasses as the medium for transferring an image from the light source to the eye, and they are also being considered for automotive HUDs. Finally, tiny mirrors can be embedded in waveguides for AR glasses, but much larger mirror systems are very popular for automotive HUDs. This is one of the key differences, as the size of these mirrors in automotive HUDs creates a much larger, heavier solution than AR glasses.
In terms of the images these devices can create, they are similar in that they can present 2D, stereoscopic 3D or true holographic images at one or two depth planes. Eye tracking is optional for each type of device as well. AR glasses need to present, and potentially let the user manipulate, virtual objects at close distances – within arm’s length. Automotive HUDs typically offer the most value when the images are presented a large distance from the driver, so that the driver can keep their focus at close to infinity. Eliminating the look-down time and refocusing on the in-car instrument cluster is the key value proposition of a HUD, as it is a safety enhancement. Nevertheless, some automakers are proposing HUDs that do not provide much virtual distance, compromising this inherent value proposition.
One of the big differences between auto HUDs and AR glasses is the eyebox and eye relief requirement. With AR glasses the distance between the combiner and eyes is small and well defined, while the eyebox, the area in which the image is visible, is likewise fairly small.
With an automotive HUD, the distance between the combiner (the windshield) and the driver is much longer and quite variable. In addition, the location of the driver with respect to the combiner can vary quite a bit as well. This creates the requirement for a fairly large eyebox, so the HUD image is always visible. The larger the eyebox, the larger the optics need to be. And as the eyebox gets bigger, more raw laser or LED power is needed to maintain the same level of illumination. This in turn has an impact on the electrical power and thermal management needs of a HUD, which far exceed those of AR glasses.
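To get a feel for this scaling, here is a minimal Python sketch. It assumes, simplistically, that source power must grow in proportion to eyebox area to hold the luminance seen at the eye constant; the eyebox dimensions below are illustrative round numbers, not taken from any specific product.

```python
def relative_source_power(eyebox_w_mm: float, eyebox_h_mm: float,
                          ref_w_mm: float = 10.0, ref_h_mm: float = 10.0) -> float:
    """Source power needed relative to a reference eyebox, assuming power
    scales with eyebox area for equal luminance at the eye."""
    return (eyebox_w_mm * eyebox_h_mm) / (ref_w_mm * ref_h_mm)

# AR glasses: a roughly 10 x 10 mm eyebox (the reference case).
# Automotive HUD: a 130 x 50 mm eyebox to cover driver head motion.
print(relative_source_power(130, 50))  # 65.0 -> 65x the source power
```

Even with these rough numbers, the area ratio alone points to tens of times more raw source power for a HUD, which is why its electrical and thermal budgets dwarf those of glasses.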
Some AR glasses designs allow the user to wear prescription eyeglasses or optical inserts, but none can deliver 20/20 resolution. In addition, many AR headsets require external batteries and compute sources, such as a smartphone, for full functionality and connectivity to the outside world (maybe even wires between components). Automotive HUDs allow users to wear prescription eyewear, and all the connectivity and processing is built into the car, making them much less encumbering than AR glasses.
While most HUDs are still single-plane solutions, AR-HUD solutions are arriving that provide two planes and, in the future, multiple planes and holographic continuous depth. Such AR-HUD designs are far more complex than basic HUDs, as they need to precisely register digital objects to the real world – and maintain that registration as the car moves forward and the driver moves their head. While there is also a need to register virtual images to the real world with AR glasses, the person’s movement is more limited and the combiner-to-eye distance remains fixed, making it a much easier problem to solve.
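The registration chain described above can be sketched with a few homogeneous transforms. This is an illustrative toy, not any automaker's pipeline: the frame names and the pose numbers (car driven 5 m forward, head shifted 5 cm in the cabin) are assumptions for the example.

```python
import numpy as np

def translation(t):
    """Build a 4x4 homogeneous translation matrix from a 3-vector."""
    m = np.eye(4)
    m[:3, 3] = t
    return m

# A lane marker fixed in the world, 20 m ahead of where the car started.
world_point = np.array([0.0, 0.0, 20.0, 1.0])

# Per-frame poses (illustrative): the car has driven 5 m forward, and the
# driver's head sits offset inside the cabin and has shifted 5 cm left.
world_from_vehicle = translation([0.0, 0.0, 5.0])
vehicle_from_head = translation([-0.05, 1.2, 0.5])

# To keep the marker registered, the AR-HUD must re-map the world point
# into the head frame every frame: head <- vehicle <- world.
head_from_world = np.linalg.inv(world_from_vehicle @ vehicle_from_head)
p_head = head_from_world @ world_point
print(p_head[:3])  # marker position relative to the driver's eyes
```

For glasses, the combiner-to-eye transform is a constant, so only one tracked pose enters the chain; the HUD must update both the vehicle and head poses continuously, which is where the extra compute goes.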
The complexity of the registration problem, especially if eye tracking is included, adds to the compute budget and introduces optical trade-offs. For example, distortions in the optical system can be corrected with optical elements, but typically at an increase in cost and weight. Digital optical correction is possible as well, but at the cost of resolution in image-based AR glasses and HUDs. The exception is a HUD based on holographic spatial light modulator technology: since such systems write a diffraction pattern, optical distortion correction can simply be added to the pattern without loss of resolution.
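A toy sketch can show where the resolution difference comes from. In the image-based path, digital correction means pre-warping the picture, and the resampling step (nearest-neighbour here for simplicity) is what discards detail; in the holographic path, the correction is just a phase term added to the pattern the SLM already displays. Both functions below are illustrative assumptions, not real display firmware.

```python
import numpy as np

def prewarp(img, inverse_map):
    """Image-based correction: resample the source so the optics' distortion
    cancels out. The rounding/resampling here is where resolution is lost."""
    h, w = img.shape
    out = np.zeros_like(img)
    for y in range(h):
        for x in range(w):
            sx, sy = inverse_map(x, y)  # where each output pixel samples from
            sx, sy = int(round(sx)), int(round(sy))
            if 0 <= sx < w and 0 <= sy < h:
                out[y, x] = img[sy, sx]
    return out

def corrected_phase(hologram_phase, correction_phase):
    """Holographic correction: add a distortion-cancelling phase term to the
    diffraction pattern. No resampling occurs, so no resolution is lost."""
    return np.mod(hologram_phase + correction_phase, 2 * np.pi)
```

The contrast is structural: `prewarp` must quantize sample positions back onto a pixel grid, while `corrected_phase` is a pointwise addition on data that was never an image in the first place.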
In summary, while there are many similarities between AR glasses and automotive AR-HUDs, the greater distance between the user and the combiner means an AR-HUD solution is much larger and heavier and consumes more power. In addition, the difference in registering virtual objects to the real world means more processing power is needed in AR-HUDs.