Published in January / February 2008 issue of Chip Design Magazine
With the release of the new all-digital Audio/Video Display Port standard, designers may be tempted to ask why the world needs yet another video interface. After all, the goal is to consolidate on a single standard. Display Port, however, corrects the shortcomings of the incumbent HDMI and DVI standards. By supporting this new interface, integrated-circuit (IC)-based graphics and video chipsets can also maintain backward compatibility with legacy standards until Display Port monitors are widely available.
Since the death of UDI, three digital video standards are fighting for socket position in the digital video market. In the consumer market, HDMI has taken a tremendous lead, one that doesn't look like it will falter anytime soon. In the personal-computer (PC) market, HDMI has had a slow start behind the dominant DVI. Even given those market leads, however, Display Port is being enthusiastically received and is anticipated to overtake HDMI and DVI completely within two years.
The chief reason for Display Port's wide industry support is its direct liquid-crystal-display (LCD) drive capability. Direct drive means that an external audio/video signal can be used to drive an LCD panel directly, in a way similar to the historic VGA monitor. This scheme has the advantage of enabling monitor designers to eliminate high-cost scaler chips, thereby reducing overall system cost. Because monitors are primarily targeted at PCs, Display Port connectors will initially appear in the PC world. In fact, designers of graphics and video chipsets are already moving from native TMDS-based HDMI and DVI to more cost-effective Display Port interfaces to capitalize on those advantages.
PC-chipset designers are also finding substantial design advantages in the lower voltage swing of Display Port signals compared to the higher voltage levels required for HDMI/DVI signals. Because monitors have a longer lifecycle than computers, though, market dynamics dictate a lag in native Display Port support in monitors. Until the migration to Display Port in monitors is complete, system manufacturers must implement bridging technology. Such technology enables Display Port-based graphics chips to operate with legacy HDMI- and DVI-based displays, such as projectors, monitors, or DTVs.
How and where bridging is partitioned, and whether to integrate bridging fully or use an external device, can have a significant impact on system cost, viability, and reliability. To maximize performance while achieving the best cost efficiencies, chip designers must consider how signal conversion will occur, the impact of high-voltage I/O transistors on process technology, partitioning options, maintaining signal integrity, and the required level of electrostatic-discharge (ESD) protection.
There are two primary components of a display interface: the physical electrical signal and the signal’s logical protocol encapsulation. Given the anticipated delay in the availability of native Display Port monitors, Display Port-enabled systems will need to be able to bridge to HDMI and/or DVI for the immediate future. But supporting both the physical and logical components of multiple interfaces in a single chipset isn’t an attractive option. After all, such an approach introduces both added complexity and cost burdens. Integrating three interfaces, for example, would require three sets of output ports/pins with each interface requiring up to 11 pins. Alternatively, designers could use an internal multiplexer to output all three interfaces to the same port (see Figure 1a). The electrical interface for the combined port would need to remain the same, though. As a result, an external electrical-conversion device would be required.
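The pin arithmetic behind this trade-off is straightforward. As a back-of-the-envelope sketch (the 11-pins-per-interface figure comes from the text; note that the muxed option still requires the external electrical-conversion device described above):

```python
# Back-of-the-envelope pin budget for the two on-chip options described above.
# "Up to 11 pins per interface" is the figure cited in the text.
PINS_PER_INTERFACE = 11
INTERFACES = ["DisplayPort", "HDMI", "DVI"]

separate_ports = PINS_PER_INTERFACE * len(INTERFACES)  # three dedicated ports
muxed_port = PINS_PER_INTERFACE                        # one shared port (Figure 1a)

print(separate_ports)  # 33 pins for three dedicated ports
print(muxed_port)      # 11 pins, but an external converter is still required
```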
Another option for designers would be to build a chipset with a single Display Port output that is converted both logically and electrically to HDMI or DVI via an external full bridge (see Figure 1b). Yet full bridges significantly impact bill-of-materials (BOM) cost, as both the electrical signal and logical protocol must be processed and then converted.
Perhaps the most important factor to consider when evaluating which approach to take is the difficulty in implementing a high-voltage, high-speed interface using advanced process technologies like 45 nm. To keep die sizes down, minimize current consumption, and increase processing speeds, IC vendors must utilize these advanced process technologies. They are therefore forced to work with the limitations that accompany them. For instance, one major limitation with 45-nm process technology is its 2.5-V maximum transistor support at the IC's I/Os. This issue becomes critical because the HDMI/DVI standard requires 3.6-V support on the high-speed pins and up to 5.25 V on the low-speed sideband signals. Consequently, an HDMI port cannot be offered in an IC using 45-nm process technology without resorting to specialized and proprietary design measures. Such measures could increase die size, complexity, and cost.
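A quick compatibility check makes these voltage constraints concrete. The 2.5-V, 3.6-V, and 5.25-V figures come from the text; the 0.25-micron limit and all names below are illustrative assumptions, not vendor data:

```python
# Hypothetical sketch: check whether an interface's voltage requirements fit
# within a process node's maximum I/O transistor voltage. The 45-nm limit and
# the interface voltages come from the article; the 0.25-um limit is assumed.
PROCESS_MAX_IO_V = {"45nm": 2.5, "0.25um": 5.5}

INTERFACE_REQ_V = {
    "DisplayPort": 2.0,          # high-speed signals stay below 2 V
    "HDMI/DVI high-speed": 3.6,  # TMDS pins
    "HDMI/DVI sideband": 5.25,   # low-speed sideband signals
}

def fits(process: str, interface: str) -> bool:
    """True if the process's I/O transistors can drive the interface natively."""
    return INTERFACE_REQ_V[interface] <= PROCESS_MAX_IO_V[process]

for iface in INTERFACE_REQ_V:
    print(f"45nm drives {iface}: {fits('45nm', iface)}")
# Display Port fits at 45 nm; the HDMI/DVI levels do not, which is why
# the TMDS side belongs in an older, higher-voltage process.
```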
Clearly, the use of proprietary technology drives up the cost of the electrical portion of the HDMI interface. It also raises the expense of manufacturing the entire graphics or video chipset. Instead, the high-voltage I/O transistors necessary for HDMI are more cost-effectively implemented in a 0.25-micron process technology. Separating the logical and physical components of the HDMI/DVI interface enables each to be implemented in the most cost-effective process technology for the transistors required.
A key consideration is that these benefits of separation are irrelevant to the design of Display Port silicon. High-speed Display Port electrical signals never rise above 2 V (compared to TMDS' 3.6 V). Display Port circuitry can therefore be implemented efficiently in the most cost-effective process for the application as a whole: 45 nm.
From this perspective, designers who want to balance signal integrity, complexity, and cost have the option of implementing both the logical and physical components of the Display Port interface, but only the logical protocol for HDMI/DVI, within the graphics/video chipset. To connect to an HDMI monitor, the chipset is configured to generate the HDMI logical protocol and encapsulate it within a Display Port electrical signal. That signal is then converted to an HDMI electrical signal by an external bridge (see Figure 1c). Because the chipset is designed for processing performance, the Display Port, HDMI, and DVI logical interfaces are most cost-effectively implemented there. By implementing the appropriate HDMI or DVI interface in an external bridge, the electrical conversion, with the high-voltage I/O transistors necessary for TMDS, can be implemented at its lowest cost.
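One way to picture the Figure 1c partitioning is as a two-layer signal in which the bridge swaps only the electrical layer. This is a hypothetical model for illustration; all names are invented, not a vendor API:

```python
# Hypothetical sketch of the Figure 1c partitioning: the chipset emits the
# HDMI *logical* protocol over Display Port *electrical* signaling, and the
# external bridge converts only the electrical layer. Names are illustrative.
from dataclasses import dataclass

@dataclass
class Signal:
    logical: str     # protocol encapsulation: "DisplayPort", "HDMI", or "DVI"
    electrical: str  # physical signaling: "DP" (low-voltage) or "TMDS"

def chipset_output(target_monitor: str) -> Signal:
    """45-nm chipset: implements all three logical protocols,
    but only the low-voltage Display Port electrical layer."""
    return Signal(logical=target_monitor, electrical="DP")

def electrical_bridge(sig: Signal) -> Signal:
    """0.25-um bridge: swaps the electrical layer only; no logical
    re-encoding, so it adds no protocol-conversion latency."""
    if sig.logical in ("HDMI", "DVI"):
        return Signal(logical=sig.logical, electrical="TMDS")
    return sig  # native Display Port passes through unchanged

print(electrical_bridge(chipset_output("HDMI")))
# HDMI protocol, now carried on TMDS electricals for a legacy monitor
```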
Separating the logical and physical components of the HDMI/DVI interface using a bridge decreases overall cost. It also minimizes latency and jitter because the signal doesn't require a logical conversion. Nevertheless, it can be argued that an external bridge adds complexity to a system as well as undesirable jitter. After all, each discrete IC in a system increases routing complexity while adding another source of jitter and noise that must be contained. From a system perspective, however, an external bridge need not burden designers in this way. Consider that any interface exposed to the real world, especially in consumer-electronics applications, must protect itself from electrostatic discharge. Often, this goal is accomplished using a discrete ESD protection circuit. Specifically for display applications, revision 1.3 of the HDMI standard requires 8000-V protection at the interface connector. It isn't yet clear whether providing 8000-V ESD protection at 45 nm is even feasible. Yet there are more pressing reasons not to integrate this function onto a graphics or video chipset. When a discharge occurs, the ESD protection circuit is the first component behind the connector and takes the damage. If ESD protection is integrated into the display processor, the processor itself will be damaged. If the ESD damage is severe enough, the system will be brought down. In addition, it may not be economically possible to repair the problem by replacing the damaged processor; the entire system would then have to be scrapped.
When ESD protection is provided by an external device, this discrete component bears the failure and makes it possible to recover the system much more cost effectively. The problem with ESD devices, however, is that they are passive components that add unwanted capacitance and distort any signals through which they pass. Depending on the application, they also add a system cost of approximately 30 cents per port.
Because ESD protection is required in any case to protect the main graphics or video chipset, there are already a minimum of two ICs in the signal path. By integrating the bridge and ESD protection circuit together rather than implementing them as separate devices, the number of ICs in the signal path remains at two. In terms of HDMI and DVI bridging specifically, ESD protection is a well-established technology at 0.25 microns. Also, an integrated approach absorbs the cost of the protection circuit better than implementing it as a discrete component.
From a signal-quality perspective, a discrete ESD protection device would interfere with signal pre-emphasis while degrading signal quality. A combined bridge/ESD device, on the other hand, eliminates both the external ESD protection circuit losses and jitter. After all, pre-emphasis circuitry can be integrated with the ESD circuit (see Figure 2). If the ESD protection circuit does happen to fail, it will cause the Display Port/HDMI bridge to fail, but not the system. The bridge can be replaced as easily as a standalone ESD device.
Using an external electrical bridge offers the most cost-effective way to implement a DP-to-HDMI bridge. Yet designers also need to consider the different partitioning options available and how they impact signal quality. The external electrical bridge IC can be placed on the system board, within a docking station, on a cable adapter, or within the monitor itself (see Figure 3). As discussed previously, bridging within the graphics or video chipset isn’t feasible from a cost, power, and process-technology perspective.
Bridging on the system board (internal bridging) offers the most reliability. Yet bridging using a cable adapter (external bridging) offers the most flexibility. Internal bridging fixes the display interface so that users cannot be confused about what monitors are supported or how they’re connected. This is especially important for desktops sold to the consumer market, where plugging in a monitor must be foolproof. A cable adapter, on the other hand, enables laptop users to connect to existing HDMI monitors as well as Display Port monitors (once they become available).
Of these options, bridging within a laptop docking station is the most limited. Docking stations are used by only a small percentage of laptop users. Given the potential volumes, it may be more cost effective for docking-station manufacturers to use a dongle than to provide internal bridging.
Where bridging takes place determines the number of connectors over which a signal must pass and still remain reliable. In the case of cable adapters and docking stations, the bridged signal must pass through a connector on either side. Yet an internal bridge only passes over one connector on the way to the monitor. Each connector reduces signal integrity and the overall distance that the signal can be driven.
In either instance, bridges present an opportunity to improve signal quality through well-established signal-enhancement technologies. On the bridge's receive side, equalization can remove the jitter introduced over the distance between the graphics/video chipset and the bridge. On the bridge's transmit side, pre-emphasis can be added to the signal, enabling it to be driven over longer distances and through additional connectors. Figure 4 shows a completely closed eye passing through jitter-elimination circuitry. With a proper electrical bridge and ideal jitter-elimination circuitry, distance and the number of connectors are no longer a concern.
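The transmit-side pre-emphasis described above can be sketched numerically. This is a minimal illustration of the general technique, not a model of any particular bridge; the 25% boost factor is an arbitrary assumption:

```python
# Minimal sketch of transmit-side pre-emphasis: overdrive each symbol
# transition so high-frequency content survives lossy traces and connectors.
# The 25% weight is an arbitrary illustrative choice (a 2-tap FIR de-emphasis).
def pre_emphasize(bits, boost=0.25):
    """Map a bit stream to drive levels, boosting the first symbol after
    every transition and attenuating repeated symbols."""
    levels = [1.0 if b else -1.0 for b in bits]
    out = []
    prev = 0.0
    for v in levels:
        # full swing plus boost on transitions; reduced swing when repeating
        out.append(v * (1.0 + boost) if v != prev else v * (1.0 - boost))
        prev = v
    return out

print(pre_emphasize([1, 1, 0, 0, 1]))
# transitions are overdriven; repeated bits are attenuated,
# flattening the channel's low-pass response
```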
Employing both equalization and pre-emphasis in the bridge is a key factor in simplifying board design and layout. The elimination of jitter provides a higher eye margin over longer traces, thereby opening the eye and ensuring that the final signal will meet specified integrity constraints. The signal can be adjusted based on the application-specific characteristics of the interface. As a result, system designers will have more freedom in graphics/video chipset and connector placement because they can compensate for deterministic jitter and noise. Rather than having to minimize the chipset-to-connector distance as a key factor affecting signal integrity, chip placement becomes a matter of overall convenience. The addition of an external bridge then simplifies rather than complicates design.
Once Display Port is established as the dominant monitor/projector interface, there will be no need to bridge to legacy HDMI or DVI displays. In the meantime, system designers need to design ICs that minimize their costs today while leaving their options open to reduce costs again tomorrow. Moving to a native Display Port interface within graphics and video chipsets is the first step toward enabling these cost benefits. By employing external bridges for electrical conversion, designers retain maximum performance at 45 nm. At the same time, they avoid burdening these chipsets with the extra process layers, and the cost, required to support TMDS. These same chipsets will be able to support native Display Port monitors without a costly respin. With proper bridge design, jitter can also be eliminated through equalization and pre-emphasis technology while optimizing signal integrity. At the same time, system designers gain maximum flexibility with regard to chipset, bridge, and connector placement.