Doubling Down: Design-Side Issues of Double Patterning
As we delve into the details of double patterning, some of the obstacles begin to look familiar. Double-patterning issues are very similar to those encountered when strong phase-shift masks (s-PSM) first arrived, giving us the ability to insert phase-shift windows that enabled gate shrinks. With s-PSM, however, the number of structures that had to undergo this process was small compared to the total number of structures in the complete layout. In double patterning, not only are there many more structures that must be decomposed, but these structures are often longer, increasing the likelihood of conflicts. Fortunately, the lessons we learned from s-PSM and lithographic control have helped define a set of restrictions that can make double patterning viable.
The challenge for double patterning is to identify two things. First, patternability: a set of rules that tells you whether a layout can be safely decomposed in isolation. Second, composability: a set of placement and routing rules or algorithms that guarantees the complete layout remains decomposable.
Patternability applies during layout construction, such as standard cell design and routing style definition. Patternability is a relatively easy problem to solve and can generally be taken care of by the place-and-route tools once the rules are established, because it addresses intrinsic effects, which are local in nature.
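To make the "local check" concrete: decomposition is commonly modeled as two-coloring a conflict graph, where features closer than the minimum same-mask spacing must land on different masks, and an odd cycle in that graph is an unresolvable conflict. The sketch below illustrates that formulation only; the feature model, the `euclid` helper, and the spacing parameter are hypothetical simplifications, not any tool's actual rule deck.

```python
import math
from collections import deque

def euclid(a, b):
    """Distance between two point-like features (hypothetical model)."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def is_decomposable(features, min_same_mask_spacing, distance):
    """Check whether a layout fragment can be split onto two masks.

    Build a conflict graph: any two features closer than the minimum
    same-mask spacing must go on different masks. The fragment is
    decomposable exactly when this graph is two-colorable (bipartite);
    an odd cycle is a decomposition conflict.
    Returns a 0/1 mask assignment per feature, or None on conflict.
    """
    n = len(features)
    adj = [[] for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            if distance(features[i], features[j]) < min_same_mask_spacing:
                adj[i].append(j)
                adj[j].append(i)

    color = [None] * n
    for start in range(n):
        if color[start] is not None:
            continue
        color[start] = 0
        queue = deque([start])
        while queue:  # BFS two-coloring of this connected component
            u = queue.popleft()
            for v in adj[u]:
                if color[v] is None:
                    color[v] = 1 - color[u]
                    queue.append(v)
                elif color[v] == color[u]:
                    return None  # odd cycle: unresolvable conflict
    return color
```

A chain of closely spaced features alternates cleanly between the two masks, while three mutually close features form a triangle (an odd cycle) and fail, which is exactly the kind of configuration the patternability rules must forbid.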
In comparison, composability issues require layout designers to recognize that new configurations can be created by placing structures next to each other. A typical example is during standard cell placement when multiple IP blocks are integrated to form a design. Each layout block can be double-patterning compliant by itself, but when placed next to another, the combination may create structures that cannot be adequately decomposed for a double-patterning process.
The reactive, or "brute force," approach to double patterning, which we are using now, determines only whether structures are patternable. Composability is not addressed during layout; everything is checked once placement is complete. If that check fails, we have to go through placement all over again, iterating until we finally succeed.
The proactive approach is to determine composability and enforce it during placement. A simple composability rule would be to spread all elements sufficiently far from each other, but that reduces the cost benefit of migrating to a more advanced process node. To make double patterning more cost-effective, IP providers must publish the double-patterning characteristics of their cells, so the placer tool can prevent forbidden configurations efficiently.
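As one illustration of how a placer might consume such published characteristics, the sketch below checks whether two pre-decomposed cells can legally abut. The cell model is entirely hypothetical: each cell publishes its near-boundary features as (offset, mask) pairs, and the check exploits the fact that a valid two-coloring remains valid when both masks are swapped, so the placer can try flipping one cell's mask assignment wholesale before declaring the configuration forbidden.

```python
def can_abut(left_boundary, right_boundary, min_spacing):
    """Placement-time composability check between two abutting cells.

    Hypothetical model: each boundary is a list of (offset, mask)
    pairs along the shared edge, with mask 0 or 1 taken from the
    cell's own decomposition. Two same-mask features within the
    minimum same-mask spacing across the boundary form a forbidden
    configuration.
    """
    def conflict(flip_right):
        for y1, m1 in left_boundary:
            for y2, m2 in right_boundary:
                m2_eff = 1 - m2 if flip_right else m2
                if abs(y1 - y2) < min_spacing and m1 == m2_eff:
                    return True
        return False

    # Swapping both masks preserves a valid two-coloring, so also
    # try the right cell flipped (a conservative, cell-global remedy).
    return not conflict(False) or not conflict(True)
```

When a direct clash can be cured by flipping the right cell's masks, the placement is accepted; when both orientations clash, the placer must insert spacing or choose a different neighbor, which is the kind of decision the brute-force flow only discovers after the fact.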
The design-side requirements of double patterning are simple in concept, but executing them is challenging: it requires communication between the manufacturers and the design teams. If we take the time to figure out these rules and create composability algorithms that eliminate the brute-force approach, then double patterning will incur no additional time on the design side. The main challenge is getting the entire necessary infrastructure ready for use, meaning everyone is working on the decomposition rules and defining those structures we simply cannot make.
We remain optimistic that double-patterning issues can be minimized. Double patterning is certainly more feasible than relying on alternative 22nm process technologies: nanoimprint is still in its infancy, EUV looks as though it will be late, and while there are other promising techniques out there, they come with their own limitations. So, as we see it, the most likely candidate for 22nm designs is double patterning. Designs that are ready to use double patterning effectively, minimizing area penalties, will have a cost advantage and a better chance of first working silicon.
Andres Torres is currently the technical lead of the Litho-Friendly Design group at Mentor Graphics.