Collaborative Advantage

Just another Chip Design Blog

Steve Schulz

If you are attending DAC this year, or still thinking about it, then read on — you will not want to miss out on all the special activities Si2 is hosting this year. I’ll give you the executive summary here, then a link to learn more and register at the end.

To begin with, Si2 has a new and improved format at DAC this year. Coordinated in partnership with the DAC powers-that-be, Si2 now has an all-day co-located event on Monday called “Si2 Roundup @ DAC: Standards in Action”. This event includes a smorgasbord of shorter (approx. 90 minute) sessions, designed to give you concise and focused information about some of the active engineering projects currently underway at Si2. The sessions will cover hot topics in the OpenPDK Coalition, DFM Coalition, and our newest project, the Open3D TAB. These sessions will include speakers from such companies as (in no particular order): IBM, GLOBALFOUNDRIES, Samsung, LSI, Synopsys, Qualcomm, Altera, Apache Design, Cadence Design Systems, Mentor Graphics, STMicroelectronics, and Intel. A detailed agenda is located here:

But there’s much more. This year, we are celebrating the 10th anniversary of the largest standardization effort in the history of EDA, OpenAccess, with two special events. First, Si2 is hosting a free luncheon for DAC attendees that will also serve as Si2’s annual open meeting, where we will announce election results for Si2’s esteemed Board of Directors and present a short “state of the union” address (no doubt while folks are enjoying their cheesecake). Si2 is grateful for our luncheon (and reception) sponsors, including Cadence Design Systems, GLOBALFOUNDRIES, LSI, NanGate, and Spectral Design & Test.

But wait – there’s more! This year’s OpenAccess session will take an enjoyable look back at how OpenAccess actually came into existence (which will surprise many) and present both live and taped messages / testimonials from the leaders that have shaped OpenAccess and our industry over the years. We will conclude with special 10-year awards recognizing the amazing technical contributions that made this all possible.

But wait – there’s even more! Following the final session on Open3D, Si2 is hosting an open social networking reception, where we will continue to bask in the celebratory mood with drinks and light hors d’oeuvres for all attendees.

But wait — there’s still more! It is not normally Si2’s modus operandi to engage in marketing giveaways, because we believe the fundamental value of our members’ work will speak for itself to those who truly have an interest. However, this year seemed like the right time to offer giveaways specifically to celebrate this milestone moment together, and to remember it for years ahead. So, in that spirit, Si2 will have some “hot” giveaways at Si2’s booth: “OpenAccess @ 10” sport shirts and 2GB flash drives. Enjoy!

So, where can you go to sign up for (some or all of) these events? Please click here right now:

On behalf of my entire dedicated staff, I greatly look forward to seeing you at DAC!!


Steve Schulz

This year, Si2’s OpenAccess standard turns 10… so we are announcing a year of special celebrations to mark this very significant occasion! I will use this blog to reveal some of the plans for this year, so read on. But first, a little historical perspective.

The origins of OpenAccess reach back to the mid-1990s and the SEMATECH Design FTAB, which envisioned a common data model and interoperable C++ API that would enable high-performance, fully integrated design tool flows, with the flexibility to mix and match best-in-class commercial EDA tools with selected in-house tools. Led by major semiconductor IDMs, several Design FTAB members had already proven the concept’s value with internal technology, but wanted to shift to more commercial EDA tools. The FTAB developed excessively detailed specifications for both the design environment and a base set of tools (the “Chip Hierarchical Design Standard”); however, this was not a good recipe for EDA vendor adoption, and the effort failed. Eventually, the concept, along with some seed funding, was transferred to Si2 around the year 2000.

A new Design API Council (DAPIC) was formed at Si2, which took a decidedly different approach. Si2 realized specs alone were insufficient, and that the EDA industry needed to be an integral partner for any chance of buy-in and adoption. Following DAPIC’s release of a Request For Technology (RFT) to industry, Cadence was the sole vendor to offer a database / API technology in line with the RFT — and even so, it would need to be rewritten in C++ from scratch. The final sticking point was the need for source code, which nearly killed the deal; however, some very open-minded people worked together and pushed this through in what was a revolutionary first in EDA. Incidentally, Motorola’s Board representative at that time led the campaign for source code… that person is Dr. Sumit DasGupta, now Si2’s Sr. VP of Engineering (this may very well be another case of, “Be careful what you ask for…”)!

Starting with the DAPIC core members, Si2 formed a new OpenAccess Coalition (OAC) and recruited 12 founding members, which quickly grew to 16 in 2002. After Cadence contributed the first C++ reference implementation (RI), Si2 did some testing and packaging, and released OpenAccess 2.0 on Si2’s website for OpenAccess Coalition members in December 2002, and then to the general public in February 2003. All changes to OpenAccess since that initial contribution are controlled by the OAC’s elected Change Team and owned by Si2 on behalf of the OAC. Si2 has developed user guides and reference manuals, (online) training classes with labs, and add-on software utilities, and has received numerous contributions of additional software and other technology. Today an “Extension Steering Group” complements the Change Team to support such useful extensions as the new OpenAccess scripting language framework (currently supporting four languages).

OpenAccess has progressed from raw vision to early reality, and crossed the chasm of mass adoption over 5 years ago. Its scope of adoption has continued to expand as industry confidence grew – from SoC, to FPGA, to analog/RF circuits, to memories, to MEMS / sensors, and up to the most advanced microprocessors. Foundries are OAC members because it is part of their reference flows and library development. VC funding for EDA startups has been conditional upon building their product lines on top of OpenAccess for efficiency. Cottage industries (and external standards groups) have emerged around the OpenAccess ecosystem. Researchers are now using OpenAccess for the co-design of silicon photonics with electronics. The sky is the limit and the future is bright.

In the coming months, here are a few things to anticipate as we mark the ten year milestone of OpenAccess (many of these will be rolled out at DAC):
• Special event at DAC 2012: “OPENACCESS @ TEN”, includes free lunch (Monday, 12:15–3:00pm)
• Si2 Annual Open Reception at DAC 2012: A Celebration of OpenAccess (Monday, 4:30–6:00pm)
• Live / recorded reflections on how OpenAccess came to be, from some of the original visionary leaders that made it possible
• Testimonials on OpenAccess’ current value, spanning the semiconductor supply chain
• Special recognition awards given to some of the key technical leaders over the years – unsung heroes that deserve our thanks
• “Ten Years of OpenAccess” giveaways, such as polo shirts, notepads, pens…
• Media articles covering new (or lesser-known) capabilities of OpenAccess you may find useful
To learn more about the DAC events, just click here, or click here to register.
All of us at Si2 want to share with you our excitement about this very special milestone in EDA. So, let’s all celebrate and have some fun!


Steve Schulz

The Synopsys acquisition of Magma has generated quite a bit of commentary lately, from how it will impact customer choice in tools, to pricing, to what it will do to EDA innovation, and more. The merger itself shouldn’t really be particularly surprising, and in general these waves and troughs are a normal part of the business cycle adapting to a dynamic market environment. So whether it’s Cadence in a hostile takeover bid, or Carl Icahn loading up on shares of Mentor in expectation of a sell-a-thon, or Synopsys acquiring its longtime adversary, some form of continual change ought to be expected at any point in time. This includes the inherent risk to stability for all those customers and partners who have a dependency that could suddenly go away.

Which brings me to my main point. Far too often, standards within the EDA world are oversimplified and mis-characterized as fairly small – in effort and in impact – with modest technical goals to align some file format syntax and reduce an engineer’s inconvenience as he or she moves through the design flow. However, a significant merger can serve as a wake-up call and reminder of the critical business importance that standards can play in managing risk. Typically, the larger and more complex the standard, the larger the risk of not having an open, effective standard in place.

So, who benefits and who loses with more interoperability? Well, it’s pretty clear that all customers will benefit, because it provides greater choice in tools, including alternatives that may suddenly be necessary when your favorite tool is discontinued. In addition, the customer’s design data, stored in that vendor’s tool format over years of design cycles, can be leveraged into new flows far more easily if open formats with documented semantics have been the foundation for their technical data — the customer’s IP.

Similarly, partners who had depended upon proprietary formats with obscure semantics could find themselves much more at risk than if they had utilized interoperable formats that have no single point of control or failure.

Some might think that this means that the EDA vendors wrestling for control would somehow not benefit as much; however, it turns out that the acquiring vendor also wins with open standards. For as soon as the deal closes, they must either integrate the tools from across both companies into something that they can sell and support, or quickly offer a reliable path for their inherited customers’ design data to migrate smoothly into their own anointed tools before lawsuits get filed.

This task is almost always harder in reality than it seems when the merger is initially conceived. First of all, a good portion of the top employees who created the “losing” side’s internal formats and data structures may have long since left, or, if still around, will leave right after the merger – leaving the acquirer to figure out all those ugly details alone. This is “lossy” by definition and, without a broader community to draw from, can be error-prone and can take a very long time to get right. In one case in point, the adoption of OpenAccess at Cadence was reportedly seen to have a 20x or greater ROI because it would streamline re-engineering work after a merger, and would help deliver market-winning solutions faster. Remember: time is money, especially in the fast-moving semiconductor marketplace.

So, while there is always a place for proprietary formats and APIs in a changing EDA world, the majority of costly and risky “pain points” associated with data exchange and tool integration in mergers are best mitigated with a thoughtful and strategic business commitment to open standards in advance.


Steve Schulz

Developing industry standards can be tough business. Getting them successfully and broadly adopted can be even harder. To re-purpose a very old phrase for EDA standards, “Many are called, but few are chosen”. To be sure, sometimes a few engineers can hit the bullseye defining a standard for the entire industry on their first try. Or, every so often, an existing proprietary file format or language might get donated, stamped “approved”, and be welcomed by all competitors without a fuss. However, one well-respected IEEE standards leader estimated that 3/4 of all standards fail to achieve broad adoption after being approved. That seems like a lot of wasted effort. I get approached from time to time by Board members at other electronics industry consortia who want to know how Si2 measures effectiveness and performance. So, I’ll use today’s blog to share that information with all of you.

Si2 uses a set of annual metrics to assess performance and reward success. When I came on board in 2002, I set up a structured approach using a spreadsheet of priority-weighted categories and items that all multiply and add up to 100%. There are four top-level categories of metrics:
1. Adoption
2. Relevance and Influence
3. Engineering Execution
4. Fiscal Health
The largest priority weight is placed on adoption, which is the point when the ROI from the members’ investment finally gets realized. Relevance is about focusing on the most needed problems, and at the right time. Influence is about having the membership and marketing clout required to successfully drive the changes needed to succeed in our mission. Engineering execution ensures that the working groups are properly managed, and that delivery dates are met for the standards and any adoption aids (including software, libraries, test cases, and training). Si2 also performs an annual engineering satisfaction survey whose results serve as a useful metric. Fiscal health keeps a check on our retained equity (“fund balance”), but also tracks forward-looking cash flow headroom. Adoption can be the trickiest to measure, especially when most efforts span multiple years, but there are indirect leading indicators that can serve as reasonable proxies even when full commercial adoption is at a later phase.

Within each category, several specific metric items are defined along with a priority weighting. We create three success scenarios for each item: the main goal counts for 100%, partial success counts for 50%, and achieving a challenging stretch goal counts for 125%.
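The rollup described above can be sketched in a few lines of Python. The category weights, item weights, and scenario scores below are invented for illustration only — they are not Si2’s actual metrics.

```python
# Priority-weighted metrics rollup: category weight x item weight x score,
# summed across all items. All names and numbers here are hypothetical.

CATEGORIES = {
    # category: (category weight, {item: item weight within category})
    "Adoption":              (0.40, {"new_adopters": 0.6, "training_signups": 0.4}),
    "Relevance & Influence": (0.25, {"member_growth": 1.0}),
    "Engineering Execution": (0.20, {"on_time_releases": 0.7, "survey_score": 0.3}),
    "Fiscal Health":         (0.15, {"cash_flow_headroom": 1.0}),
}

# Each item is scored against three scenarios:
#   partial success -> 0.50, main goal -> 1.00, stretch goal -> 1.25
SCORES = {
    "new_adopters": 1.00,
    "training_signups": 0.50,
    "member_growth": 1.25,
    "on_time_releases": 1.00,
    "survey_score": 0.50,
    "cash_flow_headroom": 1.00,
}

def org_metric(categories, scores):
    """Return the overall metric: 1.0 means every main goal met exactly."""
    total = 0.0
    for cat_weight, items in categories.values():
        for item, item_weight in items.items():
            total += cat_weight * item_weight * scores[item]
    return total

print(f"Organizational Metrics: {org_metric(CATEGORIES, SCORES):.1%}")
```

With these invented numbers the rollup lands just over 95%: the stretch goal on the heavily weighted adoption category is offset by the two items that only reached partial success, which is exactly what the weighting is designed to surface.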

These metrics are jointly defined at the beginning of each year with input from my staff and our members, then revised and approved by Si2’s Board of Directors. During the year, I review our progress monthly with my team during staff meetings, and we discuss ways to “unblock” barriers for the more problematic metric items. Each January, the prior year’s achieved results are then reviewed by the Board, and a final Organizational Metrics % figure is approved. That number directly affects the variable pay component for everyone at Si2, providing a strong incentive to succeed in all four categories. The higher the level, the more the variable component plays in the compensation. This is how it should be, to keep everyone focused on success toward our mission and priorities.

Having such structured, schedule-oriented metrics for developing and delivering standards may seem out of place at first for an organization that often starts with a less-than-complete vision for an industry solution, relies primarily on volunteer resources, and navigates complex technical, business, and market dependencies, all within a small non-profit. However, while Si2 may not be part of the for-profit, competitive products business, our members are. So, we must find creative ways to deliver and satisfy their real-world needs, within the schedule requirements they set for us.

It seems to me that a structured approach to assessing performance and adoption metrics ought to be a part of the Board-level accountability process for every standards development organization as a best practice. While no metrics are perfect, I can assure you that this system is highly effective at identifying where change is needed, and quickly motivating creative solutions.


Steve Schulz

Many readers may be unaware of a new standard coming out of Si2’s DFM Coalition early next year — one that I have yet to mention before now, but one that has significant potential benefits across the supply chain. So, I will use this blog to introduce OPEX — the Open Parameters for EXtraction. Someone suggested OPEX could also be an acronym for ‘OPerational EXcellence’, and here’s why.

OPEX was created to address a growing risk that has already cost the industry failing chips with multi-million-dollar price tags and, even worse, significant impact on tape-out schedules due to re-spins. The root problems include:

1. the inability to confirm the validity of process parameter changes and ECOs across disparate design teams scattered around the world, across varying design flows using different tools;
2. the inability to confirm the validity of derived process parameter data against the master source data;
3. a lack of synchronization to changes in EDA vendor parasitic formats that are required to keep up with advancing foundry processes, making the mapping between the formats difficult and error prone (these commercial format changes may occur several times each year);
4. no existing semantic standard for naming conventions, ranges and units, or for process parameters or inter-parameter relationships, leading to more confusion;
5. errors in translation among parasitic formats, and difficulty in managing the maintenance of the various translators (which are often not lossless).

To manage these growing risks, companies have resorted to using multiple experts to review the details of how to map process parameter names, units, ranges, and relationships; maintaining wasteful conversion utilities and regression suites; and, after costly chip failures, hiring consultants to perform business-process audits to find, repair, and improve the change procedures involving process parameters. It’s not a pretty picture.

The DFMC member companies have defined a solution to this problem with OPEX:

1. “Input Once, Use Many” — the master “Golden Source” process parameter data must be open, unambiguous, and comprehensive, easy to encrypt, track, control, distribute, and access across both commercial tools and in-house utilities, and easy to access / use over the internet (“in the cloud”).
2. “Golden Structure” for all process parameter data — based on XML with XSD structured templates, it can auto-verify compliance to naming, units, ranges, and relationships upon data entry or conversion.
3. Provably-Correct Export to SQL, UML, Excel — OPEX is more than a “file format”, it is an open XML/XSD database schema complete with verification routines and format translation support through integration with XML editors, SQLite databases, multiple scripting languages, and MS-Excel.
4. Data Visualization — Using 2D and 3D graphing features of Excel, process parameter data can be easily viewed to find corner cases that design teams wish to avoid that may affect performance or yield.
5. Ensure Interoperability — OPEX has been created to be fully compatible and bi-directionally lossless, based upon the generous contributions of ITF, ICT, and MIPT from EDA leaders Synopsys, Cadence, and Mentor Graphics, respectively. OPEX does not compete with any existing parasitic format; it complements them with an open, standard database and enhances their value with design teams. This is analogous to how the LEF/DEF formats complement, but do not compete with, the OpenAccess database.
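To illustrate the “Golden Structure” idea in item 2 — automatically verifying names, units, and ranges when data is entered or converted — here is a minimal Python sketch. The real OPEX standard expresses these rules in XML/XSD; the schema entries and parameter names below are hypothetical, invented purely for illustration.

```python
# A toy "golden schema" check: every incoming process parameter must use a
# known name, the expected unit, and a value within the allowed range.
# Names, units, and ranges here are made up, not from the OPEX standard.

GOLDEN_SCHEMA = {
    # name: (expected unit, (min, max))
    "metal1_thickness":   ("um",     (0.05, 0.50)),
    "metal1_resistivity": ("ohm.um", (0.01, 0.10)),
    "ild_permittivity":   ("",       (1.0, 10.0)),
}

def validate(params):
    """Return a list of violations for a dict of {name: (value, unit)}."""
    errors = []
    for name, (value, unit) in params.items():
        if name not in GOLDEN_SCHEMA:
            errors.append(f"{name}: not a golden parameter name")
            continue
        want_unit, (lo, hi) = GOLDEN_SCHEMA[name]
        if unit != want_unit:
            errors.append(f"{name}: unit {unit!r}, expected {want_unit!r}")
        if not (lo <= value <= hi):
            errors.append(f"{name}: value {value} outside [{lo}, {hi}]")
    return errors

params = {
    "metal1_thickness":   (0.18, "um"),  # conforms
    "metal1_resistivity": (0.02, "nm"),  # wrong unit
    "metal2_thickness":   (0.22, "um"),  # name not in the golden schema
}
for e in validate(params):
    print("VIOLATION:", e)
```

The value of doing this at data-entry time, rather than after translation between formats, is that every downstream consumer — commercial tool or in-house utility — starts from the same verified “Golden Source”.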

OPEX has already been used in its “pre-release” state by several DFMC members on production chips because it is so complementary, and has been verified as lossless alongside vendor formats. DFMC members are also working with the Synopsys IMTAB, which advises on changes to the ITF format, to keep everything in-sync. OPEX is really an open XML / XSD standard, with data populated and manipulated using SQLite, scripts, MS-Excel analysis, XML editors, and import/export with commercial EDA formats.

Solutions like OPEX have great practical value because they truly represent a huge potential for cost / time savings, and are entirely complementary with current practice. If you are interested in learning more about OPEX and how you can start putting it to use, please contact Si2 — or one of the OPEX WG members.


Steve Schulz

Chip design teams are all pretty well trained by now to equate “IP” with a block of design content, either “soft” or “hard”, licensed from some external supplier, leveraged to improve design reuse and time-to-market. However, this blog isn’t about that. It’s about managing the “other” kind of IP… actually the original, broader legal interpretation of “intellectual property”, as in patented works. Specifically, I’m talking about the role of managing patents in the world of standards. It’s not what you worry about every day, but that’s why I’m writing this — to share a few core principles you really ought to understand if you develop, contribute to, or even use standards in your work.

In EDA, we used to create simple “file format” standards with little regard to patents – after all, what could possibly be patentable in a file filled with “Parameter == <value>” statements? Yet over time, much more sophisticated standards would emerge, closely correlating to much more valuable concepts and clever implementations as we kept advancing toward 20nm silicon techniques. Furthermore, many standards today routinely include software source code IP. These risks became real with consortium-related lawsuits over RAMBUS and SCO Unix, and legal departments now had to worry about the origins of contributed technology, the risks of adopting it, and the risks of sharing what they had if it related too closely to patented techniques.

For standards bodies, there are two broad categories of risk I’ll discuss:
1. The legal risk to a participating company’s own IP portfolio, and
2. The financial risk to using a standard that may contain hidden, latent licensing surprises.
The main idea behind the first risk, protecting the member’s IP portfolio, is to ensure that contributions of IP are always voluntary and explicit. There should be clear, written procedures for offering IP to (or excluding IP from) the group developing the standard. You should make sure that the standards group’s processes ensure that typical technical discussions or slide presentations among peers do not trigger an involuntary “contribution” of IP.

The central idea behind avoiding nasty royalty surprises after adopting a standard is to ensure a transparent, legally binding development process. This means that if any member of the group developing the standard has direct knowledge of IP in their company that might be necessary to adopt the standard, they must either exclude that IP in a certificate attached to the draft specification before voting, or be legally committed to offer “RAND” licensing terms to all requesters if the issue arises later. After the standard is published, a “reciprocal RAND licensing” clause then adds growing safety as other companies adopt the standard more broadly across industry. Essentially, each company accepting the license terms for that standard agrees to offer similar RAND terms to all others who developed the standard and/or accepted the license.

Si2 worked this all out 5-6 years ago with a thorough “IP Policy” that implements the above ideas. If you hear about a “60 day exclusionary period”, that just means that the standard is considered done, but is not yet published so any member has time to review and submit a RAND License certificate or Exclusion Certificate on that specification (if they so choose). The resulting standard can then be adopted as safely as possible, across industry, for years and years to come.

Steve Schulz

In case you missed it, GLOBALFOUNDRIES last week announced at Si2Con that they are contributing the full set of DRC+ data structures to Si2, to be integrated into Si2’s OpenDFM standard, which is developed and maintained by the DFM Coalition (DFMC). You can read the press release here:

I actually have little need to explain the technical or business benefits of DRC+ pattern-matching technology; fellow (professional) blogger Richard Goering has already done a truly fine job of that. Instead, I’ll add my own perspectives on what this donation means in DFM verification to chip designers and EDA / foundry partners.

First of all, DRC+ utilizes innovative (and award-winning) pattern-matching technology: it can run highly accurate design checks hundreds to thousands of times faster than typical model-based simulation, which saves valuable company time and resources near chip tapeout. Second, it can improve designs by going beyond “pass/fail” to identify yield-detracting patterns and recommend more robust ones. Third, EDA tools can flag patterns for automatic yield improvement, making it useful for chip design teams before tapeout. Of course, the classes of problematic patterns must be identified first – and Cadence supported the development of DRC+ with a pattern classification tool to do just that.
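To make the speed argument concrete, here is a toy sketch in the spirit of pattern-based checking: each small layout window is checked with a constant-time lookup in a library of known yield-detracting patterns, instead of running a physical simulation per window. The patterns, grid, and messages below are invented for illustration and are not the actual DRC+ data structures.

```python
# Toy pattern-based layout check: slide a 3x3 window over a binary layout
# grid (1 = metal, 0 = empty) and look each window up in a dictionary of
# known bad patterns. Dictionary lookup is O(1) per window, which is the
# source of the large speedup over per-window simulation.

YIELD_DETRACTORS = {
    ((1, 0, 1),
     (0, 1, 0),
     (1, 0, 1)): "checkerboard: lithography hotspot",
    ((1, 1, 1),
     (0, 0, 0),
     (1, 1, 1)): "dense parallel lines: bridging risk",
}

def scan(layout):
    """Return (row, col, reason) for every 3x3 window matching a bad pattern."""
    hits = []
    rows, cols = len(layout), len(layout[0])
    for r in range(rows - 2):
        for c in range(cols - 2):
            window = tuple(tuple(layout[r + i][c:c + 3]) for i in range(3))
            if window in YIELD_DETRACTORS:
                hits.append((r, c, YIELD_DETRACTORS[window]))
    return hits

layout = [
    [1, 0, 1, 0],
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 0, 0, 0],
]
for r, c, why in scan(layout):
    print(f"hotspot at ({r},{c}): {why}")
```

A real pattern matcher is far more sophisticated (rotations, reflections, multi-layer patterns, fuzzy matching), but the core trade — a one-time classification effort to build the pattern library, then cheap lookups at check time — is the same.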

What is so important about this announcement for the industry at large? To begin with, GLOBALFOUNDRIES’ generous decision to contribute this new, leading-edge innovation to the DFMC means that the entire industry ecosystem can benefit from DRC+ as an open industry standard. Not only will it now be an open standard, but it will be deeply integrated into the larger OpenDFM standard. All of the benefits of OpenDFM and all the benefits of DRC+ will come together, which is how the industry wants to see it.

DRC+ and OpenDFM are dynamic, living technologies, just as our design methodologies and process technologies are dynamic — constantly evolving with time and experience. So it is critical that the standard be managed as a truly collaborative effort, where no one company’s interest can stall or override the interest of other companies. It is also important that there be sufficient resources to evolve the standard rapidly according to industry’s changing needs (OpenDFM just had two releases in less than 1 year).

These new technologies are not sufficient as standards specifications alone: what makes them valuable is customer adoption support. This is the “real” stuff you need – such as XSD/XML DRC+ patterns, OpenDFM parser, design test cases, sample libraries, training / tutorials, etc. The DFMC is well-equipped to handle those needs, and do so in a strictly non-discriminatory fashion.

Overall, the ultimate value of the DRC+ donation by GLOBALFOUNDRIES is that the entire industry will enhance its capability by virtue of working together. That is the true essence of the title of this blog (“Collaborative Advantage”), and also Si2’s tagline: “Innovation Through Collaboration”. Please consider participating along with these other leaders of industry, to help make it even better.

DRC+ is a technology that happens only once or twice in a decade. It compresses days into hours and hours into minutes to improve yield without sacrificing accuracy. At Si2, we are proud to guide its development and provide its benefits to the semiconductor industry.

Steve Schulz
In case you didn’t already know, the 16th Si2-hosted conference highlighting industry progress in design flow interoperability comes to Silicon Valley (Santa Clara, CA) on October 20th. Si2Con will showcase recent progress of our members in the critical areas of:
1. Design tool flow integration (OpenAccess);
2. DRC / DFM / Parasitics interoperability (OpenDFM and OPEX);
3. Low power design (CPF, low-power modeling, and CPF/UPF interoperability); and
4. Interoperable Process Design Kits (OpenPDK)
Si2 is also ramping up a brand new effort to define standards for 3D / 2.5D design of stacked die (Open3D), and this event will be an excellent opportunity to meet with Si2 and members who will be present to find out more about it. To learn more about Si2Con and register for this exciting event, go to:

While I could next delve into the keynote talks and session contents, I’d rather use this opportunity to share our thinking on the event’s change in name and other behind-the-scenes aspects. Let’s begin with the name change!


Back in the early 2000s, Si2 began hosting workshops around the new OpenAccess vision and technology, with the goal of helping advance OpenAccess adoption through sharing of requirements, experiences, and technical knowledge, and broadening the interest and participation in its guidance and development as a true community effort. These workshops grew into the “OpenAccess Conference”, which was successful – so much so that we even doubled them to twice per year during the helter-skelter years of rapid changes and initial adoptions around the industry.

Once Si2 proved to be good stewards of OpenAccess (and LEF/DEF), our members brought us into an emerging area of need with the DFM Coalition, which also was covered in a parallel track at the OpenAccess Conference. A similar pattern repeated itself with the Open Modeling Coalition, Liberty TAB, and Low Power Coalition as well. By the time we began covering progress from the LPC in 2006, we began expanding the name slightly to “OpenAccess+ Conference”. Last year, we hinted at the broader scope of coverage with the name “Si2/OpenAccess+ Conference”, to get industry used to associating the long-familiar “OA Conference” with this broader name. All of these were interim, incremental transitions in brand management toward the final “Si2 Conference” name to reflect the full scope of coverage.

As press time drew near, we started thinking about keeping the event name short and simple, one that might lend itself to a logo. Si2 is definitely an engineering-centric organization, so logos were never very important in those early years, but as Si2 expands its breadth of membership across the supply chain, it becomes more important to establish a visual identity. Hence, we have begun this process by dubbing the event simply “Si2Con”, and creating a basic logo to match. We are interested in more creative name and logo ideas – we are even planning a “conference naming contest” ahead (so Watch This Space).

Why does Si2 pull together this annual conference? It’s very simple: our non-profit mission is to promote interoperability, improve efficiency, and reduce costs through enabling standards technologies – and these technologies are only valuable to the extent they are broadly adopted and used. That means bringing the right experts together on a topic to share technical challenges, educate industry peers on these new solutions, and share adoption experiences. Enthusiastic attendees tell us that the main benefits are networking with same-domain technical peers, listening to a wide variety of presenters who take a “flow” perspective as they do, and the ability to see live demos of interoperability progress in action.

As always, an excellent luncheon will be provided, sponsored by Cadence Design Systems.

Please consider joining us at ‘Si2Con’ and see for yourself!

Steve Schulz

Lately, there has been increasing discussion in the industry about the need for a set of standards that specifically support interoperable description of intent for analog and custom design, a.k.a. “analog design intent” standards. It is a big, complex, and intriguing topic, with multiple valid points of view representing different aspects of the supply chain. Most concede that it consists of a set of specifications, some being extensions of existing standards, supplemented by several new ones. With the analog and custom IC product landscape growing alongside a large consumer / mobile market, and the potential for increased tool automation also of growing interest to EDA vendors, it is hardly surprising that the topic has transitioned to more actionable, tangible calls for action in the standardization space.

Proponents calling for analog design intent standardization point to the rising percentage of effort spent on non-digital design and layout tasks, made worse by technology changes at advanced process nodes. While digital design enjoys formal executable models, automated synthesis, and top-down constraint specifications, the analog world relies on exchange of datasheets, manual topology selection and sizing, and less automated place-and-route and verification. Some reason that this relative gap in automation is largely due to the lack of formalism in the description of intent, and to the lack of commonality that prevents the various "views" from being exchanged between tools across a design flow. Open standard interfaces could permit far greater exchange of analog intent, with greater formalism and clarity, to improve time-to-market, quality, and efficiency for the industry as a whole.

Those who are less eager for such standardization argue that some of this information is really the result of proprietary IP investments made over many years, and that the resulting methodology used with those tools reflects internal product-level details as much as it does a generic description of intent. The point has also been made that the goal may be more about pricing leverage than about innovation in analog design intent.

Si2, playing such a prominent role in related standards such as OpenAccess, OpenPDK, and OpenDFM, is indeed the logical place for such a conversation — those efforts were in fact created in response to similar requests from our members and the industry at large. At this time, Si2 is seeking serious, qualified input from stakeholders in this arena, and we have no formal position either way. As with our past efforts, Si2's response will be rooted in careful analysis of the breadth of industry need, the potential for widespread benefit and ROI to outweigh the likely cost and effort, the ability to leverage existing standards and technologies, legal constraints, and the expectation of active participation and contributions to help the investment succeed. With five major active efforts at Si2, including the ramp-up of Open3D taking place as we speak, we are not looking for more work! However, we do have the ability to scale — but only if needed — to address important challenges within Si2's scope.

So, please share your thoughts with us on this topic. Si2 will listen carefully — to our members, but also to the broadest possible audience, the "engineering community" represented by readers of this blog. Feel free to add comments below for more general feedback. Or better still, come see what's going on at the 16th Si2 Conference on October 20 in Santa Clara:
Thank you!

Summer is a time for relaxing and recharging, at least for a bit. I took some time off (despite numerous comments about not updating my blog) to relax with family and good friends. This included an exciting trip: a week of white-water rafting 87 miles through the Grand Canyon. One look up at the nearly mile-high side walls of tiered rock and sandstone layers, as seen from the depths of the Colorado River below, gave new meaning to silicon stacking and vertical trenches!

We put in at Lees Ferry (near Marble Canyon, AZ) in a 4-passenger oar-driven raft, with supplies for 6 days of food. Drinking water was produced with a solar panel charging a battery that ran the 50-degree river water through several filters. The rapids ranged from class 3 up to class 5, and the rapidly melting ice water from Lake Powell moved the river at a brisk 26,000 cubic feet per second. We stopped for special sightseeing hikes several times per day and camped under the stars. One time we had to swim through icy water, then algae-laden stagnant pools, climb a rope 30 feet, then wade through more muck to reach "The Silver Grotto". We hiked up and down the mountainside to reach a "sideways waterfall" where we could swim and relax. Another time we hiked to the Little Colorado River, with its surreal bright turquoise-blue water (the product of calcium carbonate and copper sulfate minerals in the water); we were allowed to jump in and be carried away through its rapids.

Food was surprisingly good, as our guides prepared everything from steaks to cakes using LP gas. As we traversed downstream, the river's altitude kept dropping while the walls of the Grand Canyon stretched ever higher, until we reached our destination at Grand Canyon Village 6 days later. At that point, the water-to-peak height approaches one mile. After hauling our duffel bags to the mules at Phantom Ranch, we began the 7.8-mile hike up to the rim at about 7:15 a.m., reaching the top around 12:45 p.m. in the 100-degree Arizona heat. Quite an adventure!

Along the way, I was amazed at the variety of rock formations clearly visible in the (mostly) horizontal layers dating back 1.8 billion years. In places, violent forces from heat buildup had literally melted and bent the rock layers from horizontal into a nearly vertical orientation. We sure wouldn't want that to happen to stacked silicon dies, would we?
All Grand Canyon metaphors aside, our industry is indeed preparing to "go vertical" with mainstream production capability for 2.5D and 3D stacked die utilizing Through-Silicon Vias (TSVs), and part of that is setting up the common infrastructure required, including design data standards. After a successful kickoff meeting at DAC last month, industry experts are now joining the "Open3D" TAB under the auspices of Si2. Every part of the semiconductor and EDA / IP supply chain was represented at the DAC meeting, and some schedules are quite aggressive, with targets in the first half of 2012. If you are interested in participating in Open3D, please contact Si2 for more details. Si2 is coordinating its Open3D activities with other key consortia, including GSA, SEMATECH, SRC, IMEC, and LETI.

I'll follow up with more meat around Open3D in my next blog — but in the meantime, don't forget to enjoy your summer!