Steve Schulz

Developing industry standards can be tough business. Getting them broadly and successfully adopted can be even harder. To re-purpose a very old phrase for EDA standards, "Many are called, but few are chosen." To be sure, sometimes a few engineers can hit the bullseye on their first try, defining a standard for the entire industry. Or, every so often, an existing proprietary file format or language might get donated, stamped "approved," and be welcomed by all competitors without a fuss. However, one well-respected IEEE standards leader estimated that three-quarters of all standards fail to achieve broad adoption after being approved. That seems like a lot of wasted effort. I get approached from time to time by Board members at other electronics industry consortia who want to know how Si2 measures effectiveness and performance. So, I'll use today's blog to share that information with all of you.

Si2 uses a set of annual metrics to assess performance and reward success. When I came on board in 2002, I set up a structured approach using a spreadsheet of priority-weighted categories and items whose weights multiply and sum to 100%. There are four top-level categories of metrics:
1. Adoption
2. Relevance and Influence
3. Engineering Execution
4. Fiscal Health
The largest priority weight is placed on adoption, the point at which the ROI on our members' investment is finally realized. Relevance is about focusing on the most needed problems, at the right time. Influence is about having the membership and marketing clout required to drive the changes needed to succeed in our mission. Engineering execution ensures that working groups are properly managed and that delivery dates are met for the standards and any adoption aids (including software, libraries, test cases, and training). Si2 also performs an annual engineering satisfaction survey whose results serve as a useful metric. Fiscal health keeps a check on our retained equity ("fund balance") and also tracks forward-looking cash-flow headroom. Adoption can be the trickiest to measure, especially when most efforts span multiple years, but there are indirect leading indicators that can serve as reasonable proxies even before full commercial adoption occurs.

Within each category, several specific metric items are defined, each with a priority weighting. We create three success scenarios for each item: achieving the main goal counts for 100%, partial success counts for 50%, and achieving a challenging stretch goal counts for 125%.
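To make the arithmetic concrete, here is a minimal sketch of how priority-weighted items might roll up into a single score. The category names come from the list above, but the specific items, weights, and achievement levels shown are hypothetical placeholders for illustration, not Si2's actual spreadsheet.

```python
# A minimal sketch (not Si2's actual spreadsheet) of how priority-weighted
# metric items roll up into one Organizational Metrics percentage.
# Item names, weights, and achievement levels below are hypothetical.

# Achievement levels: stretch goal = 125%, main goal = 100%, partial = 50%.
ACHIEVEMENT = {"stretch": 1.25, "main": 1.00, "partial": 0.50, "missed": 0.00}

# Category weights sum to 1.0; item weights within a category sum to 1.0.
categories = {
    "Adoption": {
        "weight": 0.40,
        "items": [("Key standard adoption milestone", 0.60, "main"),
                  ("Leading-indicator proxy target", 0.40, "partial")],
    },
    "Relevance and Influence": {
        "weight": 0.20,
        "items": [("Roadmap/priority alignment goal", 1.00, "main")],
    },
    "Engineering Execution": {
        "weight": 0.25,
        "items": [("On-time deliverables", 0.70, "stretch"),
                  ("Engineering satisfaction survey", 0.30, "main")],
    },
    "Fiscal Health": {
        "weight": 0.15,
        "items": [("Fund balance target", 0.50, "main"),
                  ("Cash-flow headroom", 0.50, "partial")],
    },
}

def organizational_metrics_pct(cats: dict) -> float:
    """Weighted sum of item achievements, rolled up through category weights."""
    total = 0.0
    for cat in cats.values():
        cat_score = sum(item_weight * ACHIEVEMENT[level]
                        for _name, item_weight, level in cat["items"])
        total += cat["weight"] * cat_score
    return round(100 * total, 1)

print(f"Organizational Metrics: {organizational_metrics_pct(categories)}%")
# With these made-up inputs the rollup lands around 92.6%.
```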

These metrics are jointly defined at the beginning of each year with input from my staff and our members, then revised and approved by Si2's Board of Directors. During the year, I review our progress monthly with my team during staff meetings, and we discuss ways to "unblock" barriers for the more problematic metric items. Each January, the prior year's achieved results are reviewed by the Board, and a final Organizational Metrics percentage is approved. That number directly affects the variable pay component for everyone at Si2, providing a strong incentive to succeed in all four categories. The more senior the position, the larger the role the variable component plays in compensation. This is how it should be, to keep everyone focused on success toward our mission and priorities.

Having such structured, schedule-oriented metrics for developing and delivering standards may seem out of place at first: we often start with a less-than-complete vision of an industry solution, rely primarily on volunteer resources, and navigate complex technical, business, and market dependencies, all within a small non-profit organization. However, while Si2 may not be in the for-profit, competitive products business, our members are. So we must find creative ways to deliver on their real-world needs, within the schedule requirements they set for us.

It seems to me that a structured approach to assessing performance and adoption ought to be part of the Board-level accountability process at every standards development organization, as a best practice. While no metrics are perfect, I can assure you that this system is highly effective at identifying where change is needed and at quickly motivating creative solutions.


