
Chip Design Magazine



Posts Tagged ‘Mentor Graphics’


Blog Review – Mon. July 21, 2014

Monday, July 21st, 2014

Plane talking at the Farnborough Air Show from Dassault Systemes and Synopsys goes vertical; Forte Q&A; New views at ARM; Interface integration at Mentor Graphics. By Caroline Hayes, Senior Editor.

Inspired by the vertical takeoff of the iconic Harrier jump jet, Michael Posner looks at the vertical HAPS daughter board launched by Synopsys. He lists the real estate and connector benefits of thinking “laterally”.

When the acquisition of Forte by Cadence was announced, many questions were asked, and Richard Goering of Cadence puts them to former Forte CEO Sean Dart. Now senior group director for R&D in the Cadence System-Level Design Group, Dart has some interesting insights into high-level synthesis and the advantages of being acquired by Cadence.

Dassault Systemes is flying high, with its own chalet at this year’s Farnborough International Airshow in the UK. Amongst the aircraft, the company had dedicated meeting spaces and a 3DEXPERIENCE Playground, reports Aurelien. There are some great images on this blog, amongst the news items.

New boy on the ARM block, Varun Sarwal, dives straight into the blogosphere with news of the Juno 64-bit development platform. He explains the architecture well, covering hardware and software overviews.

Welcoming a guest blogger at Mentor Graphics, Phil Brumby examines user interfaces and relates how the company has finished integrating its own Nucleus RTOS and development tools with the Embedded Wizard UI tool from TARA Systems for MCU- or MPU-based hardware systems, such as the glucose meter illustrated on the blog.

Deeper Dive – A 3D-IC round table – part II

Thursday, July 17th, 2014

What’s needed for 3D-ICs to flourish? asks Caroline Hayes, senior editor. Experts from Mentor Graphics, Altera and Synopsys have some ideas for future progress.

The round table is made up of John Ferguson (JF), Director of Marketing, Calibre DRC Applications, Design to Silicon Division, Mentor Graphics; John Park (JP), Business Development Manager and Methodology Architect, System Design Division, Mentor Graphics; Steve Pateras (SP), Product Marketing Director, Silicon Test Products, Mentor Graphics; Michael White (MW), Director of Product Marketing, Calibre Physical Verification, Mentor Graphics; Arif Rahman (AR), Program Director and Architect, Altera Corporation; and Marco Casale-Rossi (MCR), Senior Staff Product Marketing Manager, Design Group, Synopsys.

What is the scale of 3D IC adoption/use, for example at 20 or 14nm? What can accelerate adoption?
[JF] I am just starting to see some 20nm 3D designs. Most 3D designs I’ve seen are focused on more mature nodes: 45, 65, etc.
[JP] 3D-IC is still at the R&D stage in both academia and the semiconductor industry, the exception being the memory providers. At the 20/14nm nodes, scaling analog/RF blocks may not be practical, so being able to partition the device into multiple chips across a silicon interposer starts to make a lot of sense. However, issues such as known good die, assembly and supply chain challenges, including IP, need to be overcome.
[AR] So far, we see very limited deployment of 3D IC in 20nm and 14nm technologies. Memory vendors appear to be taking the lead, deploying 3D IC technology in 20-ish nm process technology to create high-bandwidth, high-capacity memory. JEDEC’s HBM and Micron’s HMC are two early examples.
[MCR] Memory and FPGA makers are the early adopters. 3D IC enables design-level integration of digital with analogue and mixed-signal, and is a good option for ‘More than Moore’ integration, which combines electronics, micro-electromechanics and photonics.

What has to come first – availability of the 3D IC process, so that EDA tools can be developed, or tool flows so that foundries can produce 3D ICs economically?
[JF] It is a chicken-and-egg situation. EDA doesn’t want to invest in the tools until they know there is a real design market, which implies a foundry process. Foundries don’t want to invest in a process unless there is a design market and EDA tools. Designers don’t want to take the risk without a fully supported flow from a foundry, implying EDA tools in place. The initial work has been done by those market niches which can immediately benefit. They take on the risk and have been working to push those EDA and foundry providers that are willing to share the risk. Issues identified get fixed. Successes get published and begin to establish legitimacy.
[JP] This will require collaboration between both sides, with tools and flows being developed simultaneously.
[SP] 3D DFT-related tools are essentially independent of the 3D IC process and can therefore be developed independently. Mentor already offers a commercial 3D memory BIST solution and has also developed a comprehensive 3D logic test solution that is already part of TSMC’s 3D reference flow.
[AR] EDA tool readiness is not a real showstopper to 3D technology adoption. We believe existing tools and design methodologies can easily be extended to enable 3D IC design.
[MCR] Synopsys collaborated with the foundries to enable development and adoption of 3D-IC processes. It’s not a question of one waiting for the other, but working together to make 3D-IC a reality. Synopsys has made several announcements with TSMC and other partners regarding our 3D-IC collaboration.

What standards would you like to see in place to regulate, for example, interconnect and test for 3D IC?
[JF] This is a tricky space. Some come at it from an IC perspective, while others come at it from a packaging or board-level perspective. Each has its own set of internal standards, but there is really nothing that easily crosses them all. If we focus each tool/standard in the right domain (i.e. design the ICs with IC-based tools, design boards with board tools, etc.), then we just need to make sure that all those components come together correctly. What format will be used for that will largely depend on the process being implemented to combine it all together.
[SP] The most critical aspect of testing devices within a 3D stack is gaining access to the devices in the middle of the stack. This requires a test access architecture that is shared among all devices in the stack. Since devices will likely originate from different vendors, it therefore becomes necessary to have in place an industry-standard test access architecture that can be adopted by all device developers. The IEEE 1838 working group is currently developing this needed standard.
[MW] Yes, the movement to full 3D with more than two stacked dies does bring along the need to deal with thermal, stress, etc. to a far greater degree than is required with 2.5D. For some customers and applications, 2.5D will be an easier and more attractive alternative to vertical stacking for some time. One thing that would make 2.5D, or other side-by-side approaches like wafer fan-out, more interesting is reducing the cost of the medium being used as the interposer. Multiple foundry ecosystems and packaging houses are exploring lower-cost solutions to make 2.5D cost-effective for more applications.
[MCR] Synopsys is playing a critical role in defining and driving relevant standards. A Principal Engineer from Synopsys is the vice-chair for P1838 working group in IEEE. The charter of this group is to define standardised test access for 3D-IC.
[AR] Test/KGD methodology and an interface standard for chip-to-chip communications. Different types of devices in a 3D stack need to communicate with each other, and these devices are expected to be procured from different companies and fabricated in different process nodes. There is a need for a common, low-power and scalable (able to scale to higher bandwidth) interface standard to enable heterogeneous integration of 3D systems. This interface standard should define physical, electrical, logical, and test requirements.

What opportunities do you see 3D IC bringing for designs in the next five to 10 years (e.g. applied to lower geometries; opportunities for new applications, such as mobile healthcare)?
[JF] Silicon photonics is gaining interest right now for its promise of very low power for use in large-scale compute farms (i.e. cloud computing). Unlike IC design, photonics does not benefit from a shrink. The geometric widths are dictated by the wavelengths of light being used, which are typically much greater than the smallest features available in today’s nanometer processes. So, there is a lot of speculation that designers who wish to combine photonics with advanced IC electronics will do so in a 2.5D or 3D infrastructure, for example adding photonic circuitry into interposers at larger process nodes, while keeping the electronics on advanced processes stacked on the interposer itself.
[MCR] The opportunities are amazing: if we are able to assemble parts from different sources, manufactured using disparate technology nodes, adding MEMS and silicon photonics into the picture, it would lead to a revolution in automotive, computing, healthcare, mobile and networking. For example, a component small enough could be injected into the human body to test something (MEMS sensors), compute (CPU/MCU), transmit (RF) and/or store (flash) the results onboard, and dispense (MEMS actuators) the appropriate amount of a medication…the imagination is the only limit.
[AR] Extreme miniaturization of a complete system, which includes sensing, processing, and communication functions, into a very small form factor may require the use of 3D IC technologies. Such systems can facilitate new applications in healthcare (wearable electronics/personalized monitoring, implanted drug delivery devices), environmental monitoring, and imaging.
2.5D/3D integration of optical and electrical components can enable broader deployment of integrated optical interconnect technologies for short-reach communication.

Blog Review – Mon. July 14, 2014

Monday, July 14th, 2014

Accellera prepares UVM; Shades of OpenGL ES; Healthy heart in 3D; Webinar for SoC-IoT; Smart watch tear-down. By Caroline Hayes, Senior Editor.

An informative update on Universal Verification Methodology (UVM) 1.2 is set out by Dennis Brophy, Mentor Graphics, on the announcement of the update by Accellera. Ahead of the final review process, which will end October 31, the author sets out what the standard may mean for current and future projects.

The addition of Compute Shaders is one of the most notable changes to the OpenGL ES mobile API, says Tim Hartley, ARM. He explains what these APIs do and where to use them for maximum effectiveness.

Dassault Systemes was part of The Living Heart Project, producing a video with the BBC to publicise the world’s first realistic 3D simulation model of a human heart, developed with Simulia software. The blog, by Alyssa, adds some background and context to how it can be used.

A webinar on Tuesday July 22, covering SoC verification challenges in the IoT, will be hosted by ARM and Cadence. Brian Fuller flags up why presenters in ‘SoC Verification Challenges in the IoT Age’ will help those migrating from 8- and 16-bit systems, with a focus on using an ARM Cortex-M0 processor.

Inside the Galaxy Gear, the truly wearable smart watch, is an ARM Cortex-M4 powered STMicroelectronics device. Chris Ciufo cannot pretend to be taken off-guard by the ABI Research teardown.

Deeper Dive – 3D-IC Part 1 Fri. July 11, 2014

Friday, July 11th, 2014

A 3D-IC round table – part I

As the industry transitions from 2.5D to 3D-ICs, Caroline Hayes, senior editor, asked experts from Mentor Graphics, Altera and Synopsys for their view on what system designers need to consider in implementing 3D ICs.

The round table is made up of John Ferguson (JF), Director of Marketing, Calibre DRC Applications, Design to Silicon Division, Mentor Graphics; John Park (JP), Business Development Manager and Methodology Architect, System Design Division, Mentor Graphics; Steve Pateras (SP), Product Marketing Director, Silicon Test Products, Mentor Graphics; Michael White (MW), Director of Product Marketing, Calibre Physical Verification, Mentor Graphics; Arif Rahman (AR), Program Director and Architect, Altera Corporation; and Marco Casale-Rossi (MCR), Senior Staff Product Marketing Manager, Design Group, Synopsys.

The advantages of 3D-IC are obvious, but there are challenges (e.g. thermal management). What are these in your view, and how does your company address these?
[JF] There are many. The biggest issue towards large-scale adoption, at least as I see it, is trust/risk. When you do an SoC today, there is an explicit agreement in place with the foundry, and with that foundry’s blessed IP partners, that if you follow their design rules, you get reasonable manufacturing yield. With 3D, we’re talking about components from multiple processes, potentially from different foundries. When you put them all together, there is no longer anyone guaranteeing your success. It is your skin on the line. As such, it is a very risky proposition. It won’t get huge traction until/unless more designers take the risk and not only show successful design, but demonstrate that by taking such risk, they are able to implement a design that is significantly better than alternative technologies. If the value proposition is only pricing, it is a tough sell. You can always find other places to cut costs and corners in an SoC without the same level of risk.
[JF] As for technical challenges, at least from a verification and extraction perspective, the biggest issue is that when you stack dies, you can no longer infer vertical alignment and physical connection from GDSII layer numbers. You have to distinguish each die placement to keep these layers independent. No small task.
[JP] A big challenge is reliability, which includes thermal stress management. Mentor’s FloTHERM product, coupled with stress analysis capabilities, will play a critical role in this space. Path-finding tools that allow design teams to make quick trade-offs between different packaging technologies to optimize cost, form factor and performance are also required. Mentor’s newly announced Xpedition™ Path Finder can play a role here.
[MW] There are both technical and business challenges. Among the technical challenges, thermal management is clearly one that early adopters are concerned about. Chip/package/TSV stress and the impact of this new packaging on electrical performance is another area of concern. Obviously, physical verification (e.g. DRC, LVS) of the assembly and parasitic extraction of key nets are also a challenge with this new packaging approach. Mentor has put in place solutions for all these aspects. Most are already in production use for 2.5D or 3D assemblies.
On the business side, the industry has seen adoption of 3D for select IC segments (e.g. image sensors, memory) where there are truly compelling performance reasons to do so. Other IC applications are still in their infancy, determining for what subset of their IC designs there is a compelling reason to move towards 2.5D/3D-IC.
[AR] Thermal management, creation of high-aspect-ratio, high-density TSVs, and test/KGD are some of the challenges for 3D integration. We work closely with our manufacturing partners, research institutes, and universities on research and development projects targeted at these areas.
[MCR] 3D-IC is currently in volume production at memory makers: their vertical integration and, from a technical viewpoint, the relatively low number of TSVs and their patterning regularity make it much easier to implement memories in this process. There are still challenges with respect to extraction, physical verification, and place and route (P&R). To address these challenges, Synopsys has developed specialized P&R and parasitic extraction capabilities to account for TSVs within a die; we have developed a specialized interposer router for RDL; our LVS/DRC supports multi-die and multi-technology checks; we have developed a unique multi-technology circuit simulation capability; our 2.5D- and 3D-IC flows have been certified by silicon foundries such as TSMC and GlobalFoundries; and we have ongoing collaborative research initiatives with IMEC, Applied Materials, IME (Singapore) and more.

Is there still a place for 2.5D packaging? Where would it be used over 3D-IC?
[JF] Yes. Consider the Xilinx case. Their issue is that to enable very large designs they are forced to go beyond the full reticle size. They get around this by breaking the FPGA into two dies and placing them side by side on an interposer. It works well in their space because they have very regular designs. I don’t see that approach going away for them.
[JP] For many applications, 2.5D is the road to full 3D-IC. We would expect that for the next decade or more, 2.5D will be the driver and bridge to full 3D-IC. At this point, the reason to proceed with 2.5D over full 3D-IC is primarily thermal and design flexibility. IBM, after a long study, concluded that for thermal and stress reasons it has to proceed with 2.5D.
[AR] Just to be clear, we use the term 2.5D to define passive-silicon-interposer-based side-by-side integration, and 3D to define stacking of two or more active device layers with TSVs.
We believe both 2.5D- and 3D-IC will co-exist. Thermally limited applications with high-bandwidth chip-to-chip interfaces will most likely prefer 2.5D integration over 3D integration. Applications requiring integration of multiple heterogeneous dies (logic and memory; logic and mixed-signal) may deploy 2.5D integration as well.
[MCR] 2.5D-IC is a more evolutionary approach than 3D-IC, and is a viable solution when footprint is not the primary concern. For example, silicon interposers, manufactured at 65nm, offer one to two orders of magnitude more connectivity than TSVs and, of course, they are useful for boards.

Part 2 of this round table, where the panel considers means to accelerate adoption of 3D-ICs, will be published July 17.

Deeper Dive – IoT Security

Monday, June 30th, 2014

By Caroline Hayes, Senior Editor

The traditional definition of the Internet of Things (IoT) is combinations of sensor- and logic-based devices. Now there is a shift, says AMD’s Steve Kester, where advanced processing capabilities are being incorporated into embedded systems that go well beyond that traditional model (medical devices, “smart” monitors and signs, home and business appliances, as well as distributed networking devices and dense servers are examples). With these new roles, security will become even more important. Caroline Hayes, Senior Editor, asked Steve Kester, Director of Government Relations, AMD [SK]; Shantnu Sharma, Director of Corporate Strategy, AMD [SS]; Rich Rejmaniak, Senior Technical Marketing Engineer, Mentor Embedded Software division [RR]; Rob Coombs, Security Marketing Director, Systems & Software, ARM [RC]; and Ambuj Kumar, Senior Principal Engineer at Cryptography Research, a division of Rambus [AK].

How are the security risks in an MCU different to those in a connected IoT device?

[RR] The key to the need for security is the reality of the exposure resulting from connectivity itself. While any device can be compromised, non-connected devices require physical access to each and every unit, making security breaches uneconomical for pretty much all potential criminal activity. Note that security was never a problem for the Apple II or Windows 3.x machines. Even if you could gain access, it would only be for one machine at a time.


With the emergence of the IoT, these devices are finding themselves integrated into a communication mesh that provides for massive widespread access. Now the first question that a person would ask is: “What is the economic gain for someone hacking into my IoT to raise the temperature of my refrigerator?” The short answer appears to be: “None.” The truth is that someone who can shut down thousands of refrigerators can blackmail a power company, or frighten people away from letting the company optimize power consumption across its grid.
Access to seemingly worthless data or mischievous control of minor devices can result in a popular movement away from the potential business opportunities presented by the IoT. The fact that there are young adults sitting in prison because they faked 911 calls to appear to originate at the residences of random people, thereby causing the police to storm those homes in the middle of the night, proves that anything that can be hacked will be hacked, even if there is no apparent economic gain.

[SS] Security risks in an MCU differ from—but in some cases may be similar to—security risks in a connected IoT device. In a broad perspective, security risks can be categorized as those related to connectivity, data access, and software.


At a hardware level, IoT devices have connectivity to any device around the world, whereas with an MCU the connectivity is limited to local components. Traditionally, MCUs have data access more along the lines of industrial or factory information; with IoT devices, the data access includes highly personal, end-consumer data. In a software sense, IoT devices are very similar to client/networking devices and are leveraging much of the same software found in traditional PCs.

[SK] The single biggest difference between security risks for MCUs and those for IoT devices is that an MCU’s threat vectors are often localized, whereas IoT devices are more like a PC or other mobile devices: personal data can be compromised or damage can be inflicted even at a considerable distance from the actual device.
Of course, both have the potential to be exploited to produce considerable harm, which is why it is important to harden these technologies with effective security features.

[RC] An isolated microcontroller could be subject to physical attacks whereas a connected IoT device can also be attacked remotely. Software based remote attacks are more scalable and therefore security (and hardware based roots of trust) needs to be designed-in.

[AK] IoT devices have some unique security characteristics. Their limited power budget and tight cost constraints greatly influence their architecture and design. The spectrum of IoT use cases is expansive, crossing many different operating environments. Use cases may include deeply embedded modules inside an automobile, or a device “in the wild” like a parking meter. The security profiles and requirements depend heavily on the use case, so a technique that works on one product may not be directly portable to another.
Further, IoT application development often has to navigate the challenges associated with a nascent field such as the lack of standardization, fragmented platforms, and a poor understanding of best practices.
The history of security has shown that the attacks and the defense have evolved together. As the architecture became more secure, attacks became more sophisticated and complex too. Today’s attackers have the resources and knowledge produced by their decades of work, meaning that a connected IoT device needs strong security that’s advanced enough to thwart advanced attacks.

What security features can be used to combat these risks in IoT devices?

[RR] The first line of defense is secure communication channels. Encryption and validation can be complex and compute-intensive operations. To be certain of their effectiveness, they have to be extensively tested and exercised over a large applied base. This is difficult to do properly at the application level, and it can’t be an option that a vendor will add after getting past the crunch of bringing a product to market. Security must be, and indeed is, built into the protocols and standards in modern IoT proposals. To be successfully implemented, these security measures must be inherent in the platform execution environment.

[SS] To combat security risks in IoT devices, a combination of network hardening, device hardening, and consumer education/behaviour modification should all be implemented. Advances in the network to which IoT devices will connect are being actively considered by a number of industry players, while device manufacturers are aware that devices will have to be more secure than they are today—for example, appliances and automobile industries will have to account for potential breaches which could impact human life, while end users will have to make educated judgment calls regarding how much personal data they are comfortable sharing. Overall, IoT security is very much a shared responsibility for both producers and users.

[SK] Our view of security for the IoT is that it is a shared responsibility and every element of the IT ecosystem, including users, to address security issues and protect sensitive data and devices.


As cyber security challenges continue to evolve, we can’t point to others to solve the problem. We must take a collaborative approach to secure cyber space.

[RC] Most IoT devices have hardware roots of trust and crypto accelerators to provide the basis of strong authentication and confidentiality (e.g. Transport Layer Security). In addition, one-time-programmable (OTP) memory can be useful for keys and “fuses”. If the device is loading third-party applications, an ARM® TrustZone® based Trusted Execution Environment (TEE) can provide useful protection against software attacks and non-invasive hardware attacks on code, data and peripherals.


Some devices will need to offer tamper-resistant storage for crypto keys or for running attestation, and this may be implemented using a secure element. If the device is using a smaller microcontroller with a memory protection unit, that unit can be used to provide some isolation between application code and system or trusted boot code.

[AK] Security cannot be an afterthought. An IoT device needs a solid security foundation to proactively defend itself. A hardware root of trust provides such a foundation, as a properly designed security architecture provides protection against existing, as well as future attacks. This hardware root of trust ensures that device secrets are safe and secure even when an attacker gets control over the software.
Important considerations for IoT security have traditionally included end-to-end encryption, secure key management, strong authentication, side channel resistance, etc. However, today’s device also needs a security infrastructure to secure the device throughout its entire lifecycle, spanning from its earliest form as a little piece of a silicon wafer to a finished product in the field.
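The “strong authentication” Kumar mentions is often built on a symmetric device key held in hardware. As a rough illustration (not CRI’s actual design), this Python sketch shows a minimal HMAC challenge-response flow; the device key here is a placeholder for a secret that would in practice be provisioned into OTP memory or a secure element.

```python
import hashlib
import hmac
import secrets

# In a real device the key would live in tamper-resistant storage
# (OTP memory or a secure element); a module-level constant stands in here.
DEVICE_KEY = b"example-provisioned-device-key"

def issue_challenge() -> bytes:
    """Verifier side: generate a fresh random nonce for each attempt."""
    return secrets.token_bytes(16)

def respond(challenge: bytes, key: bytes = DEVICE_KEY) -> bytes:
    """Device side: prove possession of the key without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def verify(challenge: bytes, response: bytes, key: bytes = DEVICE_KEY) -> bool:
    """Verifier side: recompute the digest and compare in constant time."""
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)
```

Because only a keyed digest of a fresh challenge crosses the wire, the key itself is never exposed, and replaying an old response fails against a new challenge.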

What are the best tools to design effective security features in an IoT system? And how are they used?

[RR] For communication security, inherently using SSL, at either the socket level itself or through HTTPS layers, is highly effective at securing access to a device. However, this security must be inherent in all communications and cannot be applied to only a subset of connections. In addition, using supervised execution through a hypervisor, memory isolation, or other partitioning allows internal isolation of processes to prevent cross-access of data. Such a case would be the separation of credit-card-processing code modules and data from those of the user interface and motor control in a vending machine.
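As a concrete sketch of securing the channel at the socket level, the Python snippet below wraps a plain TCP socket in TLS (the modern successor to SSL) before any application data flows. It is illustrative only: the host name passed by the caller is a placeholder, and a production device would pin or restrict certificates further.

```python
import socket
import ssl

# A context with certificate verification and hostname checking enabled;
# the platform's trusted CA store is used automatically.
context = ssl.create_default_context()

def fetch_securely(host: str, port: int = 443) -> str:
    """Open a TLS-wrapped socket and issue a minimal HTTPS request."""
    with socket.create_connection((host, port)) as raw_sock:
        # Wrapping at the socket level means every byte the application
        # sends is encrypted; security is inherent rather than opt-in.
        with context.wrap_socket(raw_sock, server_hostname=host) as tls_sock:
            tls_sock.sendall(
                f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n".encode()
            )
            chunks = []
            while data := tls_sock.recv(4096):
                chunks.append(data)
            return b"".join(chunks).decode(errors="replace")
```

Because the wrap happens at the socket layer, every connection made through `fetch_securely` is encrypted and certificate-verified by default, matching the point that security cannot be applied to only a subset of connections.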

[SK] AMD believes that the strongest security features begin at the processor level – the bedrock of computing – where they can complement and enhance other hardware- and software-based security features in a highly secure manner. The best tools to design effective security features are those based on open standards, such as those AMD uses in our partnership with ARM and their TrustZone technology. This is one reason why AMD is creating a new generation of secure computing capabilities for digital content, data, e-commerce and trusted client-to-cloud interactions. The AMD Platform Security Processor (PSP) is built upon ARM TrustZone technology and architected to protect against malicious access to sensitive data and operations at the hardware level. This architecture is based on open standards and interoperable APIs, and is available now in products from AMD. These products can then be used to build Internet-connected medical devices, commercial kiosks, smart screens, and a host of other IoT products with hardware-based security.

[AK] A system is only as secure as the weakest link. Thus, it’s imperative to start with a strong security foundation and build the system around it. Essentially, it all starts from hardware, where a security IP can be directly combined with a system-on-chip to build a secure system for IoT.


In the case of Cryptography Research, we use industry-standard tools and processes for design. Our designs use mostly standard-cell logic, relying on very few external analog blocks to keep our core portable. Our scalable architecture allows customers to make informed decisions about security needs, performance and area. A standard-cell-based IP that needs minimal external macros is a lot easier to integrate. Oftentimes, the only analog block required for our IPs is OTP to store private keys.
Once a chip is designed that includes security hardware, its system architecture can be built upon securely. The security hardware thus may enable a private, persistent and authenticated channel over an insecure and public network. The device coupled with security infrastructure can enable remote authentication and audit.
A security infrastructure may provision device specific assets (keys, credentials, profiles, etc.), thus making developing secure system software a lot easier.

How does the ARM Connected Community ecosystem enhance or drive the integration of security features?

[RR] The single largest advantage that ARM has at this time is the unparalleled depth and breadth of its ecosystem. There is no more effective method of securing hardware and software than to have it exercised, attacked, and defended across an enormous installed base. Every aspect of ARM devices, from encryption facilities to ARM TrustZone®, has been extensively tested in real-world situations. At the current time, the critical mass of the ARM environment is only increasing in size.

[SS] The ARM Connected Community ecosystem enhances and drives the integration of security features through collaboration across a very broad spectrum of industries. From traditional PCs to smartphones, from industrial and embedded controls to entertainment content delivery, the ARM Connected Community leverages an open-standards approach in a collaborative online environment to connect new ARM partners and developers with established players for innovation in areas such as SoC design, OS and programming models, and consumer use cases. Benefits of this open-standards approach include greater interoperability, improved efficiency, more resiliency, and avoiding the potential for being locked into a particular proprietary technology or vendor.

[RC] There are hundreds of silicon partners and thousands of Connected Community partners offering solutions to the market. IoT is a diverse market where one size does not fit all, and the ARM ecosystem provides solutions for every conceivable use case from wearables to smart meters.

[AK] ARM has helped highlight the need for security through its TrustZone initiative, and bringing the issue of device security to the forefront of new technologies—including the connected device ecosystem—is imperative to creating a secure landscape. While CRI’s security IPs are designed to provide maximum security by themselves, they can be combined with TrustZone to be more effective.

Blog Review – Mon. June 23, 2014

Monday, June 23rd, 2014

By Caroline Hayes, Senior Editor

The cost of scaling 3D monolithic devices; automation validation; an enterprise wide behavior model; connectivity vs. Wi-Fi design; what’s happening in the China fabless semiconductor market.

There is something afoot in monolithic 3D circles, detects Zvi Or-Bach, MonolithIC 3D, as he tracks disquiet about a feasible roadmap for the technology. Costs and scaling are clashing; he illustrates his case with some effective charts, including one from ARM, but ends on an optimistic note for the industry.

At Mentor, Jay Gorajia vents his frustration at the disruption to communication flows between design and manufacturing organizations. He makes an effective case for automation to create consistent DFx reports configured specifically for each customer.

Continuing something of a crusade for model-based systems engineering, Todd McDevitt, Ansys, has some sound advice for enterprise-wide dynamic modeling. The checklist has some useful links to webinars and web pages.

An interesting interview with Richard Stamyik, ARM, by ChipDesign’s own John Blyler gets to the root of cellular connectivity for M2M and IoT and how it differs from Wi-Fi embedded design.

If you have the travel bug, and believe it’s a case of “Go east, young man” these days, then Richard Goering, Cadence, advises you not to pack those bags just yet. He reports from the DAC 2014 China Fabless Semiconductor Panel, challenging some preconceptions about production, consumption, and investment in the region.

Maybe Michael Posner saw one too many films on his travels to SNUG in Israel. His blog begins with a picture of Catherine Zeta-Jones and Zorro, the mysterious swordsman-avenger, but quickly moves on to a namesake with a different spelling: the Zoro Hybrid Prototype for early software development. His enthusiasm is infectious (even if the film link is tenuous!), and the content is clearly set out to inform.

Blog Review – Mon. June 16 2014

Monday, June 16th, 2014

Naturally, there is a DAC theme to this week’s blogs – the old, the new, together with the soccer/football debate and the overlooked heroes of technology. By Caroline Hayes, Senior Editor.

Among those attending DAC 2014, Jack Harding, eSilicon, rejoiced in seeing some familiar faces but mourned the lack of new ones and the absence of a rock-and-roll generation for EDA.

Football fever has affected Shelly Stalnaker, Mentor Graphics, as she celebrates the World Cup coming to a TV screen near you. The rest of the world may call soccer “football,” but the universality of IC design and verification is an analogy that will resonate with sports enthusiasts everywhere.

Celebrating Alan Turing, Aurelien, Dassault Systemes, looks at the life and achievements of the man who broke the Enigma code in WWII, who in 1936 conceived the theoretical machine underpinning the modern computer, and who helped define artificial intelligence. The fact that he wasn’t mentioned in the 2001 film Enigma, about the code breakers, reflects how overlooked this incredible man was.

Mixed-signal IC verification was the topic for a DAC panel, and Richard Goering, Cadence, runs down what was covered, from tools and methodologies to the prospects for scaling, with a hint at what’s next.

Blog Review – Mon. June 09 2014

Monday, June 9th, 2014

It will be no surprise that this week’s Blog Review is a little DAC-centric, there was plenty of news on the floors of the Moscone Center last week – but there was still a lot of other news that interested bloggers. By Caroline Hayes, Senior Editor

A positive and optimistic message is struck by Brian Fuller, Cadence, as he reports on an inspiring DAC keynote by Mentor Graphics’ Wally Rhines, who urges seeking new areas, new trends, and a sensible business-model balance.

Among the colorful characters of DAC, Hamilton Carter encountered zombies, “wildlife enthusiasts,” interesting booth chat, and various views of IoT – all recounted here.

Gabe Moretti was also at DAC and rooted out some unusual companies – gems that may have escaped less observant attendees – that address the whole system and not just its parts.

At DAC, Synopsys launched its IP Accelerated Initiative. Michael Posner was involved in its development and cannot contain his excitement and pride at the top-secret testing that brought it to market and to a DAC audience.

Further afield, Jim Martens clocks up the air miles as part of the PADS VX seminar series. The blog focuses on the Shenzhen and Taipei seminars and which features were of most interest to attendees.

Expanding on his five-box trick for visualizing steps in simulation, Daniel Shaw, Ansys, looks at Load Mapping techniques.

An encouraging blog from Julio Enrique Fajardo, ARM, as he charts the Bionic RoboHand prototype, in great detail, with pictures and code. The inspirational project makes enthralling reading.

We weren’t going to escape without a DAC mention from ARM. Brad Nemire writes about the winner of the ARM Step Challenge at DAC to find the hardest working, walking attendee. (Congratulations, Tiffany Quan, Sonics – hope the prize was a foot spa!)

Blog Review – Mon. June 02 2014

Monday, June 2nd, 2014

In case you didn’t know, DAC is upon us, and ARM has some sightseeing tips – within the confines of the show-space. Electric vehicles are being taken seriously in Europe and North America, Dassault Systemes has some manufacturing-design tips and Sonics looks back over 20 years of IP integration. By Caroline Hayes, Senior Editor.

Electric vehicles – it’s an easy sell for John Day, Mentor Graphics, but his blog has some interesting examples from Sweden of electric transport and infrastructure ideas.

Thanks are due to Leah Schuth, ARM, who can save you some shoe leather if you are in San Francisco this week. She has been very considerate and lumped together all the best bits to see at this week’s DAC. OK, the list may be a bit ARM-centric, but if you want IoT, wearable electronics and energy management, you know where to go.

We all want innovation, but can the industry afford it? Hoping to instill best practice, Eric, Dassault Systemes, writes an interesting, detailed piece on design-manufacturing collaboration for a harmonious development cycle.

A tutorial on your PC is a great way to learn – in this case, the low-power advantage of LPDDR4 over earlier LPDDR memory. Corrie Callenbach brings this whiteboard tutorial by Kishore Kasamsetty to our attention in Whiteboard Wednesdays—Trends in the Mobile Memory World.

A review of IP integration is presented by Drew Wingard, Sonics, who asks what has been learned over the last two decades, what matters, and why.

Blog Review – Tues. May 27 2014

Tuesday, May 27th, 2014

The pursuit of confectionery drives the blog choices this week. Chocolate and Gummi Bears mingle with a connected world, an evolution for IP, and a passion project. By Caroline Hayes, Senior Editor.

In preparation for DAC 2014, Hamilton Carter has prepared some tips for travelers to make the most of any free time in San Francisco. The Ghirardelli chocolate factory is on the list – phew.

Semiconductor process technology presents new challenges to IP vendors and integrators, says Gabe Moretti. He speculates on the foundry–IP vendor relationship and where it goes from here.

Reflecting on his fitness regime, Tom De Schutter, Synopsys, tracks his activity and considers the role of software in a connected world that encourages him to keep up the good work.

Finding themselves in possession of a bag of Gummi Bears, Matthew Clarke, Mentor Graphics, and his colleagues sacrificed one of the innocent bears to test the 1500A power tester, which was also conveniently nearby. The results are not happy reading for candy lovers.

John Blyler fancies his chances in Mentor Graphics’ Passion Project, created to share creative hobby news. He had better hurry – the contest ends June 3, and you may be able to beat John to the $300 prize, awarded at DAC on Wednesday, June 4, at 3:30 pm at the Mentor booth (#1733).
