What CPU Core Will Prevail as IoT Drives Microcontroller Growth?

By: Jonah McLeod, Silicon Valley Blogger

The Goldman Sachs Equity Research note “The Internet of Things: Making sense of the next mega-trend” is bullish on IoT deployment for industrial automation. It declared, “the global industrial sector is poised to undergo a fundamental structural change akin to the industrial revolution as we usher in the IoT.” The report cites Verizon saving 55 million kWh of electricity through IoT applications as an example of the IoT’s impact on internal operations and processes. What’s enabling IoT adoption? The report points to “cheap sensors, cheap bandwidth, cheap processing, smartphones, ubiquitous wireless coverage, big data, and IPv6.”

Given that the cost of enabling IoT is so low, where is the opportunity for semiconductor companies? The report highlights sensors and microcontrollers as the categories of chips poised to profit. It shows sensors posting a compound annual growth rate (CAGR) of 5 percent from 2011 to 2013, while the semiconductor industry as a whole managed a 0 percent CAGR. Microcontrollers fared even better, growing at an 11 percent CAGR from 2006 to 2013, against a 3 percent CAGR for the overall semiconductor market over the same period. Considering that microcontrollers have been around for over 40 years, are their CPU architectures powerful and cost-effective enough to serve this fast-emerging opportunity? Or is a new architecture, like the one Hsinchu, Taiwan-based Andes Technology Corp. rolled out in 2005, better suited?

What’s driving the demand for microcontrollers in the IoT? One application is the LED light bulb. According to a 2013 Winter Green Research report, “LED Lighting: Market Shares, Strategies, and Forecasts, Worldwide, 2013 to 2019,” the LED lighting market will grow 45 percent per year, from $4.8 billion in 2012 to $42 billion by 2019. At around $10 for a bulb such as the Cree BA19-08027OMF dimmable A19 LED, $42 billion a year works out to billions of microcontroller-equipped bulbs over the next five years. Next-generation bulbs, such as the Alba from Cupertino, Calif.-based Stack Labs, Inc., will surround the microcontroller with sensors plus Bluetooth, Zigbee, and iBeacon hardware, allowing the bulbs to react autonomously once installed.

History, as an indicator of the future, suggests that every discontinuity represents an opportunity for new invention. In the computer industry, the CPU architectures that succeeded drove MIPS and MHz—x86, PowerPC, SPARC, and MIPS, among others—and were sold as packaged parts. With the arrival of the mobile phone, the architecture that ascended offered performance with low-power operation in the form of IP blocks—one of many ARM variants—that could be built into a system on chip alongside specialized processors surrounding the applications processor.

The Internet of Things changes the computing requirement dramatically. Performance is less important, but power is critical, especially for battery-powered sensors providing, for example, intrusion, motion, and environmental detection. This requirement would seem to favor low-cost architectures such as the ubiquitous 8051. However, the need to handle sensor processing as well as a communications stack calls for a 32-bit architecture such as ARM’s Cortex-M. According to Wikipedia, the Cypress Semiconductor PSoC 4, Infineon Technologies XMC1000, NXP LPC1100 and LPC1200, Nordic Semiconductor nRF51, and STMicroelectronics STM32 F0 microcontrollers are all built on the Cortex-M.
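
To make that workload concrete, here is a minimal sketch of the sample-pack-transmit-sleep loop such a node runs. It is not tied to any vendor SDK: read_motion_sensor(), radio_send(), and enter_low_power_mode() are hypothetical stubs standing in for a real sensor driver, communications stack, and power-management call. The shape of the loop, not the stub code, is what pushes designers past the 8051 toward 32-bit cores.

```c
/* Minimal sketch of an IoT sensor node's main loop.  The three helper
 * functions are hypothetical placeholders, not any real SDK's API. */
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical sensor driver: returns a 16-bit motion reading. */
static uint16_t read_motion_sensor(void) { return 42; }

/* Hypothetical radio-stack entry point (Bluetooth, Zigbee, Wi-Fi, ...). */
static void radio_send(const uint8_t *buf, size_t len)
{
    (void)buf;
    printf("sending %zu-byte packet\n", len);
}

/* Hypothetical power-management hook: sleep until the next event. */
static void enter_low_power_mode(void) { }

int main(void)
{
    uint8_t packet[8];

    for (int i = 0; i < 3; i++) {          /* wake, sample, send, sleep */
        uint16_t sample = read_motion_sensor();

        /* Pack a tiny application payload; a real stack adds headers,
         * encryption, and retransmission on top of this. */
        memset(packet, 0, sizeof packet);
        packet[0] = 0x01;                   /* message type: motion report */
        packet[1] = (uint8_t)(sample >> 8);
        packet[2] = (uint8_t)(sample & 0xFF);

        radio_send(packet, sizeof packet);
        enter_low_power_mode();
    }
    return 0;
}
```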

Until recently, no new commercially available CPU architecture had emerged since the late 1990s. Tensilica’s Xtensa configurable processor started in 1997, and the Argonaut Technologies Limited (ATL) ARC configurable processor began life in 1998. Both are now owned by EDA giants—Cadence and Synopsys, respectively—rather than by CPU companies such as ARM and Intel with the will and resources to evolve their architectures over time. As a historical note, the Acorn RISC Machine project, which led to today’s ARM processor, began life in October 1983. That project borrowed aspects of the venerable 6502, the CPU in Apple’s first personal computers. This early ARM processor, inside an LSI Logic-built ASIC, powered the Apple Newton, an early personal digital assistant.

In March 2005, Andes Technology Corp. set out to build a next-generation CPU architecture with a design team that included talent from AMD, DEC, Intel, MIPS, nVidia, and Sun. The architecture was tailored to the computational requirements of the portable electronics then just coming on the market: most notably the Apple iPod, introduced in October 2001, but also the wave of MP3 and multimedia players being introduced. By 2005, after a slow start, the iPod was selling in the tens of millions of units.

The Andes core instruction set combines 16-bit and 32-bit mixed-length instructions to improve performance, code density, and power efficiency. It targeted the system requirements of a portable device with a small display, wheel-based user input, a high-speed USB connector, and mass storage—at first a mini hard drive and, by 2005, flash memory. It also includes power-management instructions and an interface protocol to simplify switching among different SoC power and performance operating modes.
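
The article does not spell out those power-management instructions, so the sketch below is a generic illustration of mode switching rather than AndeStar code: the mode names and the set_operating_mode() helper are hypothetical, and on real silicon they would map onto the core’s sleep instructions and the SoC’s clock and voltage controls.

```c
/* Generic sketch of switching among SoC power/performance operating modes.
 * The modes and set_operating_mode() are hypothetical, not Andes-specific. */
#include <stdio.h>

typedef enum {
    MODE_ACTIVE,      /* full clock, all peripherals on  */
    MODE_IDLE,        /* CPU halted, peripherals running */
    MODE_DEEP_SLEEP   /* clocks gated, RAM retained      */
} power_mode_t;

static void set_operating_mode(power_mode_t mode)
{
    /* Placeholder: firmware would execute the core's sleep/standby
     * instruction and reprogram clock gates here. */
    static const char *names[] = { "active", "idle", "deep-sleep" };
    printf("entering %s mode\n", names[mode]);
}

int main(void)
{
    set_operating_mode(MODE_ACTIVE);      /* burst of work: decode, I/O */
    set_operating_mode(MODE_IDLE);        /* wait for the next event    */
    set_operating_mode(MODE_DEEP_SLEEP);  /* long gap between events    */
    return 0;
}
```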

It turned out that what the company designed for battery-operated portable devices is also ideally suited to an Internet of Things device, which may perform one or more very simple functions: for example, verifying a key and opening or locking a door. In addition, the processor needs enough computing power to handle the communications protocols—Bluetooth, Wi-Fi, and Zigbee, among others—that connect the “thing” to the network, and to run the security software increasingly being added to these devices. Andes’ business model, built around the constraints of cost-conscious Asian customers, may be best suited to the tight margins demanded by IoT “things.” With over 100 license agreements and more than 500 million cores shipped, the company may be poised to become a force in the Internet of Things market.
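
As a rough illustration of the door-lock example, here is a minimal sketch of the verify-a-key function. The stored key, drive_lock(), and on_key_received() are hypothetical, and a production lock would rely on a challenge-response or MAC-based scheme from its security library rather than comparing a raw key; the point is how little application code such a “thing” needs beyond its communications and security stacks.

```c
/* Minimal sketch of a door-lock "thing": verify a received key, then
 * drive the lock.  All names and the key format are hypothetical. */
#include <stdbool.h>
#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

#define KEY_LEN 16

/* Key provisioned at install time (illustrative value only). */
static const uint8_t stored_key[KEY_LEN] = {
    0x00, 0x11, 0x22, 0x33, 0x44, 0x55, 0x66, 0x77,
    0x88, 0x99, 0xAA, 0xBB, 0xCC, 0xDD, 0xEE, 0xFF
};

/* Constant-time comparison so timing doesn't leak how many bytes matched. */
static bool keys_match(const uint8_t *a, const uint8_t *b)
{
    uint8_t diff = 0;
    for (size_t i = 0; i < KEY_LEN; i++)
        diff |= (uint8_t)(a[i] ^ b[i]);
    return diff == 0;
}

/* Hypothetical actuator driver. */
static void drive_lock(bool open)
{
    printf(open ? "unlocking door\n" : "door stays locked\n");
}

/* Called when the radio stack delivers a key from a phone or fob. */
static void on_key_received(const uint8_t *key, size_t len)
{
    drive_lock(len == KEY_LEN && keys_match(key, stored_key));
}

int main(void)
{
    uint8_t wrong_key[KEY_LEN] = { 0 };
    on_key_received(stored_key, KEY_LEN);   /* correct key: unlock */
    on_key_received(wrong_key, KEY_LEN);    /* wrong key: stay locked */
    return 0;
}
```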