A VLIW Processor Bids to Dominate the Augmented Reality/Virtual Reality Market

By: Jonah McLeod

If history is any guide, each major next-generation consumer device (the PC, the smartphone, and whatever comes next) will have its own computing architecture. The x86 chip dominated the PC, while the ARM core architecture conquered the smartphone market. Two questions now arise: what next-generation device will drive CPU unit volumes in the future, and what CPU core will become the dominant architecture in that device?

The answer most often given to the first question is IoT, but that response lacks specificity, since IoT encompasses everything from health and fitness monitors to smart doorbells, security cameras, and more. One market that has the look and feel of a next-generation hit is augmented/virtual reality. The market research firm Digi-Capital has developed an AR/VR business model that predicts a market worth $120B by 2020, of which around 40 percent, roughly $48B, will be hardware sales. For perspective, the Apple iPhone was introduced in 2007 and four years later had sales of $45B; the trajectory appears similar.

The AR/VR market differs from the smart phone/device market in that it lacks a major vendor, such as Apple Inc., defining the market and the hardware architecture that will become dominant. The AR/VR market includes major players (Microsoft, Facebook, Google, Sony, Samsung, and HTC Corp., with more coming), but not one of them has set the standard.

Head-mounted displays such as Google Glass and the HTC Vive are the major hardware components of the AR and VR markets, respectively. At the Embedded Vision Conference, held May 2 through 4 at the Santa Clara Convention Center, much of the buzz was about which processors are best suited to the vision processing these new devices demand.

I listened to pitches promoting the graphics processing unit (GPU) and the very long instruction word (VLIW) processor as the best compute engine for image processing, a major requirement in augmented reality (AR) applications. The Myriad 2 Vision Processor from Movidius, Inc. of San Mateo, Calif., with its 12 on-chip vector processors delivering teraflops of performance on a watt of power, struck me as a strong candidate precisely because of that ability to deliver high performance on a modest power budget.

In an AR application in which the user wears glasses, the requirement is to capture the physical space being viewed, process all the elements comprising that space, and provide the wearer information about the scene; for example, the wall of an art gallery with the wearer viewing a painting by El Greco. Recognizing what is in view is achieved through convolutional neural networks (CNNs), which involve a large number of matrix multiplications. A VLIW processor uses instruction-level parallelism (performing several multiplications at once) to minimize time-consuming and power-hungry memory accesses.
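To make the connection between CNNs and matrix multiplication concrete, here is a minimal sketch (Python/NumPy, purely illustrative and not Movidius code) of how a 2-D convolution reduces to one large matrix multiply via the standard im2col trick, the kind of regular, parallel arithmetic a VLIW vector engine is built to exploit:

```python
import numpy as np

def im2col(image, k):
    """Unroll every kxk patch of a 2-D image into one row (im2col)."""
    h, w = image.shape
    rows = []
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            rows.append(image[i:i+k, j:j+k].ravel())
    return np.array(rows)

def conv2d_as_matmul(image, kernel):
    """2-D convolution (cross-correlation form, as used in CNNs)
    expressed as a single matrix multiplication."""
    k = kernel.shape[0]
    h, w = image.shape
    out = im2col(image, k) @ kernel.ravel()   # one big matmul
    return out.reshape(h - k + 1, w - k + 1)

image = np.arange(16, dtype=float).reshape(4, 4)
kernel = np.ones((3, 3)) / 9.0                # 3x3 box blur
print(conv2d_as_matmul(image, kernel))
```

Once every patch is a row of a matrix, all the multiply-accumulates become independent and can be issued in parallel across vector lanes, with each patch's data touched only once.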

The other function in an AR application is determining what the wearer's eye is viewing. By detecting the position of the eye together with the scene before the viewer, the AR application can provide the detailed information that makes AR valuable. The Movidius VLIW processor consists of 12 VLIW engines that can be engaged for image processing, detecting the wearer's eye position, or other compute-intensive functions. The 12 engines are tied together via a 2-Mbyte intelligent memory fabric that provides deterministic data locality, to minimize memory accesses, and an address map for easy programming.

This generation of compute engines seeking to become the de facto standard in the next high-volume end application has to accommodate a wide range of packaged software performing the image, vision, and sensor (gyroscope, accelerometer, altimeter, etc.) functions. Thus, not only does the architecture need to provide compelling performance and power capability, it needs a software development environment for porting the software modules the end application demands, along with the compiler efficiency to most effectively exploit the hardware architecture, i.e., to deliver the most performance with the least power consumption.

Movidius claims to have this combination of architecture and software development environment in its Myriad Development Kit. The proof will come in the form of design wins. As with most great successes, being the CPU architecture in the product that suddenly catches fire and leaves the competition struggling to catch up trumps all other marketing and technical considerations.

Making the 8051 Secure from Hacking in the Smart Home Internet of Things

By Jonah McLeod, Silicon Valley Blogger

Jauher Zaidi, Chairman and Chief Innovation Officer of Palmchip Corp., based in Temecula, Calif., is bullish on the 8-bit 8051 CPU core for applications in the smart home Internet of Things (IoT). This is a market that Parks Associates predicts will grow from 25 million units this year to nearly 36 million units in 2017. Zaidi makes the case for why the 8051 is the odds-on favorite to win seats in this lucrative, growing market. He also explains why security is an integral part of the final solution and how his company is contributing to making that solution more secure.

Parks Associates details the IoT elements in the smart home that are seeing growth. “Units of smart home devices include smart thermostats, networked cameras, smart door locks, smart water leak detectors, smart smoke detectors, smart carbon monoxide detectors, and smart light bulbs, smart light switches, smart plugs and outlets, and smart power strips,” declared the Parks Associates white paper “Smart Home Ecosystem: IoT and Consumers.” The white paper stated that smart home devices have processing intelligence and Internet connectivity through a home network for remote access, monitoring, and control capabilities.

Zaidi stated that the simple, well-understood, and long-serving 8-bit 8051 provides the processing intelligence and connectedness for many IoT devices. He cited the smart LED light bulb as an example. The 8051 performs the bulb's straightforward on-off or dim function; the bulb's other requirement is to run the communications protocol, whether wireline, Zigbee, or Bluetooth. In operation, the homeowner, on installing the light bulb(s), connects them to the network just as he would any wireline or wireless device, by pairing the device with the router. Thereafter, the homeowner controls the devices through the cloud or WiFi using a smart phone, tablet, and/or PC.

Back in mid-2011, Greenvity Communications Inc., a Milpitas, Calif.-based startup, licensed Palmchip's AcurX51 Smart Grid Platform. The platform included an 8051 CPU core and a PalmSecure Engine that provides secure data communication between gateway and device with AES secure key encryption. Any command coming to the 8051 presents a key that is validated against a key stored within the 8051. The random key is changed at intervals by the cloud service behind the smart home offering, providing an extra layer of security. Zaidi said the 8051 is still the preferred choice in half of the designs going into smart home IoT applications, such as thermostats, door locks, garage door openers, washing machines, dishwashers, refrigerators, and microwaves. The low price of the CPU core, its ability to operate on low power, and its small silicon footprint make it a compelling choice, he asserted.
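The validate-then-rotate flow described above can be sketched in a few lines. This is an illustrative model only: HMAC-SHA256 stands in for Palmchip's AES engine, and the class and message names are invented, not the actual PalmSecure protocol:

```python
import hmac, hashlib, os

class SecureDevice:
    """Toy model of the validate-then-execute flow: a command is
    accepted only if its tag was computed with the current shared key."""
    def __init__(self, shared_key: bytes):
        self.key = shared_key

    def rotate_key(self, new_key: bytes):
        # The cloud service pushes a fresh random key at intervals.
        self.key = new_key

    def execute(self, command: bytes, tag: bytes) -> bool:
        # Validate the presented tag against the key stored on-chip.
        expected = hmac.new(self.key, command, hashlib.sha256).digest()
        return hmac.compare_digest(expected, tag)

key = os.urandom(16)
device = SecureDevice(key)
cmd = b"DIM 50%"
tag = hmac.new(key, cmd, hashlib.sha256).digest()
print(device.execute(cmd, tag))       # tag matches the current key
device.rotate_key(os.urandom(16))
print(device.execute(cmd, tag))       # stale tag after rotation: rejected
```

The point of the periodic rotation is exactly what the second call shows: a captured command-and-tag pair stops working once the key changes.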

Greenvity is typical of the new breed of company building chip solutions for the new smart home IoT market. Founded in 2011 to focus on smart LED lighting, controlled street lighting, home and building automation, IoT sensors, smart meters, and automotive, the company in 2012 rolled out its Hybrii-XL GV7011, containing the Palmchip CPU core and security IP, a chip no bigger than a U.S. dime and about the same thickness. In January this year, Greenvity Communications and Mitsumi Electric partnered to develop and manufacture a complete IoT solution, “…modules, software, mobile apps and IoT cloud, that enable customers to significantly reduce time-to-market and development costs for energy-efficient, smart lighting applications,” said Greenvity CEO Hung Nguyen.

Greenvity faces stiff competition from giants Qualcomm Atheros and Broadcom. However, as a measure of how lucrative this market is, Greenvity has attracted venture financing from Vietnam-based DFJ VinaCapital as well as corporate partners including Mitsumi Electric. With chips ready to ship and a corporate partner able to help provide a complete cloud-based solution, all that remains is for OEMs to build the home products that will proliferate the technology.

Smart Fabric, Not Wrist Bling, To Lead Wearable Market Growth?

By: Jonah McLeod, Silicon Valley Blogger

There’s incredible hype surrounding the wearable market, and most of it is aimed at the wrist, where activity monitors continue to evolve from crude step counters into something much more tricked out. The ultimate example is certainly the $350-ish Apple Watch, slated for release in the spring if rumors are to be believed. In his article “Wearable market set to explode,” author Dan Cook declares, “the evolution of products designed to measure heart rate, blood pressure, weight and body fat, and to track workouts, will move toward smartwatches.” But there are other opportunities being overshadowed by the designer bling, and they center around smart fabric.

Consider the measurement opportunities for smart fabric. In the insoles of shoes, smart fabric can sense running style, pronation, gait, contact order, and fit. Covering the head in a skull cap, smart fabric can tell not only how hard a player is hit but where on the head the force occurred. Inside a boxer's gloves, smart fabric can tell how effectively a punch connected with an opponent and, inside a batter's glove, the behavior of his grip throughout a swing. Sewn onto the arm of clothing, smart fabric can provide the same kinds of controls as buttons on a smart phone: answer a call, adjust the volume, advance a track or go back a track.

Of these applications, running is the one with the greatest market potential for smart fabrics. According to the report issued June 15, 2014, entitled “2014 State of the Sport – Part II: Running Industry Report,” published by Running USA, “Since 2004, total running/jogging participation (run/jog 6+ days/yr) has increased 70% to a record of nearly 42,000,000, according to the NSGA. Females in the 25-34 age group category lead participation totals with more than 5.6 million in 2013, and since 2012, according to NSGA, more women run than men in the USA (both genders are at record highs).” Furthermore, a typical running shoe is retired after 300 to 500 miles of use; at 5 miles a day, that's a new pair every two to three months, far more often than that $350 wristwatch gets replaced.

One company poised to capitalize on this performance-crazed market is Berkeley, Calif.-based BeBop Sensors. BeBop's proprietary Monolithic Fabric Sensor Technology integrates all of the sensors, traces, and electronics into a single piece of fabric, enabling increased sensitivity, resolution, range of deployment, and robustness in a practical size. The sensors in the fabric accurately detect pressure, bend, location, rotation, angle, and torsion, enabling the creation of a 3D representation of these forces. This produces meaningful feedback for athletes seeking to eke out the milliseconds of performance improvement they hunger for. The watch will tell you how fast your heart is beating, your temperature, and the oxygen level of your blood: great if you're concerned about your health, less useful for getting ahead of the game.

Company founder Keith McMillen developed the BeBop fabric technology for his musical instruments company, Keith McMillen Instruments. The fabric performs the same touch function as the touch sensors in a smart phone or tablet. What is unique in BeBop's smart fabric is its ability to provide a 3D profile of the applied force. For example, the smart fabric beneath the keys of the QuNexus keyboard produces a unique MIDI CC number for every key that is struck. In addition, the fabric sensor detects a key being held after being struck, as well as the orientation of the force on the key in 3D space, with a unique sound resulting from the force and its orientation.

All of this capability applies equally well when the smart fabric is worn in a running shoe. The 3D force profile can be used to compute ground reaction force (GRF), the shock wave, or force, that occurs when a runner's foot strikes the ground. GRF comprises vertical, anterior-posterior, and medio-lateral components. Researchers speculate that which part of a runner's foot (front, rear, or middle) contacts the ground in relation to the body's center of mass is key to performance. Having an insole sensor that pings the runner when he or she hits the sweet spot on each stride could provide a real competitive advantage: in effect, a coach providing continuous advice in real time during a race.
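As a rough illustration of what an insole sensor's firmware might compute, the sketch below derives the resultant GRF magnitude from its three components and classifies the foot strike from the forefoot/rearfoot load split. The function names, thresholds, and force values are invented for illustration, not a validated biomechanical model or BeBop's algorithm:

```python
import math

def resultant_force(fx, fy, fz):
    """Magnitude of the ground reaction force from its components
    (anterior-posterior fx, medio-lateral fy, vertical fz), in newtons."""
    return math.sqrt(fx * fx + fy * fy + fz * fz)

def strike_type(front_load, rear_load):
    """Classify the foot strike from the share of load on forefoot vs.
    rearfoot sensors at initial contact (illustrative thresholds)."""
    share = front_load / (front_load + rear_load)
    if share > 0.6:
        return "forefoot"
    if share < 0.4:
        return "rearfoot"
    return "midfoot"

# One simulated foot strike: mostly vertical load, slight braking force.
print(resultant_force(-120.0, 30.0, 1600.0))
print(strike_type(front_load=900.0, rear_load=300.0))
```

The "ping on the sweet spot" idea from the paragraph above would then be a simple comparison of `strike_type` against the runner's target on every stride.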

The fabric is still looking for the OEM that will add the hardware and software to turn this potential into a final product. As for cost: in a typical wearable application such as the athletic shoe, if Fitbit is any example, the hardware and software run around $17.36. Included in that bill of materials, BeBop's insole fabric solution will be competitive, the company asserts. Not this year, but likely in 2015, expect this to be flying off the shelves.

What CPU Core Will Prevail as IoT Drives Microcontroller Growth?

By: Jonah McLeod, Silicon Valley Blogger

The Goldman Sachs Equity Research note “The Internet of Things: Making sense of the next mega-trend” is bullish on IoT deployment for industrial automation. It declared, “the global industrial sector is poised to undergo a fundamental structural change akin to the industrial revolution as we usher in the IoT.” The report cites Verizon saving 55 million kWh of electricity through IoT applications as an example of the IoT's impact on internal operations and processes. What's enabling IoT adoption? The report cites “cheap sensors, cheap bandwidth, cheap processing, smartphones, ubiquitous wireless coverage, big data, and IPv6.”

Given that the cost of enabling IoT is so low, where is the opportunity for semiconductor companies? The report highlights sensors and microcontrollers as the categories of chips poised to profit. It shows sensors having a compound annual growth rate (CAGR) of 5 percent from 2011 to 2013, while the semiconductor industry as a whole managed a 0 percent CAGR. Microcontrollers fared even better, growing at an 11 percent CAGR from 2006 to 2013, against a 3 percent CAGR for the semiconductor market over the same period. Considering that microcontrollers have been around for over 40 years, are their CPU architectures powerful and cost-effective enough to serve this fast-emerging opportunity? Or is a new architecture, like the one Hsinchu, Taiwan-based Andes Technology Corp. rolled out in 2005, better suited?

What’s driving the demand for microcontrollers in the IoT? One application is the LED light bulb. According to a 2013 WinterGreen Research report, “LED Lighting: Market Shares, Strategies, and Forecasts, Worldwide, 2013 to 2019,” the LED lighting market will grow 45 percent per year, from $4.8 billion in 2012 to $42 billion by 2019. At around $10 for the Cree BA19-08027OMF dimmable A19 LED, that means billions of bulbs with microcontrollers over the next five years. Next-generation bulbs like the Alba from Cupertino, Calif.-based Stack Labs, Inc. will surround the microcontroller with sensors plus Bluetooth, Zigbee, and iBeacon hardware, allowing the bulbs to react autonomously once installed.

History as an indicator of the future suggests that every discontinuity represents an opportunity for new invention. In the computer industry, the CPU architectures that succeeded drove MIPS and MHz (x86, PowerPC, SPARC, and MIPS, among others) and were sold as packaged parts. With the arrival of the mobile phone, the architecture that ascended offered performance with low-power operation in the form of IP blocks that could be built into a system on chip, with specialized processors surrounding the applications processor, one of many ARM variants.

The Internet of Things changes the computing requirement dramatically. Performance is less important, but power is critical, especially for sensors working off battery power, for example those providing intrusion, motion, and environment detection. This requirement would seem to favor low-cost architectures such as the ubiquitous 8051. However, the need to handle sensor processing as well as the communications stack calls for a 32-bit architecture such as ARM's Cortex-M. According to Wikipedia, the Cypress Semiconductor PSoC 4, Infineon Technologies XMC1000, NXP LPC1100 and LPC1200, Nordic Semiconductor nRF51, and STMicroelectronics STM32 F0 microcontrollers are all built on the Cortex-M.

Until recently, there had not been a new commercially available CPU architecture since the late 1990s: Tensilica's Xtensa configurable processor started in 1997, and the Argonaut Technologies Limited (ATL) ARC configurable processor began life in 1998. Both are now owned by EDA giants (Cadence and Synopsys), not CPU companies like ARM and Intel, with the will and resources to evolve their architectures over time. As a historical note, the Acorn RISC Machine project, which led to today's ARM processor, began life in October 1983. That project borrowed aspects of the venerable 6502, the original CPU in the early Apple computers. An early ARM processor inside an LSI Logic-built ASIC powered the Apple Newton, an early personal digital assistant.

In March 2005, Andes Technology Corp. set out to build a next-generation CPU architecture with a design team that included talent from AMD, DEC, Intel, MIPS, nVidia, and Sun. The architecture was tailored to the computational requirements of the portable electronics then coming on the market: most notably the Apple iPod, introduced in October 2001, but also the many MP3 and multimedia players being introduced. By 2005, after a slow start, the iPod was selling in the tens of millions of units.

The Andes core instruction set combines 16-bit and 32-bit mixed-length instructions for improved performance, code density, and power efficiency. It targeted the system requirements of a portable device with a small display, wheel-based user input, a high-speed USB I/O connector, and mass storage (at first a mini hard drive, and by 2005 flash memory). It also includes power management instructions and an interface protocol to simplify switching among different SoC power and performance operating modes.

It turned out that what the company designed for battery-operated portable devices is also ideally suited to an Internet of Things device, which may perform one or a few very simple functions: e.g., verify a key and open or lock a door. In addition, the processor needs enough computing power to handle the communications protocols (Bluetooth, WiFi, and Zigbee, among others) that connect the “thing” to the network, and to run the security software increasingly being added to these devices. Andes' business model, constructed for the constraints of cost-conscious Asian customers, may be best suited to the tight margins demanded by IoT “things.” With over 100 license agreements and 500 million cores shipped, the company may be poised to be a force in the Internet of Things market.

Ambiq Aims to Give Wearables a 10-Fold Reduction in Power Consumption

By: Jonah McLeod, Silicon Valley Blogger

The wearable device is in its infancy as a new product category, with only a few years of product shipments. But the market is on track for significant growth. According to recently published data from ABI Research in Oyster Bay, New York, the global market for smart wearables is forecast to grow from 90 million units this year to 164 million units next year. The forecast includes wearable cameras, smart glasses, smart watches, health-monitoring devices, activity trackers, motion trackers, and smart clothing.

As with any infant market, the initial products are crude compared to the models to come. Contrast the Sony Walkman with the MP3 player that replaced it years later, eliminating many of the older unit's drawbacks: size, weight, and power consumption. One drawback of today's wearables that everyone can point to is battery life. For example, Jawbone boasts that it has doubled the battery life of its UP24 wristband from one week to two. While that's a considerable improvement, it's insignificant when contrasted with the battery life of two to five years for a Movado quartz watch.

Ambiq Micro is an Austin, Texas-based fabless semiconductor start-up, formed in 2010, that wants to make the kind of improvement in battery life that will bring wearables closer to what consumers expect from quartz watches, garage door openers, and auto key fobs. Ambiq Vice President of Marketing Mike Salas says the company has developed new technology that will provide a 10-fold improvement in battery life. “And we've developed and shipped a real-time clock chip that lives up to the claim,” he asserts. Ambiq's larger goal is to create an ultra-low-power microcontroller-based infrastructure that leverages its patented technology to bring that 10-fold improvement in battery life to wearables as well as to devices comprising the Internet of Things.

What’s making all of this possible is sub-threshold voltage technology that Ambiq’s co-founders Dennis Sylvester, David Blaauw, and Ambiq CTO and VP of Engineering Scott Hanson, Ph.D. developed at the University of Michigan beginning in 2004.  “Today standard logic CMOS circuits switch transistors between 0v and 1.8v,” Salas explains. “Ambiq’s sub-threshold voltage technology switches standard logic CMOS transistors between 0v and 0.5v. And the technology leverages the leakage current in logic CMOS to accomplish the switching. It’s not possible to build a GHz microcontroller with this technology, but in the wearables and IoT market, power trumps performance.”
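The arithmetic behind Salas's comparison is easy to check. To first order, dynamic CMOS switching power scales as P = C·V²·f, so cutting the voltage swing from 1.8 V to 0.5 V reduces switching energy per transition by roughly (1.8/0.5)², before accounting for leakage or operating frequency:

```python
# First-order dynamic CMOS switching power: P = C * V^2 * f.
# Back-of-the-envelope ratio from dropping the swing 1.8 V -> 0.5 V.
v_nominal = 1.8        # volts, standard logic swing
v_subthreshold = 0.5   # volts, Ambiq's sub-threshold swing
ratio = (v_nominal / v_subthreshold) ** 2
print(f"~{ratio:.1f}x lower switching energy per transition")
```

That roughly 13x figure is consistent with the 10-fold battery-life claim once leakage and other overheads eat into the ideal quadratic savings.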

To reliably switch transistors at 0.5v is no mean feat. “There is always noise in the sub-threshold voltage domain, which is insignificant if the transistor is switching between 0v and 1.8v,” Salas says. “But with a swing of only 0.5v, noise becomes something that could be misconstrued as signal. In addition, for any CMOS process node there is a center that designers develop around, but the process can drift, creating the need to compensate in the low-voltage domain. The same holds true for temperature, ranging from -40 to 85 degrees C. Ambiq's patented technology includes adaptable circuitry that adjusts for and accounts for this noise, temperature drift, and process variability.”

To achieve their larger goal of more power-efficient system designs, the company is beginning to partner with sensor and radio chip suppliers, display and battery manufacturers, software vendors, cloud suppliers and app makers as well as tooling companies and automatic test equipment vendors. “The goal is to create a low-power infrastructure that produces the most energy efficient solution for the OEM,” Salas explains. “For example, the partnership with a tooling vendor might develop models of the discharge characteristics of various batteries or energy harvesting solution to determine power consumption in an Ambiq-based system with different radios, sensors, and displays. The partnership with the ATE vendor would create testing methods to effectively test logic circuits switching at much lower 0.5v. With software vendors the partnership might include determining how best to develop software to leverage the low-power profile of the Ambiq hardware it’s executing on.”

Using its sub-threshold voltage technology, the company is designing a microcontroller chip around a licensed ARM CPU core, which it will sell through the established semiconductor component channels: direct to major OEMs and through distributors to smaller OEMs. Ambiq's sub-threshold voltage microcontroller by itself will deliver considerable power savings to a wearable or IoT OEM, but involving an infrastructure of suppliers able to exploit the low-voltage computing element will deliver even greater savings. Ambiq plans to begin shipping its ARM-based microcontroller toward the end of the year. Already, wearable and IoT OEMs are lining up to sample the low-power compute engine using an Ambiq-supplied FPGA platform.

IoT Gets Its Own Messaging Protocol Standard, MQTT

By: Jonah McLeod, Dir. of Corp. Mkt. Comm. at Kilopass Technology Inc.

Anyone creating applications for the Internet of Things (IoT) will soon have a standard messaging protocol: the Message Queue Telemetry Transport (MQTT). Created in 1999 by Dr. Andy Stanford-Clark of IBM and Arlen Nipper of Arcom (now Eurotech), MQTT began life as a means for hobbyists to automate their homes. (See Dr. Stanford-Clark's YouTube TED Talk for more.) MQTT 3.1.1 entered a 60-day public review period beginning July 7, 2014, in preparation for a member ballot to consider its approval as an OASIS Standard. The review completes September 4, 2014.

MQTT enables the equivalent of a social network for things in that it provides a publisher-subscriber model of communications. Consider the sensors in an earthquake early warning system. Each time a sensor (a client) detects earth movement, it publishes its location and reading under a “topic” to a server, or message broker, and anyone who subscribes to that topic is notified. This example illustrates the features of MQTT that make it well suited as an IoT communications protocol. First, it was designed as a lightweight protocol that works over lossy or constrained cellular or satellite networks, where connections can be problematic.
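The publisher-subscriber model is simple enough to sketch in a few lines. The toy broker below models only MQTT's topic semantics (no wire protocol, no QoS, no wildcard matching), with the earthquake sensor as the publishing client; all names are illustrative:

```python
from collections import defaultdict

class Broker:
    """Minimal in-memory publish/subscribe broker: clients that
    subscribe to a topic are called back on every publish to it."""
    def __init__(self):
        self.subscribers = defaultdict(list)   # topic -> list of callbacks

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, payload):
        for callback in self.subscribers[topic]:
            callback(topic, payload)

broker = Broker()
alerts = []
# A subscriber interested in one seismic sensor's readings.
broker.subscribe("quake/sensor42", lambda t, p: alerts.append((t, p)))
# The sensor (publishing client) reports earth movement to the broker.
broker.publish("quake/sensor42", {"lat": 37.4, "lon": -122.1, "magnitude": 3.2})
print(alerts)
```

Note that publisher and subscriber never talk to each other directly; the broker's indirection is what lets either side drop off a lossy network without the other caring.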

In its simplest form, the protocol requires only two bytes. A 4-bit field allows for 16 message types, of which 14 are defined (two values are reserved). Five deal with the connection to the message broker (server): two establish and acknowledge the connection, one disconnects from the broker, and the remaining two provide a keep-alive function for the connection (a ping request to the broker and a ping response to the client).
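For packets whose remaining length fits in one byte, the two-byte fixed header can be assembled directly: message type in the high nibble of the first byte, flags in the low nibble, then the remaining length. The sketch below follows the MQTT 3.1.1 layout but omits the variable-length encoding used for larger payloads:

```python
# A few of MQTT's message type codes (MQTT 3.1.1).
CONNECT, CONNACK, PUBLISH, PINGREQ, DISCONNECT = 1, 2, 3, 12, 14

def fixed_header(packet_type: int, flags: int, remaining_length: int) -> bytes:
    """Build the two-byte MQTT fixed header for payloads under 128 bytes:
    byte 1 = type (high nibble) | flags (low nibble),
    byte 2 = remaining length (longer lengths need MQTT's
    variable-length encoding, omitted here for brevity)."""
    assert 0 < packet_type < 15 and 0 <= remaining_length < 128
    return bytes([(packet_type << 4) | (flags & 0x0F), remaining_length])

header = fixed_header(PINGREQ, 0, 0)
print(header.hex())   # c000: a complete PINGREQ is just these two bytes
```

A keep-alive ping really is the whole packet: two bytes on the wire, which is why the protocol suits constrained satellite or cellular links.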

Five message types deal with clients that publish topics to the message broker, using three levels of QoS (quality of service): 0, 1, and 2. At QoS 0, the client publishes a topic without acknowledgement, assuming that as long as there is a network connection, the message broker received the topic. At QoS 1, the client publishes a topic and receives an acknowledgement from the broker.

QoS 2 adds extra handshakes to spare the client from republishing the topic until acknowledgement. At this level, the client publishes a topic to the message broker, which responds with a publish-received message. The client then issues a publish-release confirmation to the broker, and the broker answers with a publish-complete response. In this manner the broker acts as an intermediary between publishing clients and subscribing clients, the equivalent of a social network provider for things.
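The exactly-once guarantee of QoS 2 comes from the receiver holding the packet identifier between the publish and the publish-release, so a retransmitted publish is not delivered twice. A minimal sketch of the receiving side (illustrative only, not the full protocol state machine):

```python
class Qos2Receiver:
    """Receiving side of MQTT's QoS 2 handshake: the packet id is held
    from PUBLISH until PUBREL, so a re-sent PUBLISH (e.g. after a lost
    PUBREC) is not delivered to subscribers a second time."""
    def __init__(self):
        self.pending = set()     # packet ids received but not yet released
        self.delivered = []

    def on_publish(self, packet_id, payload):
        if packet_id not in self.pending:
            self.pending.add(packet_id)
            self.delivered.append(payload)   # deliver exactly once
        return "PUBREC"                      # publish-received

    def on_pubrel(self, packet_id):
        self.pending.discard(packet_id)      # id can now be reused
        return "PUBCOMP"                     # publish-complete

rx = Qos2Receiver()
rx.on_publish(7, "temp=81F")
rx.on_publish(7, "temp=81F")   # retransmission: deduplicated
rx.on_pubrel(7)
print(rx.delivered)            # ['temp=81F']
```

The second leg of the handshake exists purely so both sides can safely forget the packet id; until the release, the receiver must remember it to suppress duplicates.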

The remaining four message types deal with the subscriber. Two, from client to broker, subscribe to or unsubscribe from a topic. Two, from broker to client, acknowledge the subscribe and the unsubscribe.

MQTT was designed to accommodate the special needs of mobile applications, in contrast with HTTP, which was designed for the wired Internet. For example, in his TED Talk, Stanford-Clark describes an application that provides commuters with updates on ferry service in the English Channel. The application publishes the arrivals and departures of ferries at all the ports in the service area: an example of small amounts of data published continuously throughout the day, from many publishing clients (the ferries) to large numbers of subscribing clients (the commuters).

Pipelines are another example, one that illustrates another feature of MQTT: its ability to scale. In his presentation “MQTT and Java” at the QCon New York software development conference in June 2013, Peter Niblett, IBM Senior Technical Staff Member, cited a pipeline as giving rise to MQTT. Niblett explained that the pipeline had 4,000 sensor nodes and had to be expanded two-fold. The existing network was built on polling, in which every sensor was queried for its status and a response was issued; it took on the order of 20 minutes to interrogate the entire pipeline. Doubling the number of sensors would mean over an hour to perform the task.

The solution Niblett described changed the polling network into an exception-reporting system, since much of the sensors' status (temperature, pressure, control valve settings, etc.) changed only when oil was present. Oil, like cargo, doesn't flow continuously but rather in response to orders; and like cargo, the progress of a shipment along its path of travel is important to monitor continuously. Using the exception system built around an early version of MQTT, the pipeline network could easily accommodate growth over time. It also reduced network traffic on the satellite link connecting the pipeline network to the back-office servers.
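A back-of-the-envelope traffic comparison shows why report-by-exception scales where polling does not. The sensor count below follows the doubled pipeline in the story; the polling period and change rate are invented for illustration:

```python
def polling_traffic(sensors: int, cycles: int) -> int:
    """Poll-and-respond: every sensor costs a query plus a reply
    on every polling cycle, whether or not anything changed."""
    return sensors * cycles * 2

def exception_traffic(changes_per_cycle: int, cycles: int) -> int:
    """Report-by-exception: only sensors whose state changed send
    a message; idle sensors generate no traffic at all."""
    return changes_per_cycle * cycles

sensors, cycles = 8000, 72    # doubled pipeline, one cycle every 20 minutes
print(polling_traffic(sensors, cycles))    # messages per day under polling
print(exception_traffic(40, cycles))       # messages per day, ~40 changes/cycle
```

Under these assumed numbers the satellite link carries 1,152,000 messages a day when polled versus 2,880 by exception, and, crucially, adding idle sensors costs nothing in the second model.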

Niblett cited a number of emerging applications that will be built around MQTT, including automobile companies managing fleets of leased vehicles and, ultimately, the connected car. Other applications include home monitoring for health and safety, financial notifications, asset tracking for common carriers, and supply chain management. With a robust networking protocol like MQTT becoming a standard this year, the Internet of Things is set to enjoy the explosive growth that market analysts are predicting.

Bluetooth Low Energy a Market Catalyst with a Major Security Flaw

By: Jonah McLeod, Dir. of Corp. Mkt. Comm. at Kilopass Technology Inc.

The surge of interest in wearable devices has put a spotlight on Bluetooth Low Energy (LE), the communications mechanism that enables these devices to connect to the user's mobile phone. The “LE” suffix is important because it is what lets these devices operate for a week or more on a single battery charge. As a result, a wave of mobile-phone-linked devices is beginning to flood the market, precipitated by the component chain that has sprung up around Bluetooth LE: semiconductor chips, development boards, and software stacks. It is thus possible for a product developer to write a software application such as heart rate monitoring, create a package to hold the device, and start marketing and selling the product. However, in the wild west of Bluetooth LE, security is the one element in the chain everyone seems to be ignoring.

At the origin of this chain are the semiconductor chips. Dave Bursky provided a list of the more recent offerings in his Chip Design Magazine article Wearable Technologies Meet Bluetooth Low Energy. “At the recent Bluetooth World Conference held in San Jose, Calif.,” Bursky wrote, “Broadcom, CSR, Dialog Semiconductor, EM Microelectronic, Nordic Semiconductor and Texas Instruments all highlighted their BT Smart (low-energy) solutions. One of the newest solutions, the DA14580 from Dialog, is a highly integrated Bluetooth chip that incorporates an ARM® Cortex®-M0 32-bit processor core to handle both control operations as well as executing the Bluetooth software stack, thus eliminating the need for a second microcontroller.”

With this complete chip solution, the OEM writes the software for the end product: a fitness monitor, heart and respiratory rate checker, or continuous glucose monitor, among others. Even this can be outsourced to companies such as Elektrobit Corp., which provides rapid prototypes for market validation and turnkey engineering services from early engineering through after-market support. The company has developed several wearable product concepts and prototypes ranging from wrist-worn devices to head-mounted products.

Before getting the product to market, the more difficult task is stopping hackers from stealing the software code to clone a particularly successful product, or from altering the product for a malicious purpose. And Bluetooth LE is very vulnerable to being hacked. Mike Ryan, an analyst with iSEC Partners, detailed at the CanSecWest conference on March 14, 2014 how easy it is to (1) sniff a Bluetooth LE connection, (2) connect to it, (3) dump the Host Controller Interface (HCI) commands, (4) disassemble the code, and (5) clone the device.

To find a vulnerable connection, hackers can use the Ubertooth sniffer, an open source Bluetooth test tool developed by Michael Ossmann. The hardware and software can be purchased for less than $200. The sniffer detects a Bluetooth LE transmission and enables the user to determine its frequency-hopping sequence. Then, Ryan says, the “crackle” tool breaks the Bluetooth LE encryption by exploiting a flaw in the pairing mechanism that leaves all communications vulnerable to decryption by passive eavesdroppers.

Other easily accessible tools that Ryan featured in his CanSecWest 2014 presentation include “gatttool,” a simple Linux tool used to manipulate the Bluetooth LE Generic Attribute (GATT) protocol. A third tool Ryan mentions is the hcidump utility, available in the Ubuntu package repositories. It enables monitoring of Bluetooth activity and can disassemble the Bluetooth traffic, displaying packets from higher-level protocols such as the radio frequency communication (RFCOMM) protocol, the service discovery protocol (SDP), and the Bluetooth network encapsulation protocol (BNEP).

In his CanSecWest 2014 presentation, Ryan detailed the three levels of Bluetooth LE pairing: (1) Just Works, (2) a 6-digit personal identification number, and (3) the more secure out-of-band key exchange. The first two are easily circumvented. The third requires more sophisticated methods to break, but all are within the realm of possibility using the crackle tool. Though Ryan has made this Bluetooth LE vulnerability public, the problem has yet to be addressed.
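
Why the 6-digit PIN level falls so easily comes down to keyspace arithmetic. The sketch below is illustrative only: the real crackle tool attacks the AES-based confirm values exchanged during legacy pairing, whereas here a hash function stands in for that computation. The point it demonstrates is that a passive eavesdropper who has captured the pairing exchange can try all one million candidate PINs in a fraction of a second.

```python
import hashlib

def confirm_value(pin: int, nonce: bytes) -> bytes:
    """Stand-in for BLE's AES-based pairing confirm function (illustrative only)."""
    return hashlib.sha256(str(pin).encode() + nonce).digest()

def brute_force_pin(observed_confirm: bytes, nonce: bytes) -> int:
    """Try every possible 6-digit PIN; the keyspace is only 10**6 values."""
    for candidate in range(1_000_000):
        if confirm_value(candidate, nonce) == observed_confirm:
            return candidate
    raise ValueError("no PIN matched")

# An eavesdropper who captured the nonce and confirm value from the
# pairing exchange recovers the PIN almost instantly:
nonce = b"captured-nonce"
observed = confirm_value(123456, nonce)
print(brute_force_pin(observed, nonce))  # 123456
```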

The OEM developing the next hot wearable device faces the twin problems of having its design cloned as soon as it ships and having its software hacked to perform functions it wasn’t meant to do. The prime example of the latter is detailed in the paper “Pacemakers and Implantable Cardiac Defibrillators: Software Radio Attacks and Zero-Power Defenses,” which describes the ease with which implantable medical devices can be monitored and their function altered. To address both problems, higher levels of encryption are needed. Getting the Bluetooth standards organization motivated to raise the level of encryption will require concerted pressure from OEMs participating in the group.

In addition, one-time programmable (OTP) antifuse memory, such as that supplied by Kilopass, has provided a secure storage mechanism for the encryption keys used by most major set-top box (STB) manufacturers. Applying to Bluetooth LE an encryption scheme similar to that used in STBs, and storing the key in tamper-proof OTP memory, could greatly improve Bluetooth LE security.
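
A minimal sketch of the STB-style idea, with hypothetical names throughout: a device-unique key is burned into OTP at manufacture, the factory signs the firmware image with it, and the boot code refuses to run an image whose authentication tag does not verify. An HMAC is used here as a simple, self-contained illustration of the principle; a production scheme would follow whatever cipher suite the STB ecosystem actually mandates.

```python
import hmac, hashlib, secrets

# Hypothetical device-unique key burned into tamper-proof OTP at manufacture.
OTP_KEY = secrets.token_bytes(16)

def sign_firmware(image: bytes, key: bytes) -> bytes:
    """Factory step: compute an authentication tag over the firmware image."""
    return hmac.new(key, image, hashlib.sha256).digest()

def secure_boot(image: bytes, tag: bytes, key: bytes) -> bool:
    """Boot step: run the image only if its tag verifies against the OTP key."""
    expected = hmac.new(key, image, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

firmware = b"heart-rate-monitor application code"
tag = sign_firmware(firmware, OTP_KEY)
print(secure_boot(firmware, tag, OTP_KEY))               # True: genuine image boots
print(secure_boot(firmware + b"-malware", tag, OTP_KEY))  # False: tampered image rejected
```

Because the key lives in antifuse OTP rather than readable Flash, a hacker who dumps the firmware still cannot re-sign a modified image for that device.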

Bluetooth Low Energy Chips Drive New End Markets by Leveraging Smartphone Displays, Internet Link

By: Jonah McLeod, Dir. of Corp. Mkt. Comm. at Kilopass Technology Inc.

There is a new wave of start-up companies leveraging Bluetooth Low Energy (LE) chips to build products for new and emerging markets. Besides well-funded activity monitors from Nike, FitBit and others, hundreds of start-ups are taking aim at the burgeoning opportunity. Nick Hunn of WiFore Consulting describes a number of them in his article “To Ubiquity and Beyond: Bluetooth Smart (LE) and the Growth of Appcessories.” He lists Green Goose and its sensor-enabled toys—including Teddo the Bear—and Parrot’s Flower Power moisture sensor for plants, among others. Many of these start-ups are bootstrap ventures financed by crowdfunding or angels, and in some cases are venture-backed. Summed together, the small market niches these companies serve add up to a good-sized, growing opportunity. According to the statistics portal Statista, revenue for the machine-to-machine market amounted to $32.8B in 2013 and is growing at an annual rate of 13.4 percent, reaching around $54.3B by 2017.

These young ventures share a common need for long battery life, on the order of months or years on a single coin-cell battery, and this requirement drives the architecture of the new devices. A power profile of any communicating device—keyboard, mouse, auto key fob, garage door opener—shows that the most power-hungry function is the communications link; hence the importance of Bluetooth Low Energy in reducing the power consumed by the transmitting radio, because for battery life, the less communication the better. To achieve the lowest power, what were once modules comprising an individual sensor, radio, and applications processor (the FitBit, for example) are giving way to modules combining a sensor with an integrated radio and a 32-bit microcontroller that handles both networking and application processing.
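
The arithmetic behind that requirement is simple duty-cycle math: battery life is the cell capacity divided by the time-weighted average of sleep and radio-on current. The figures below are illustrative, not taken from any datasheet, but they show why keeping the radio off matters far more than anything else in the design.

```python
def battery_life_days(capacity_mah: float, sleep_ua: float,
                      active_ma: float, active_ms_per_s: float) -> float:
    """Estimate coin-cell life from a radio duty cycle.

    Average current is the time-weighted mix of sleep current and
    active (radio-on) current; all figures are illustrative.
    """
    duty = active_ms_per_s / 1000.0
    avg_ma = active_ma * duty + (sleep_ua / 1000.0) * (1 - duty)
    return capacity_mah / avg_ma / 24.0

# CR2032-class cell (~220 mAh), 1 uA sleep, 10 mA radio bursts:
# transmitting 5 ms out of every second vs. 50 ms out of every second.
print(round(battery_life_days(220, 1.0, 10.0, 5)))   # ~180 days
print(round(battery_life_days(220, 1.0, 10.0, 50)))  # ~18 days
```

Cutting radio-on time tenfold stretches the same cell roughly tenfold, which is exactly the lever Bluetooth LE and local processing pull.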

In this highly integrated architecture, the local microcontroller processes data being collected and determines when to activate the radio and send the data to the network, thus saving power over the independently functioning separate chips. Semiconductor vendors are scrambling to fill the rapidly growing number of newly created sockets.  To better understand this transition, let’s consider the new products incorporating these more power efficient integrated devices, then examine the computing and communications resources within these components, and finally look at the chip vendors all competing to get a piece of this emerging market.

New end products coming to market include items as varied as baby activity monitors and multimeters, and all share one common characteristic. To understand this commonality, consider the baby monitor offered by start-up Rest Devices of Boston. Called Mimo, the product consists of an organic cotton kimono fitted with non-contact, machine-washable sensors that measure a baby’s respiration. When paired with the attachable Mimo Turtle, it monitors an infant’s respiration, skin temperature, body position, and activity level. All the data is sent to the Mimo Lilypad Base Station via Bluetooth LE. A package of three cotton onesies, the clip-on sensor pack, and a base station costs $199. As of this May, the startup had received more than $300,000 in pre-orders. Over 4 million babies are born in the U.S. each year, according to BabyCenter.com. A Best of CES 2014 Award winner, the company had no trouble raising $1.2 million in financing.

Now consider the story of Mooshim Engineering and its innovative Mooshimeter, a multimeter that transmits its measurements over Bluetooth LE to a smartphone for display. The advantage is that the meter can leverage the user interface of the ubiquitous smartphone while building more functions into the meter, such as measuring actual power use, power factor, and harmonic distortion with 24-bit resolution. Best of all, the device can be had for $120. After listing on Kickstarter with a funding goal of $52,000, Mooshim Engineering raised $110,639. Both examples pair a low-power communicating sensor with the ubiquitous smartphone’s display and more powerful processor to provide value to their customers.

Bluetooth LE is the catalyst for this new wave of products, and in combination with higher integration it delivers enhanced value to the customer. Where before a Bluetooth LE radio, an independent applications processor, and a sensor comprised the solution, the new generation integrates the Bluetooth LE radio with a 32-bit applications processor on the same die. The result is a single computing element that efficiently manages power while reducing silicon size and cost. By processing data locally, the integrated solution stores only relevant data—a rise in temperature, heart rate, or another parameter—and communicates it to a smartphone for additional processing. In the multimeter’s case, the device communicates with the smartphone to display data processed on the device. Architecturally, the new generation of chips comes with a low-power 32-bit ARM processor, analog mixed-signal functions that measure voltage and temperature, and an interface to external MEMS sensors: gyros, accelerometers, pressure transducers, microphones, and the list goes on.
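
The “store only relevant data” behavior the integrated microcontroller enables is sometimes called report-by-exception. A minimal sketch of the idea, with illustrative numbers: the local processor examines each sensor sample and wakes the radio only when a reading has moved meaningfully since the last transmission.

```python
def report_by_exception(samples, threshold):
    """Yield only readings that changed by more than `threshold` since the
    last transmitted value; for all other samples the radio stays off."""
    last_sent = None
    for sample in samples:
        if last_sent is None or abs(sample - last_sent) > threshold:
            last_sent = sample
            yield sample

# Seven skin-temperature samples, but only three radio transmissions:
temps = [36.5, 36.5, 36.6, 36.5, 37.4, 37.5, 38.1]
print(list(report_by_exception(temps, 0.5)))  # [36.5, 37.4, 38.1]
```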

The other architectural change is in the memory used by these chips: SRAM, ROM, and Flash. On next-generation chips, embedded one-time programmable (OTP) NVM, such as that licensed by Kilopass Technology, is displacing some of the ROM for flexibility and replacing the Flash for cost. The savings in silicon cost can be as high as 50 percent compared with memory that is only useful during software development and becomes uneconomical to rewrite in the field. This OTP NVM contains network and application program code as well as parametric and configuration data. OTP NVM enables chip vendors to build one device and then configure its personality at final test or in-system, thus serving many vertical markets with a one-chip design. OTP also gives the end customer the ability to securely store its application software, preventing easy counterfeiting of the product or malicious changes to the application program by a hacker.

Who are the major chip vendors competing to enable these emerging consumer products? A partial list consists of Broadcom, CSR, Dialog Semiconductor, LAPIS Semiconductor, NXP Semiconductors, Nordic Semiconductor, Texas Instruments, and Toshiba, as well as Qualcomm with publicized product plans. Since Bluetooth LE opens up a new category of applications, this market is a complete greenfield in which every player has a shot at gaining share. The criterion for winning is the lowest system power at the lowest cost per chip; the differentiation will be in the power and cost of the chip and in the product support that enables the end customer to quickly get a product to market.

Is Energy Conservation the Next California Gold Rush?

By: Jonah McLeod, Dir. of Corp. Mkt. Comm. at Kilopass Technology Inc.

Nothing epitomizes California more than a gold rush, which in recent decades took the form of the personal computer, then the smartphone, and most recently the social networking made possible by these devices. For some time, venture capitalists have been betting on energy as yet another “rush.” While many investors have concentrated on finding new sources of old energy (crude from shale, natural gas, and various forms of renewables), others have been betting on the other side of the energy equation: conservation.

This second group has been given a windfall of sorts in the form of Title 24, California Code of Regulations, 2013 Triennial Edition, which goes into effect on July 1, 2014. The elevator pitch is that Title 24 mandates that new commercial construction, and any remodeling affecting more than 10 percent of an existing structure, meet its new building codes. These codes require that all lighting, temperature controls, and plug loads be controllable to reduce building energy use. For example, rooms must have occupancy sensors so that lights automatically turn off when no one is present. Lighting must be dimmed in office space that has sufficient sunlight to illuminate it. Power plugs drawing small amounts of energy must be turned off. And the list goes on.

Companies betting on building automation using Internet of Things technology now have a new source of demand generation: regulatory mandate. One such enterprise is Daintree Networks, whose core competencies in energy management software and wireless networking gave it entry into the energy management market. The initial focus was enabling companies to manage the energy consumed by lighting; the company later expanded into thermostat control and now provides electrical plug-load control, fan control, carbon dioxide sensing, and monitoring of other environmental conditions.

The business model centers on connecting devices via wireless networking and managing their operation with software. Corporate customers can thus control and manage devices from lighting to thermostats, along with everything else connected, to achieve a significant return on investment through vastly improved energy efficiency. For example, one customer achieved a 94 percent savings in energy use over a 12-month period and a one-year payback on its initial investment, according to Mandeep Khera, vice president of marketing and channels at Daintree Networks.

The popular misconception that increasing energy production will solve the nation’s energy requirements has had to confront the reality that new energy development has stopped growing. The objective of government directives such as Title 24 is to eliminate the need for new energy through conservation. “Today, commercial buildings in California account for 37 percent of primary energy usage—much of which is wasted,” according to the white paper “Untapped Potential of Commercial Buildings Energy Use and Emissions” published by Next10.org. Title 24 aims to have all buildings in California realize this untapped conservation potential.

How would a commercial building be configured to meet Title 24 requirements? All the building’s lighting would be under the control of occupancy sensors that detect when a space is occupied or empty. These sensors communicate with a wireless access controller installed in the building’s ceiling. Daintree software routes the wireless sensor data through the controller to a server, where the software controls lighting (including dimming, daylight harvesting, and occupancy-based on/off), thermostats, and other sensors such as carbon dioxide sensors and smoke detectors, Khera explains. Based on rules established by the building administrator, the server adjusts lights, air conditioning and heating, plug loads, and so on to maintain target levels of ambient illumination, building temperature, and electrical power consumption.
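
The occupancy and daylight-harvesting rules described above can be sketched as a simple server-side policy. This is not Daintree’s actual software, and the 500-lux target is an illustrative assumption; the sketch shows only how two Title 24-style rules combine into one dimming decision per zone.

```python
from dataclasses import dataclass

@dataclass
class Zone:
    occupied: bool       # from the ceiling occupancy sensor
    daylight_lux: float  # from the photosensor

def lighting_level(zone: Zone, target_lux: float = 500.0) -> float:
    """Return a dimming level from 0.0 (off) to 1.0 (full) per two rules:
    lights off when the zone is unoccupied; otherwise daylight-harvest by
    supplying only the shortfall between available daylight and the target."""
    if not zone.occupied:
        return 0.0
    shortfall = max(0.0, target_lux - zone.daylight_lux)
    return shortfall / target_lux

print(lighting_level(Zone(occupied=False, daylight_lux=0)))   # 0.0: empty room, lights off
print(lighting_level(Zone(occupied=True, daylight_lux=0)))    # 1.0: dark interior office
print(lighting_level(Zone(occupied=True, daylight_lux=350)))  # 0.3: sunlit perimeter office
```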

Zigbee, built on IEEE 802.15.4, is the most prevalent standard in commercial building management for connecting devices to one another and allowing systems to manage those devices, Khera declares. The ecosystem for Zigbee is larger than that of any other wireless connectivity standard in the commercial building automation space. Automating the lighting in a commercial building is accomplished by adding a small Zigbee wireless adapter to an LED light for granular control, or at the circuit level for zonal control. For new construction or fixture replacement, the game-changing trend is to embed the wireless interface directly into the LED driver that controls the fixture, as LG did recently with support from Daintree. This eliminates the need for an adapter to convert fixtures into wireless devices, resulting in significant cost savings. Some devices, such as thermostats, come with Zigbee embedded. The Daintree solution is driven by open-standard communications, however, and could also accommodate emerging standards such as Wi-Fi in the future if necessary.

The Daintree solution also integrates with OpenADR (Automated Demand Response), a communications standard created by Lawrence Berkeley National Laboratory with funding from the California Energy Commission’s Public Interest Energy Research (PIER) program. OpenADR allows utilities to manage supply during peak times, and participating companies benefit from significant utility rebates. In the process, less energy is wasted and less new energy production is required, a wave that companies such as Daintree Networks aim to ride. Daintree’s OpenADR-certified solution can use lighting, HVAC, and plug-load control strategies to respond automatically to an OpenADR event, ensuring easy Title 24 compliance.

Building controls and energy management are the core of the Enterprise Internet of Things. Ultimately, using open standards like Zigbee will enable all machines in commercial buildings to connect and communicate with one another to create a truly intelligent building.

Convergence and Security will Drive Internet-of-Things Proliferation

By: Jonah McLeod, Dir. of Corp. Mkt. Comm. at Kilopass Technology Inc.

Tony Massimini has been digging into the Internet of Things (IoT) and has come up with some interesting findings. Semico Research released two reports in January this year, “What Does the Internet of Things Need to Grow?” and “The Internet of Things, Augmented Reality, and Sensor Fusion,” detailing what he has learned. You can also get the latest at the Semico IMPACT event on April 23rd at the Biltmore Hotel in Santa Clara, California. In describing the problems confronting the potentially huge market opportunity everyone keeps referring to as the IoT, he cited two obstacles that need to be hurdled: the lack of a unifying platform to bring a number of divergent solutions together, and security.

The IoT is actually a collection of siloed solutions, he noted: industrial control, personal electronics, home automation, and so on. For example, industrial control, which began as a wired means of linking processing equipment for food, plastic, or metal casting with production-line conveyors, machine doors, part loading, and the like, has numerous communication schemes: CANopen, DeviceNet, FOUNDATION Fieldbus, Interbus-S, LonWorks, Profibus-DP, and SDS. Home automation—the scheduling and automatic operation of water sprinkling, heating and air conditioning, window coverings, security systems, lighting, etc.—is being fought over by wireless solutions including Wi-Fi, Zigbee, Z-Wave, and Bluetooth, as well as wired solutions including HomePlug (over AC wiring) and HomePNA (over phone lines).

Massimini believes the unifying force bringing these disparate communications schemes together is Internet Protocol version 6 (IPv6), the latest version of the Internet Protocol (IP). The protocol provides an identification and location system for computers on networks and routes traffic across the Internet. IPv6 was developed by the Internet Engineering Task Force (IETF) to deal with the long-anticipated problem of IPv4 address exhaustion. If Cisco’s estimate of 25 billion devices connected to the Internet by 2015 and 50 billion by 2020 proves accurate, IPv6 is not a minute too soon. As to how these billions of IoT devices will communicate, Massimini cites the emergence of reference designs from chip vendors including Qualcomm, Broadcom, TI, Freescale, ST, and others that provide intelligent gateways to bring all these devices together and route IPv6 traffic wherever it needs to go on the network.
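
A quick back-of-the-envelope check shows why address exhaustion makes IPv6 unavoidable at IoT scale: IPv4’s 32-bit space cannot even give each of Cisco’s projected 50 billion devices one address, while IPv6’s 128-bit space leaves room to spare by dozens of orders of magnitude. Python’s standard ipaddress module makes the arithmetic concrete.

```python
import ipaddress

ipv4_space = 2 ** 32    # ~4.3 billion addresses in total
ipv6_space = 2 ** 128   # ~3.4e38 addresses in total

devices_2020 = 50_000_000_000  # Cisco's 50 billion projection

# IPv4 cannot assign every projected device its own address:
print(devices_2020 > ipv4_space)         # True

# IPv6 could hand each device an astronomical block of addresses:
print(ipv6_space // devices_2020 > 10 ** 27)  # True

# Even one standard /64 subnet holds 2**64 host addresses:
subnet = ipaddress.ip_network("2001:db8::/64")
print(subnet.num_addresses == 2 ** 64)   # True
```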

Once that problem is solved, Massimini sees an even greater one rearing its head: security. The cautionary tale he uses to illustrate the danger is the hacking attack that exposed a gaping hole in retailer Target’s network security. The attack originated from Fazio Mechanical Services (FMS), a Sharpsburg, Pa.-based heating, ventilation, and air conditioning (HVAC) contractor hired by Target not only for HVAC installation and maintenance but also to monitor and control the environment within Target’s retail outlets. The HVAC system can be accessed via an IP address. Somehow the hackers acquired from FMS the credentials required to access the Target network connecting point-of-sale (POS) terminals, and they were able to plant malware that copied every credit card transaction in the POS terminal where it was collected and transmitted the information to servers at different locations around the globe.

According to the Symantec white paper “A Special Report on Attacks on Point of Sales Systems,” this is not an uncommon occurrence: the software to pull it off is readily available on the web, and such incidents are not new; the first happened in 2005, when 170 million card numbers were stolen. Since the POS system cannot be network-segmented from other networks, Massimini says the emerging solution is the replacement of magnetic-stripe credit and debit cards with smart cards like those used in Europe, which employ the Europay, MasterCard and Visa (EMV) set of standards for card payments. EMV employs an embedded processor with strong transaction-security features to protect card data.

Massimini says this lesson hasn’t been lost on the OEMs building intelligent IoT gateways and devices, which are incorporating crypto engines of their own design into the microcontrollers controlling these products. This additional security may be late in coming, as attacks are already beginning to occur in the home, according to Proofpoint, Inc. The security-as-a-service provider based in Sunnyvale, Calif. claimed to have discovered the first proven Internet of Things (IoT)-based cyberattack. The company’s press release reported 750,000 malicious email communications coming from more than 100,000 everyday consumer gadgets—home-networking routers, connected multimedia centers, televisions, and at least one refrigerator—that had been used to launch attacks.

Implementing more layers of security in end devices and the gateways they connect to will be costly. The commercial segments are most likely to accept this cost, since there is an immediate benefit to the bottom line. Providing more security for consumer devices is more problematic; the intelligent gateways in the home will need to be the first line of defense, and keeping these security measures up to date will be another business service.

The Internet of Things is just the latest incarnation in the evolution of computers and communications. As the consumer demand grows for the benefits provided by smart connected devices, hardware and software vendors will build the affordable secure devices these consumers will buy.