Chip Design Magazine



Posts Tagged ‘data’

More space for satellites and a roadmap for data protection

Monday, February 12th, 2018

Blog Review – Monday, February 12, 2018
This week’s selection includes 100G Ethernet for data centers, satellites vying for space, a roadmap for data protection, and more from the blogosphere.

The rise of data centers and the increase in cloud-based computing have prompted Lance Looper, Silicon Labs, to examine how wireless networks are changing to meet demands for performance and low latency by implementing 100G Ethernet.

Marveling at how connectivity has ‘shrunk’ the world, Paolo Colombo, ANSYS, looks skyward to consider the growth of connected devices. He examines the role of space satellites, how small satellites will have their day for critical applications, and introduces ‘pseudo sats’, which are vying for space in space.

An article about medical device design and manufacturing challenges has prompted Roger Mazzella, Qt, to address each challenge and provide a reassuring response for developers. Naturally, Qt’s products play a role in allaying many fears, but it is an interesting insight into the medical design arena.

An interesting case study is recorded by Hellen Norman, Arm, featuring Scratchy the robot. She asks German embedded systems developer Sebastian Förster how he used a Cortex-M4, some motors, Lego bricks and cable ties to create a four-legged robot, programmed to walk using artificial intelligence (AI).

It’s not unusual to feel bewildered at a technology conference, so we can sympathise with Thomas Hackett, Cadence, who has a twist on the usual philosophical question of “What am I here for?” A walk through DesignCon caused a lightbulb moment, illuminating the real world interplay of IP, SoC and packaging.

With the IoT there are no secrets, and Robert Vamosi, Synopsys, examines how data sharing may not be as innocuous as companies would have us believe if it is not configured flawlessly. The Strava heatmap, which revealed secret military locations, has thrown up some serious issues which, we are assured, are being addressed, and which Vamosi sees as a cautionary model for other IoT and wearable device manufacturers.

Tackling software-defined networking (SDN) head-on, Jean-Marie Brunet, Mentor Graphics, presents a clear and strong case for accelerating verification using virtual emulation. Of course he advocates Veloce VirtuaLAB PCIe for the task, but backs up his recommendation with some sound reasoning and guidance.

By Caroline Hayes, Senior Editor

Will Moore’s and Metcalfe’s Laws Cross the IOT Chasm?

Sunday, April 30th, 2017

The success of the IoT may depend more on a viable customer experience than on the convergence of the semiconductor and communication worlds.

By John Blyler, Editor, IOT Embedded Systems

The Internet of Things will involve a huge number of embedded devices reporting back to data aggregators running servers on the cloud. Low cost and low power sensors, cameras and other sources will allow the IOT to render the real world into a digital format. All of these “things” will be connected together via the Internet, which will open up new business models and services for customers and users. It should greatly expand the human–machine experience.

The key differentiator between the emerging IoT and traditional embedded systems is connectivity. The IoT will conceivably connect all embedded things together. The result will be an almost inconceivable amount of data from sensors, cameras and the like, which will be transferred to the cloud for major computation and analysis.

Connectivity means IoT platforms will have a huge data side. Experts predict that the big data industry will grow to about US$54.3 billion by 2017. But the dark side of connectivity is the proliferation of hacking and privacy lapses caused by poor security.

Security is an issue for users as well as for device developers. Since most IoT devices are resource constrained, designers cannot deploy resource-intensive security protection mechanisms. They are further constrained by the low cost of mass-produced devices.

Another challenge is that most software developers are not particularly security or reliability conscious. They lack training in the use of security testing, encryption, etc. Their code is often neither designed nor programmed in a defensive fashion.

Finally, since IoT devices will be designed and deployed on a massive scale, security attacks and failures can propagate easily. Software security patches will frequently be needed, but these must be designed for early in the development life cycle of both the hardware (think architecture and memory) and the software.

Moore-Metcalf and the Chasm

Connectivity, security and data analysis will make IoT devices far more complex than traditional embedded systems. This complexity in design and product acceptance can be illustrated by the confluence of two laws and a marketing chasm. Let’s consider each separately.

First, there is Moore’s Law. In 1965, Intel co-founder Gordon Moore predicted that the transistor density (related to performance) of microprocessors would double every 2 years (see Figure 1). While “doubling every 2 years” describes an exponential curve, Moore’s growth function is almost always represented as a straight line ― complemented by a logarithmic scale on the Y-axis.

Figure 1: Moore’s Law (courtesy of Mentor Graphics, Semi Pacific NW, 2015)
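The doubling rule can be sketched in a few lines of Python. This is only a toy illustration of the math; the 1971 Intel 4004 figure of roughly 2,300 transistors is used purely as a familiar starting point, not as data from this article.

```python
import math

def transistors(year, base_year=1971, base_count=2300, doubling_period=2):
    """Projected transistor count under a strict 2-year doubling (toy model)."""
    return base_count * 2 ** ((year - base_year) / doubling_period)

# On a logarithmic Y-axis the exponential becomes a straight line:
# log2(count) increases by exactly 1 for every doubling period.
for year in (1971, 1981, 1991, 2001):
    print(year, round(math.log2(transistors(year)), 1))
```

Each 2-year step adds exactly 1.0 to the base-2 logarithm, which is why the plot in Figure 1 looks linear.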

Several years later, another technology pioneer, 3Com co-founder Bob Metcalfe, stated that the value of a network grows with the square of the number of network nodes (or devices, or applications, or users, etc.), while the costs follow a more or less linear function. Not surprisingly, this relationship is shown as a network connection diagram. For example, 2 mobile devices will only be able to communicate with each other. However, with billions of connected devices and applications, connection complexity rises considerably (see Figure 2).

Figure 2: Metcalfe’s Law.
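Metcalfe’s observation can be written down directly: the number of distinct pairwise links among n nodes is n(n-1)/2, which grows with the square of n, while cost grows roughly linearly. The constants below are arbitrary placeholders, not figures from the article.

```python
def metcalfe_value(n, k=1.0):
    # Distinct pairwise connections among n nodes: n * (n - 1) / 2, i.e. ~n^2.
    return k * n * (n - 1) / 2

def network_cost(n, cost_per_node=1.0):
    # Cost is modeled as roughly linear in the number of nodes.
    return cost_per_node * n

# 2 devices give exactly 1 possible link; 1,000 devices give 499,500.
```

The gap between the quadratic value curve and the linear cost curve is what makes large networks disproportionately valuable.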

Metcalfe’s Law is really about network growth rather than about technological innovation. Blogger Marc Jadoul recently noted on the Nokia website that “the combination of Moore’s and Metcalfe’s principles explains the evolution of communication networks and services, as well as the rise of the Internet of Things. The current IoT growth is enabled by hardware miniaturization, decreasing sensor costs, and ubiquitous wireless access capabilities that are empowering an explosive number of smart devices and applications…”

Jadoul realizes that the availability of state-of-the-art technology does not always guarantee success, citing the struggling growth of two main IoT “killer” consumer devices and apps, namely, smart watches and connected thermostats. The latter is also notorious for its security issues.

He explains this slow adoption by considering the “chasm.” Geoffrey A. Moore wrote about the gap that product marketers have to bridge for a new technology to go mainstream. Jadoul then combines these three charts, admitting the inaccuracies caused by different axes and scales, to observe that the chasm is actually the point where the shift from a technology-driven model to a value- and customer-experience-driven business needs to take place (see Figure 3).

Figure 3: Intersection of Gordon Moore’s Law, Metcalfe’s Law and Geoffrey Moore’s “Chasm.” (Courtesy of Marc Jadoul’s blog.)

This line of reasoning highlights the key differentiator of the IoT, i.e., the connectivity of embedded semiconductor devices. But the success of the IoT may depend more on a viable customer experience than on the convergence of computational and communication technologies.

Blame It on the Automobile

Wednesday, December 11th, 2013

By Dave Bursky, Technology Editor

Major ARM TechCon keynotes focus on societal shifts, defining IoT and little data’s big impact.

ARM CEO Simon Segars

At the recently held ARM TechCon, a keynote presented by ARM CEO Simon Segars underscored how the automobile was responsible for societal shifts at the beginning of the 20th century. The automobile, he explained, caused the creation of thousands of allied industries while changing the way people interacted with each other. He then looked at how similar disruptions are happening in today’s society:  “Today, the smartphone is doing the same thing. With over one billion smartphones already in use and another billion or so expected by the end of 2014, we are seeing a change in the way people interact. Additionally, there is a lot of innovation in mobile phones, thanks to the availability of highly integrated, low-cost chips. And, just as with the automobile, thousands of allied companies have sprung up that provide personalization with everything from screen protectors and protective cases to external battery packs and lost-phone finders.”

We are on the brink of a new era of innovation. Silicon devices, manufacturing, and software are all less expensive. Using the “cloud” for online computing also is less expensive—and readily accessible as well. Some people feel that innovation is slowing down. But that’s not the case. The car is a good example, as we have seen so many innovations in manufacturing, design, etc. Over the past 100 or so years, the car has seen a Darwinian evolution. In 1899, there were about 30 car manufacturers; the field then exploded to several hundred vendors, followed by consolidation. Today, we see Tesla, Zipcar, and other innovations shaping both cars and usage scenarios.

It feels like mobile devices have been with us for a long time, but smart mobile devices and the connectivity they provide are still changing the way we interact—today’s seismic shift.  There is already a growing network of devices for which the phone will be the central hub, acting as a controller—the focal point for all interaction. Thus, the Internet is evolving into the “Internet of Things” (IoT) with many interconnected devices, which will drive new products, services, and opportunities.

A major side effect of this connectivity is data—an explosion of data. The consumption and generation of data have been increasing at a staggering rate. For example, mobile-data usage has grown to over 900 petabytes per month and is still growing. The network must change to handle such growth. Companies are developing a wide range of solutions to meet multiple needs, thus creating a very diverse network. To get this technology into the hands of people in emerging economies, however, such approaches require a lot of collaboration, a major increase in bandwidth, lower network latency, and lower-cost smartphones.

The term “Internet of Things” has been around since 1999. The basic definition of an IoT device includes a sensor, processor, and connectivity. A joint study done by ARM and The Economist investigated how the industry perceives the IoT and how it is being put to work. The study, which is available on the ARM website, revealed that the IoT is not just about things. As a result, skill development must not be an afterthought. In addition, greater cooperation, which is critical for solutions, will emerge. The study also found that roughly three-quarters of companies are exploring the IoT for internal or external use in a wide variety of business areas: industrial, energy management, transportation, healthcare, etc. Companies are also working on full end-to-end solutions, amalgamating lots of little bits of data into big data. That is changing server workloads and the way servers are designed.

John Cornish, EVP and GM, ARM's System Design Division

Big data starts with little data. In his keynote at ARM TechCon, John Cornish, Executive Vice President and General Manager of ARM’s System Design Division, examined how the long tail of little data drives big data value. Little things are going to make a big difference. All of the pieces are starting to come together to connect the physical and digital worlds, explained Cornish—low-cost silicon for sensing and computing, low-power wireless networks for communications (ZigBee and Bluetooth LE), and cloud-based computing and storage for collecting and processing data. To that end, ARM has a major role in the IoT market, with its 32-bit processor cores seeing a 40% CAGR as more and more embedded devices hit the market.

The “little” data from all of the IoT nodes is aggregated to form big data. From there, value can be extracted to help analyze the consumer by answering questions like, “When did I buy it?”, “Where did I buy it?”, “When do I use it?”, “Where do I use it?”, “What do I do with it?”, “Who do I use it with?”, “Who did I tell about it?”, etc. Large-scale data analytics applied to the collected data allows vendors to better direct their promotions and product developments to match the consumer’s lifestyle.

Already, over 1.9 billion Cortex-M series cores have been shipped in systems-on-a-chip (SoCs) by leading semiconductor vendors at prices starting as low as $0.50. To better enable IoT device development, ARM’s mbed platform lets developers connect ARM Cortex-M0 processors to the cloud. The mbed project will enable IoT device creation on a massive scale, consolidating fundamental embedded building blocks and simplifying integration with enterprise systems. Additionally, open standards will be incredibly important. Those standards will enable interoperability, allowing OEMs, silicon partners, software developers, and middleware vendors to standardize around the same set of protocols to ensure that everything “just works.” Through the acquisition of Sensinode, ARM has opened up to many standards to enable efficient and secure communications from devices to the cloud. Some of those standards include 6LoWPAN, ZigBee, CoAP, TLS, and OMA Lightweight M2M.

However, we still have to provide the resources that developers need in order for them to experiment, innovate, and explore this huge solutions space. I look at this like a Monte Carlo analysis, in which a large number of calculations are done in parallel to explore a very large space (see, “Gambling on the Design of IoT”). The results show what works and what solutions really resonate with consumers. Lastly, we have to address the need for security and trust at a technical level. People are concerned with privacy. But they’re also willing to share information where they perceive value.
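The Monte Carlo analogy above can be sketched as a toy random search over a made-up design space. The parameter names and budget values below are purely illustrative, not from the article: many random candidate designs are tried, and the ones that survive the constraints show what part of the space "works."

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def explore(trials=10000, power_budget_mw=50.0, cost_budget_usd=1.0):
    """Sample random (power, cost) candidates; keep those meeting the budgets."""
    hits = []
    for _ in range(trials):
        power = random.uniform(0.0, 200.0)  # hypothetical power draw, mW
        cost = random.uniform(0.0, 5.0)     # hypothetical unit cost, USD
        if power <= power_budget_mw and cost <= cost_budget_usd:
            hits.append((power, cost))
    return hits

feasible = explore()
# Roughly (50/200) * (1/5) = 5% of random candidates land in the feasible region.
```

The surviving candidates are the analogue of "what solutions really resonate": the bulk of trials fail, but the cheap parallelism of trying many makes the search worthwhile.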

Sensors and Algorithms Challenge IoT Developers

Tuesday, December 10th, 2013

By John Blyler, Content Officer

Challenges abound as designers deal with the analog nature of sensors, IP issues and the new algorithms required by the IoT.

Sensors represent both the great enabler and a unique challenge for the evolving Internet of Things (IoT). Innovation in the market will come from surprising places. These are just a few of the observations shared by ARM’s Willard Tu, Director of Embedded, and Diya Soubra, CPU Product Manager. “System Engineered Design” caught up with them during the recent ARM Tech Con. What follows is a portion of that conversation. – JB

Blyler: Everyone talks about the importance of sensors to enable the Internet of Things (IoT) but few seem to appreciate what that means.  Would you elaborate?

Tu: Sensors are one of our key initiatives, especially from a microcontroller viewpoint (more on that shortly). But there is another aspect to sensors that both designers and even companies overlook, namely, the algorithms for processing the sensor data. These algorithms, from companies like Hillcrest, bring a unique value to the IoT market. And the algorithm software represents a real intellectual property (IP). I think people are missing out on the IP that is being created there.

Blyler: So you think that most people overlook the IP aspects and simply focus on the processing challenges needed to condition analog sensor signals into a digital output?

Tu: Processing power is critical, which is where distributed local and cloud computing comes in. But there are many other factors, such as energy harvesting to power sensors in areas you never thought of before. Both body-area and mesh network communication challenges are another factor. Conversely, one enabler of sensors is their low cost. Ten years ago, an accelerometer was a really expensive piece of silicon for an automotive airbag system. Now, they are everywhere, even in cell phones, which are very cost sensitive.

Blyler: Is this volume cost decrease due to innovation in MEMS design and manufacturing?

Tu: Yes, the MEMS market has evolved immensely (see Figure 1) and that’s the reason. And I think there is still a lot of evolution there. You see a lot of newcomers with MEMS applications, but I think you’ll see a lot of consolidation because only the strong will survive.

Figure 1: MEMS market trends as presented by Jeremie Bouchaud of IHS at the MEMS Congress, 2013.

Soubra: Another factor is that few vendors use only one sensor; most use many. A common example is multi-sensor accelerometers (see Figure 2): one sensor gives you a good pitch, the others give you yaw and roll. So you will always need three or four sensors, which means that you have to have software to handle all of them.

Figure 2: Device Orientation – 6 Degrees of Freedom. (Courtesy of Hillcrest)
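As a minimal sketch of the kind of software Soubra describes, here is a generic complementary filter for pitch. This is a textbook fusion technique, not Hillcrest’s actual algorithm; the function names and the 0.98 blend factor are illustrative choices.

```python
import math

def accel_pitch(ax, ay, az):
    """Pitch (degrees) from the gravity vector, valid when the device is at rest."""
    return math.degrees(math.atan2(-ax, math.hypot(ay, az)))

def fuse_pitch(prev_pitch_deg, gyro_rate_dps, accel_pitch_deg, dt_s, alpha=0.98):
    """Blend an integrated gyro estimate (smooth but drifting) with the
    accelerometer estimate (noisy but drift-free)."""
    gyro_pitch = prev_pitch_deg + gyro_rate_dps * dt_s  # integrate angular rate
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch_deg
```

Run in a loop at each sample interval, the gyro term tracks fast motion while the small accelerometer weight slowly pulls the estimate back toward the true gravity-referenced pitch.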

Blyler: Do you mean software to control and integrate data from the various sensors or software algorithms to deal with the resulting data?

Tu: Both – Software is needed to control and ensure the accuracy of the sensors. But developers are also doing more contextual awareness and predictive analysis. By contextual, I mean that a smartphone turns on when it’s being held next to my head. Predictive refers to what I’ll do next, i.e., having the software anticipate my next actions. Algorithms enable those capabilities.

This is the next evolution in handling the data. You can use sensor fusion (sensors plus processors) to create contextual awareness.  That’s what people are doing today.  But how does that evolve into predictive algorithms? Anticipating what you want is even more complex than contextual awareness. It’s like using Apple’s Siri to anticipate when you are hungry and then order for you. Another example is monitoring a person’s glucose level to determine if they are hungry – because their glucose levels have dropped. It could be very intuitive or predictive down the road.

Blyler: These smart algorithms are another reason why processing power is a key enabler in the IoT evolution.

Tu: What you really need is scalable processing power. Sensors require a microcontroller, something with analog inputs. But there are still lots of designers who ask, “Why do you need to integrate the microcontroller with the sensor? It’s just an accelerometer.” They seem to forget that data acquisition is an analog process. The sensor data that is acquired must be conditioned and digitized to be useful in contextual or predictive applications. And that requires lots of processing.

Another thing designers forget about is calibration (see Figure 3). Calibration is a big deal to get the accuracy necessary for all the contextual awareness applications. Calibration of the MEMS device is only part of the issue. The device must be recalibrated as part of the larger system once it is soldered and packaged to a board, to deal with temperature effects (of the solder) and flexing of the board. All of these things play a part in the system-level calibration.

Figure 3: Proper interpretation and calibration of the sensor data is critical. The performance of the core fusion algorithm depends on the quality of the input data. (Courtesy of Hillcrest)
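One common form of the system-level correction Tu describes is a two-point calibration: measure the sensor at two known reference conditions after board assembly, then derive a gain and offset. The sketch below is generic, and the raw readings (510 and 768 counts for 0 g and 1 g) are invented for illustration.

```python
def two_point_calibration(raw_lo, raw_hi, true_lo, true_hi):
    """Derive linear gain/offset from two reference readings taken in-system."""
    gain = (true_hi - true_lo) / (raw_hi - raw_lo)
    offset = true_lo - gain * raw_lo
    return gain, offset

def correct(raw, gain, offset):
    """Map a raw sensor sample to calibrated engineering units."""
    return gain * raw + offset

# Hypothetical post-soldering readings: 510 counts at 0 g, 768 counts at 1 g.
gain, offset = two_point_calibration(510, 768, 0.0, 1.0)
```

Because the correction is derived after the device is mounted, it absorbs the board-level shifts (solder stress, flex) that per-die factory calibration cannot see; temperature effects usually require repeating this at several temperatures.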

You might think that, well, the sensor guys should do that. But the sensor guys are good at making a MEMS device. Some MEMS manufacturers are vertically integrating to handle calibration issues, but others just want to make the device. This is another area where innovative IP can grow, i.e., around the calibration of the MEMS device to the system.

Blyler: Where will innovation come from as the IoT evolves?

Tu: I think the ecosystem is where innovation will emerge. Part of this will come from taking applications developed in one area and applying them to another. Recently, I talked to several automotive developers. They admitted that they lack expertise in developing certain types of algorithms – the same algorithms that companies like Hillcrest have already created for mobile consumer applications. I would like to introduce the automotive market to an algorithm company (like Hillcrest), a sensor platform provider (like Movea) and a few other leaders in the mobile space.

I think you will see IP creation in that space. That is where innovation is coming, by taking that raw sensor data and making it do something useful.

Blyler: Consolidation is occurring throughout the world of semiconductor design and manufacturing, especially at the lower process nodes. Do you see similar consolidation happening in the sensor space?

Tu: Right now there is an explosion of sensor companies, but there will be a consolidation down the road. The question one should ask is whether integration is key to the sensor and IoT space. I don’t know. As a company, ARM would like to see a microcontroller (MCU) next to every sensor or sensor cluster – whether it is directly integrated with the sensor array or not. This is where scalability is important. Processing will need to be distributed: low-power processing near the sensor, with higher-performance processing in the cloud. It is very difficult to put a high-powered, fan-based system in a sensor. It just won’t happen. You have to be very low power near the sensor.

Not only is the sensor node a very power-constrained environment, but it is also resource constrained, e.g., in memory. That’s why embedded memory is critical – be it OTP or flash. In addition to low power, the cost of that memory is actually more of a deciding factor than the CPU.

Blyler: Thank you.
