Gabe Moretti, Senior Editor
During my conversation with Lucio Lanza last month we exchanged observations on how computing power, both at the hardware and the system levels, had progressed since 1968, the year we both started working full time in the electronics field, and on how much remains to be done to achieve knowledge-based computing.
Observing what computers do, we recognized that computers have become very good at collecting and processing data. In fact, the concept of the IoT is based on the capability to collect and process a variety of data types in a distributed manner. The plan, of course, is to turn the data into information.
There are certainly examples of computer systems that process data and generate information. Financial systems, for example, produce profit and loss reports from income and expense data, and logic simulators describe the behavior of a system in terms of its input and output values. But information is not knowledge.
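The first of those examples, turning raw financial data into information, can be sketched in a few lines. This is a minimal illustration, not a real financial system; the record format, category names, and amounts are all assumptions made for the example.

```python
# Minimal sketch of the data-to-information step: raw (category, amount)
# records are summarized into a profit and loss report. The record format
# and the numbers are illustrative assumptions, not a real accounting schema.

def profit_and_loss(transactions):
    """Summarize (kind, amount) records into income, expenses, and net."""
    income = sum(amount for kind, amount in transactions if kind == "income")
    expenses = sum(amount for kind, amount in transactions if kind == "expense")
    return {"income": income, "expenses": expenses, "net": income - expenses}

# Hypothetical raw data: individual transactions collected over a period.
records = [
    ("income", 12000.0),   # product sales
    ("income", 3000.0),    # services
    ("expense", 7500.0),   # payroll
    ("expense", 1200.0),   # tooling
]

report = profit_and_loss(records)
print(report)  # {'income': 15000.0, 'expenses': 8700.0, 'net': 6300.0}
```

The point of the sketch is the distinction drawn above: the program mechanically aggregates data into information, but it has no understanding of what a loss means or what to do about one.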
Knowledge requires understanding, and computers do not understand information. Achieving knowledge requires correlating and synthesizing information, sometimes from disparate sources, in order to generate understanding of that information and thus abstract knowledge. Knowledge is also a derivative of previous knowledge and cannot always be generated simply by digesting the information presented. In a learning process, the human brain associates the information to be processed with knowledge it already holds, assigning the proper value and quality to that information in order to construct the value judgments and associative analysis that generate new knowledge. What follows are some of my thoughts since that dialogue.
As an aside, I note that unfortunately many education systems give students information but not the tools to generate knowledge. Students are taught to give the correct answer to a question, but not to derive the correct answer from a set of information and their own existing knowledge.
We have not yet been able to recreate the knowledge-generating process with a computer, but nothing says that it will not be possible in the future. As computing power increases and new computer architectures are created, I am confident we will be able to automate the generation of knowledge. As far as EDA is concerned, I will offer just one example.
A knowledge-based EDA tool would develop an SoC from a set of specifications. Clearly, if the specifications are erroneous the resulting SoC will behave differently than expected, but even this eventuality would help improve the next design, because it would provide information to those who develop a knowledge-based verification system.
When we achieve this goal, humanity will finally be free to dedicate its intellectual and emotional resources to the problems that derive directly from what humans are, and to prevent most of them instead of having to solve them.
At this moment I still see a great deal of indeterminacy with respect to the most popular topic in our field: the IoT. Are we on the right track in developing the IoT? Are we creating the right environment to learn to handle information in a knowledge-generating manner? To even attempt such a task we need to solve not just technical problems, but financial, behavioral, and political issues as well. The communication links needed to arrive at a solution are either non-existent or weak. Just think of the existing debate regarding a “free internet”. Financial requirements demand a redefinition of the internet: from a communication mechanism akin to a utility to a special-purpose device used in selective manners defined by price. How would a hierarchical internet defined by financial parameters modify the IoT architecture? Politicians and some business interests do not seem to care, and engineers will have to patch things up later. In this case we do not have knowledge.