“The problems that exist in the world today cannot be solved by the level of thinking that created them.” — Albert Einstein
I have been writing a series of articles, “Are We on the Edge of Chaos”. The last article dealt with how power is shifting to city governments and their need to change in order to manage the new technology paradigms that now appear as quickly as every ten years. The most significant new technology paradigm — Artificial Intelligence (AI), Cloud Computing and the Internet of Things (IoT) — has caused information to grow at an exponential rate. That growth rate challenges our natural ability to understand and absorb the information deluge. For example:
· CB Insights estimates that by 2024, 149 zettabytes of data (149 trillion gigabytes) will be created globally each year.
· The Internet of Things … is projected to reach 64 billion devices by 2025.
· The newest version of OpenAI’s GPT, GPT-3, was released in June 2020 and produces approximately 4.5 billion words per day, roughly equivalent to the average daily output of 4.5 million bloggers.
One sector of the economy that is particularly challenged by exponential growth in information is the education system — from pre-K to university — which is the topic of this article. A previous article on education, The 21st Century Renaissance — The Education Century, dealt with curriculum in detail. This article focuses on the new purpose that education must serve and the basic change in philosophy required to serve students in this 21st century of exponential growth in new information.
If we examine information, we find that information lies on a continuum defined by sensory data on the left (representing the beginning of a time series) and knowledge on the right. The diagram below shows how the sensory data is processed to become knowledge. The reduction of uncertainty may not be as linear as shown, but the progress over time is correct.
Data is received through sensory perception. In the interest of energy conservation, the brain is programmed to reject most inputs and to capture only the information that enriches the person’s mental model of reality. If that information proves true and has predictive value, it becomes knowledge. Why is predictive value so important to the human brain? Prediction is how we manage uncertainty, and life is devoted to managing uncertainty. This is a more profound point than it might first appear. Jessica Flack, a researcher at the Santa Fe Institute, states that “individuality is about temporal uncertainty reduction.” Quantum mechanics, for example, shows us that the position of the most basic particles is no more than a statement about their probability; our perception of reality is inherently a statement about the “expected value”, or probability, of matter.

Why do humans explore rather than assume the status quo? Exploration — hypothesize, test, update the hypothesis, test again — allows us to deal in a more controlled way with the inherent uncertainty of reality, and this measured exploration enhances our ability to survive. Assuming an outcome (the status quo bias) is too risky a behavior. This is a second way to understand uncertainty.

A third way is to think about genetic mutations. Mutations are stochastic outcomes and inherently uncertain. If a mutation does not enhance species survivability, it is not passed on to the next generation. If it improves survivability, it spreads through a population of sufficient size and moves from mutation to defining feature of the species. Evolution is driven by the uncertainty of mutations.
This grounding in uncertainty sets the stage for a discussion of entropy, and more specifically entropy in information, or Shannon entropy. Entropy can be understood simply as the uncertainty across possible states, or, as computational physicist Sharon Glotzer puts it, “the options … the number of ways you have of doing something”. As information increases — and in particular the noise rather than the signal — the entropy, or uncertainty, increases. Signal has purpose, for example to transfer digital music. Noise generally has no benefit, except possibly to create the “mutation” that may eventually become purposeful. (A spontaneous event, or mutation, may lead to the scientific discovery, the world-changing invention or the new way of painting.) Noise may also just be noise.

As we look to a future with new technology paradigms every ten years, information may double in hours, according to IBM. Separating signal from noise becomes a paramount intellectual challenge, and that separation will be necessary to find the new concepts and theories needed to solve the major problems of the 21st century — the environment and wealth inequality. Memorization and recital of facts as the basis of education no longer serve society’s purpose. This 19th-century model of education needs to be updated to prepare students to use the new tools made possible by the 21st-century technology paradigms. The horse, the buggy and newspapers are nearly gone. Now we urgently need a new philosophy, new objectives and new methods in education in order to benefit from the rapid increase in information and knowledge. Education needs to be redefined so that the average citizen has some chance to understand the new science, technology and applied mathematics that will dominate the 21st century.
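To make Shannon entropy concrete, here is a minimal sketch of my own (not drawn from any source cited above). It computes H = -Σ p·log2(p) and shows that uncertainty grows with the number of equally likely states, Glotzer’s “number of ways you have of doing something”:

```python
import math

def shannon_entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p)) over the nonzero probabilities."""
    h = -sum(p * math.log2(p) for p in probabilities if p > 0)
    return h + 0.0  # adding 0.0 turns IEEE -0.0 into a plain 0.0

# A certain outcome carries no uncertainty.
print(shannon_entropy([1.0]))       # 0.0 bits

# A fair coin: two equally likely states, one bit of uncertainty.
print(shannon_entropy([0.5, 0.5]))  # 1.0 bit

# Eight equally likely states ("more options") mean more entropy.
print(shannon_entropy([1/8] * 8))   # 3.0 bits
```

More options, more uncertainty: doubling the number of equally likely states adds exactly one bit of entropy.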
A recent report by the National Security Commission on Artificial Intelligence, headed by Eric Schmidt, ex-CEO of Google, advocates for a focus on university-level education in AI, but I think that is a short-term solution. Education needs to be redefined at the elementary school level in order to give the greatest number of children the opportunity to adopt the new tools and approaches made possible by the technology. Young students would learn these subjects before they develop a fear of and bias against fields such as quantum mechanics, information theory and complexity theory.
A New Approach to Learning
The observation of the incomparable John Stuart Mill provides guidance that has largely been ignored in education since the early 1800s:
“It often happens that the universal belief of one age of mankind . . . becomes to a subsequent age so palpable an absurdity, that the only difficulty then is to imagine how such a thing can ever have appeared credible.”
Many people now advocate for teaching 21st-century skills (emotional intelligence, collaboration, adaptability, leadership, etc.) or the 4Cs (creativity, critical thinking, collaboration and communication) in response to the demands of the 21st century. This is good advice, but it has been good advice since before the time of Plato and Aristotle. I believe that a transformation in education is required and that it should be based on a view of how humans will add value in this new 21st-century paradigm of AI, Cloud Computing and IoT. In the late 1950s, then-Dartmouth professor John McCarthy defined artificial intelligence; to paraphrase McCarthy, AI is the ability of a digital computer to perform tasks commonly associated with humans. Today the roles are reversed: the future of education is to train children to do what the AI cannot do. Education needs to shift from an orientation of memorizing facts, causality, linear thinking and test-taking to a new orientation focused on risk-free exploration of ideas, an understanding of the natural state of uncertainty, multivariate analysis and the role of invention and innovation in society. Essentially, we need to train humans in how to add value, innovate and invent a survivable future.
Another way to look at the problem is to examine how wealth is created. Historically, wealth was created from land, labor and capital. These inputs were processed in mechanistic, linear systems that ignored the environment, focused on efficiency and profitability and transformed the modern world, beginning in the late 1700s with the First Industrial Revolution. This trend of creating wealth from matter continued until the 1960s, when wealth began to be created digitally. The wealth creation model transitioned to information and capital, and there it remains today. NFX, a Silicon Valley venture capital firm, has done a study showing that 70 percent of the value of all tech firms can be attributed to network effects — communications, platforms, marketplaces, communities. In this environment of network effects, value is created more from innovation around business models and personalized customer experience. Value creation is now more abstract, relying less on the tangible inputs of land and labor. Human-centric design and systems thinking are popular tools for creating these types of abstract value, but these fundamental tools are rarely taught before university. (Some schools have experimented with systems thinking in elementary school, but that would be a subject for another article.)
Another theoretical understanding of value comes from the National Science Foundation (NSF) definition of transformative research.
“Transformative research involves ideas, discoveries, or tools that radically change our understanding of an important existing scientific or engineering concept or educational practice or leads to the creation of a new paradigm or field of science, engineering, or education. Such research challenges current understanding or provides pathways to new frontiers.” NSF 2007
Based on the thinking at the NSF, but not limited to science and engineering, value is the ideas and discoveries that change our understanding and lead to the creation of new paradigms, practices or fields of endeavor. This would be a very high standard for education, but the new technologies are pushing us to perform at a higher level if humans want to be more than simply consumers of Universal Basic Income (UBI).
I believe there are five principles that guide this change in public education, this “new philosophy” to teach students to “create value”. I believe this new philosophy should be implemented as early as elementary school. The five principles are:
1. The Component Nature of Reality
2. Value is in the Theory
3. Learn Faster
5. Multidisciplinary Learning
1. The Component Nature of Reality
“All of this is part of a much larger shift in the very scope of science, from studying what is to what could be. In the 20th century, scientists sought out the building blocks of reality: the molecules, atoms and elementary particles out of which all matter is made; the cells, proteins and genes that make life possible; the bits, algorithms and networks that form the foundation of information and intelligence, both human and artificial. This century, instead, we will begin to explore all there is to be made with these building blocks.” — Quanta Magazine
George Polya, a legendary math professor at Stanford, said that there were three areas of fundamental knowledge: 1) philosophy, 2) math and 3) physics. Extending this logic, we would say that chemistry is derived from physics and biology is derived from chemistry. The particles of quantum mechanics simply bind with energy to form atoms, the atoms bind to form molecules (chemistry) and the molecules bind to form emergent cells, organs, systems and species (biology). All of nature and reality can be described simply as a “LEGO” set of components that randomly produces signal and noise. When a purpose is achieved, we have signal, and the system advances in its capability to process information and survive. When we read about Google’s DeepMind, Jennifer Doudna’s Nobel Prize-winning work on CRISPR or Moderna’s development of a COVID vaccine in two months, much of the underlying research is based on the component nature of reality and “combinatorial” approaches. This component view of reality and these combinatorial approaches to discovery are particularly suited to artificial intelligence. Imagine the components as slots in a Las Vegas slot machine. Spin the wheel to generate all the combinations of components that have a particular set of characteristics expected to serve a particular purpose. Artificial intelligence spins the wheel for literally millions of combinations and selects the limited number that match the solution profile for the problem. Traditional researchers then work with the selection set reduced by AI to validate the hypothesis and, for example, find the vaccine or repurpose an old drug.
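The slot-machine picture can be sketched in a few lines of Python. The component names and scoring table below are entirely hypothetical, a toy stand-in for an AI model’s predictions rather than any real discovery pipeline; the point is only the shape of the method: enumerate every combination, keep the few that match the target profile.

```python
from itertools import product

# Hypothetical component "slots": each position can hold any of these parts.
slots = [
    ["A1", "A2", "A3"],        # e.g. candidate binding sites
    ["B1", "B2"],              # e.g. candidate linkers
    ["C1", "C2", "C3", "C4"],  # e.g. candidate payloads
]

# Hypothetical scores standing in for an AI model's predicted fitness per part.
scores = {"A1": 2, "A2": 5, "A3": 1, "B1": 3, "B2": 4,
          "C1": 1, "C2": 6, "C3": 2, "C4": 5}

def screen(slots, threshold):
    """Spin the wheel: try every combination, keep those that match the profile."""
    shortlist = []
    for combo in product(*slots):  # all 3 * 2 * 4 = 24 combinations
        if sum(scores[part] for part in combo) >= threshold:
            shortlist.append(combo)
    return shortlist

# The screen reduces 24 possibilities to a handful for humans to validate.
print(screen(slots, threshold=14))
```

With millions of combinations instead of 24, this brute-force loop is exactly the step that AI performs economically while traditional researchers handle only the shortlist.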
What these advances in computing tell us is that the old way of teaching biology and chemistry must be updated from a “classification” approach to a combinatorial, component approach. The old logic of teaching “phylum, class, order, etc.” serves little purpose, although it did get us to where we are today. The new approach would have the physics, biology or chemistry student designing and testing combinations of molecules, or using systems thinking to evaluate a student-designed biological “system”. In each case the students would learn whether their hypothesis (exploration) was successful based on feedback from an artificial intelligence system. This is 21st-century experiential learning (in the cloud), and the students never have to leave home. I hear John Stuart Mill cheering us on!
2. Value is in the Theory
“I think changing how people think is the most durable asset that you can create, because when people change how they think, they then produce in so many areas. It’s the ultimate force multiplier.” — Sendhil Mullainathan
For much of the history of science, we had only the tools, such as microscopes and telescopes, to do classification. This classification approach was particularly dominant in biology and medical science. For example, we are still in the early days of understanding proteins and their application in medical science. Advances in complexity science have done much to enrich the theoretical study of biology, but we still do not understand the role of emergence in the development of biological systems.
Brian Arthur, the inventor of complexity economics, says that technologies appear to solve the problems of their times. Applied to components, artificial intelligence permits our pattern-recognition abilities to identify faster what can and cannot be explained. AI manages this volume of possibilities more economically and efficiently than a human can. What cannot be explained marks the frontier for possible new theory.
How to teach the ability to theorize will be a challenge. Of course, all of philosophy, math and physics are based in theory (fundamentals). Theory is the abstract, and when we do science we are looking to bridge between the abstract and the tangible (matter, energy, information). Perhaps we teach quantum mechanics beginning in elementary school and teach it every year, as we do with math. This would reinforce the component nature of reality, show the history of theorizing in physics, establish the abstract-tangible mental model and probably teach applied mathematics along the way. It would be great fun to design the curriculum to train elementary school teachers in quantum mechanics. Many people are put off by the study of quantum mechanics, but perhaps if we approach it as an epistemology it becomes easier to understand while remaining useful.
Note: Philosophy might serve as the material to teach reading beginning at a young age. Excluding all French philosophers except Descartes and Poincaré, and limiting Hegel, Schopenhauer, Nietzsche, Kant and Wittgenstein to only excerpts, children could explore concepts and theorize about epistemology, metaphysics and metaethics. Eastern thought would perhaps provide a bridge between quantum mechanics and philosophy. (I am fascinated by the fact that Buddha appears to have understood quantum mechanics.)
3. Learn Faster
Einstein, always prescient, taught us never to memorize what we can look up. If McKinsey is correct that information is doubling every eighteen months — or in hours, as IBM predicts — then facts and memorization are no longer a fundamental part of learning. We will need to learn faster to properly produce future scenarios, update mental models and identify bias and faulty assumptions in order to create a more viable future.
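A quick back-of-the-envelope calculation shows why this compounding matters. Assuming, purely for illustration, the eighteen-month doubling figure quoted above, the multiplier after t years is 2^(t / 1.5):

```python
# If information doubles every 18 months (1.5 years), the multiplier after
# t years is 2 ** (t / 1.5). These figures are illustrative arithmetic only.
def growth_multiplier(years, doubling_period_years=1.5):
    return 2 ** (years / doubling_period_years)

for years in (3, 12, 30):
    print(f"after {years} years: {growth_multiplier(years):,.0f}x as much information")
```

Three years gives 4x, twelve years 256x and thirty years over a million times as much information: no curriculum built on memorizing the current stock of facts can keep up with that curve.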
A human alone may no longer be able to keep up with the flow of new information, although more and better search tools are emerging using AI. BCG, the international consulting company, believes that learning faster will be a competitive advantage for corporations and humans. New tools will appear to address the challenge of learning faster. U.S. EdTech investment totaled $2.2 billion in 2020. This record investment, despite the pandemic, supports the idea that the new tools and methods of learning are under development to meet an urgent need. We may need to return to teaching tools in schools the way I was taught to use a slide rule in high school physics.
To complement the advances in AI and the new tools, we need to restore Jean Piaget’s philosophy that students need to be self-directed learners in order to manage the increased flow of information and learn faster. We should no longer focus on facts; instead we should teach new skills. What we need to teach students today is how to pick what to read, how to establish the credibility of the writer and how to build frameworks that make the sorting, organization and internalization of information as energy-efficient as possible for the human brain.
Note: We may also need to teach the computers to read to us faster. For example, the reader could pick a setting to only read topic sentences to start. (Tip: I only read a paragraph if the topic sentence has new information or an insight for me. I read 4–5,000 articles per year that way.)
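A rough prototype of that “topic sentences first” setting is simple to sketch. The function below is a naive illustration with made-up sample text (a real reading tool would need proper sentence segmentation); it returns the first sentence of each paragraph so the reader can decide which paragraphs deserve a full read:

```python
import re

def topic_sentences(text):
    """Return the first sentence of each non-empty paragraph."""
    paragraphs = [p.strip() for p in text.split("\n\n") if p.strip()]
    # Naive split: a sentence ends at ., ! or ? followed by whitespace.
    return [re.split(r"(?<=[.!?])\s+", p)[0] for p in paragraphs]

article = """Entropy measures uncertainty. It grows with the number of states.

Education must change. Memorization no longer serves."""

for sentence in topic_sentences(article):
    print(sentence)
```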
In 2006 Jeannette Wing, then head of the computer science department at Carnegie Mellon University, wrote an article, “Computational Thinking”. In the article she proposed that we should treat computation the same way we treat reading, writing and arithmetic, and that we should add a “C” for computation to STEM. She described computational thinking as automated abstraction that gives us the ability and audacity to scale. (Wing’s focus on the abstract is the same point about theory that I made above.)
Wing’s view shaped Carnegie Mellon and led it to become perhaps the top university for the research and teaching of AI. The rest of the world largely ignored her, and much of the education system at all levels still does. It is very hard for a teacher to teach what they have not studied and do not appreciate.
Today in 2021 the need for people to learn and understand computation — simply defined as AI, modeling and simulation — is even greater. Cloud computing is taking over computing, providing full-service computation with tools, data storage, algorithms, connectivity, cybersecurity and instant scaling for any problem with a data set. (Cloud computing revenue in 2019: $227 billion.) Cloud computing changes the shape of the adoption curve for AI and computation as a whole, as CRISPR and Moderna have shown. If you could not read, you were probably left behind in 1900. If you do not understand computation and cloud computing, you will be left behind in 2030. The good news is the Harvard Business School (HBS) approach: HBS does not train students to be experts; it trains them to pick the expert they need for the problem. We need to take the same approach and teach students the fundamentals of computation (probably starting with statistics in junior high school). Fortunately, most of the computation material is available free online in courses your local school district can use. We just need to incorporate online learning for computation (and quantum mechanics and information theory) into our traditional classroom approach. Many such curriculums are available online; an example from KDnuggets is shown below.
5. Multidisciplinary Learning
In Jeannette Wing’s article on computational thinking she used the diagram above. This diagram makes clear another important point about 21st-century learning and education: computer science, artificial intelligence and computation will be integrated into every field of study, not just science, engineering and medical science. The NSF provides a good example with its Convergence Accelerator program, as described below.
“Using a convergence approach and innovation processes like human-centered design, user discovery, and team science and integration of multidisciplinary research, the Convergence Accelerator program seeks to transition basic research and discovery into practice — to solve high-impact societal challenges aligned with specific research themes (tracks).”
“Human-centric design” for “high-impact societal challenges” sounds as relevant to anthropology and poverty alleviation as to biology and medicine, and that is the message from the NSF. With the increase in information, value will be created by seeing the relationships between disciplines, and the social sciences and humanities may be the big beneficiaries. Computation has to date tended to focus on the natural sciences, engineering and parts of medical science, but advances in network theory, game theory and behavioral studies aid the application of computation in a much wider range of domains. Complexity theory teaches us that all natural and manmade systems are non-linear, multi-variable systems with the same features of adaptability, self-organization and emergence. Computation may give us the toolset to finally solve some of the most pressing social problems. This trend toward the multidisciplinary suggests that the silos by discipline that have been a hallmark of education, and especially universities, will need to come down. As Wing suggests, computation must become a part of every discipline.
Quantum mechanics shows us the component nature of reality, and artificial intelligence is the tool we use to process the components. The daunting challenge is for the education system to teach students how to add value in this new computation-based system. However, there may be another, equally challenging need. At the most fundamental level, democracy is about access to information. When we look at the range and volume of data in the 21st century, we realize that access to computing power, analytical tools and the knowledge to use those tools in a cloud computing environment is probably as fundamental as reading or access to information. If the educated citizen is necessary to preserve democracy, data access and the skills to analyze the data need to be made available to every student and citizen. In addition, I think each city should make trained staff and access to cloud computing resources available to any citizen or organization with a data set to analyze. Public access to city-funded computation education and resources will reduce the risk of a growing marginalized population and hopefully facilitate the modernization of representative democracy.
“Computational thinking is a fundamental skill for everyone, not just for computer scientists. To reading, writing, and arithmetic, we should add computational thinking to every child’s analytical ability.” — Jeannette M. Wing
The views expressed herein are the author’s personal views and do not represent the views of any organization that employs the author or with whom he is affiliated.
Some scholars, such as the late Nobel Laureate Murray Gell-Mann, substitute history for philosophy.
Arthur, B. (2009). The Nature of Technology. Free Press.
My students instinctively know that they need a new way to learn, but they are not yet successful in developing the methods alone. Historically, public education has not taught students how to do self-directed learning.