2021 — Are Business Firms Obsolete? Will Corporations Be Replaced by Self-organizing Networks?
· The trade-off between administrative and transaction costs shapes an organization and its business model.
· The rebirth of open source software and edge computing strengthens non-hierarchical approaches to problem-solving.
· Partners networked together to solve computing-intensive problems are the new alternative to the traditional “monolithic” corporation.
KEYWORDS: Complexity, network, self-organizing, innovation, wealth creation
Adam Smith legitimately claims the title of “father of modern economics” for his book The Wealth of Nations, published in 1776. This original work led economists to consider three players in an economic system — the individual, the firm and the market. These players created wealth from land and labor, according to Smith. These resources and the newly discovered technologies of mechanization were the basis for the emerging industrialization in much of North America and Europe and the validation of the first wealth creation model in modern times.
For almost one hundred sixty years no one asked or answered the question, “why do firms exist?” In considering the allocation of resources in markets, why does a “firm” offer benefits not found in other means of organization? In 1937 Ronald Coase wrote a paper, “The Nature of the Firm”, for which he was awarded the Nobel Prize some fifty-four years later in 1991. Coase posited that the purpose of a firm was to reduce transaction costs. Transaction costs are the costs a firm incurs to secure resources of all forms in a market. At some point a firm realizes that by bringing an activity in house it can lower costs by converting transaction costs to administrative costs. The natural outgrowth of such a realization is economies of scale and economies of scope. In the first case one amortizes the administrative costs over a larger number of units of the same product. In the latter case, economies of scope, one amortizes the administrative costs through the same production process but across different products. This realization about scale or scope motivates the owner or manager to seek a comparative advantage in lower product cost, which in turn becomes a prime motivator for expanding the scale of the business and realizing opportunity.
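Coase's make-or-buy logic can be sketched numerically. The figures below are purely hypothetical, chosen only to illustrate how a fixed administrative cost, amortized over output, eventually undercuts a per-unit market transaction cost:

```python
# Illustrative sketch with hypothetical numbers: Coase's make-or-buy decision.
# A firm internalizes an activity when its amortized administrative cost per
# unit falls below the per-unit transaction cost of buying in the market.

def in_house_unit_cost(fixed_admin_cost: float, units: int) -> float:
    """Administrative cost amortized over output (economies of scale)."""
    return fixed_admin_cost / units

MARKET_TRANSACTION_COST = 12.0   # hypothetical cost per unit bought in the market
FIXED_ADMIN_COST = 100_000.0     # hypothetical fixed cost of doing it in-house

for units in (5_000, 10_000, 20_000):
    internal = in_house_unit_cost(FIXED_ADMIN_COST, units)
    decision = "make in-house" if internal < MARKET_TRANSACTION_COST else "buy in market"
    print(f"{units:>6} units: internal {internal:6.2f} vs market {MARKET_TRANSACTION_COST} -> {decision}")
```

At 5,000 units the in-house cost (20.00 per unit) exceeds the market's 12.00, so the firm buys; at 10,000 and 20,000 units the amortized cost drops below the transaction cost and internalizing wins, which is exactly the scale motivation described above.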
If we look back at the last 250 years of business history, we realize that changing views on what should be administrative costs and what remains transaction costs explain the business models of a given period. These changes in business models were generally made possible by the emergence of new technologies. In fact, every change in popular business models is brought about by technologies that change the way we look at administrative and transaction costs in order to achieve lower operating costs and greater efficiency. Nowhere is this shown more dramatically than with the introduction of computers in the 1950s. Not only did computing generate new business models, it changed the way we thought about wealth creation and it shifted our view of industrial process from mechanical to digital. As HBS Professor Clayton Christensen described it, large corporations easily justify adopting new technologies when cost savings are evident. When cost savings are not obvious to large corporations, the technology becomes disruptive and adoption is driven by the transaction cost approach. From Christensen's “disruptive innovation” come the new business models.
The economist Michael Munger writing in “Forever Contemporary: The Economics of Ronald Coase” makes an interesting related point.
“What if an entrepreneur could sell reductions in the transaction costs of renting, using a combination of delivery services and software platform, such as Uber? The third entrepreneurial revolution will be based on innovations that reduce transactions costs, rather than reducing the costs of the products themselves. An unimaginable number and variety of transactions will be made possible by software innovations that solve three problems: (a) information, (b) transaction-clearing, and (c) trust. The result will be that the quality and durability of the items being used (in effect, rented) will increase, but the quantity of items actually in circulation will plummet.”
The large opportunity is to control a cost factor by achieving near zero marginal costs and control the product/service as a transactional cost. This in part explains the popularity of computing, which is the basis for the wealth creation model that succeeded “land, labor and mechanization”.
The emergence of widespread computerization also provided what the scholar Carlota Perez calls “New or Redefined Infrastructures”. For every new “industrial revolution” there are new infrastructures. In the case of computing the new infrastructure was the “network”, beginning with ARPANET, then intranets, then the Internet and eventually perhaps blockchain. The popularity of networked computers is explained by the Nobel economist F.A. Hayek.
“The knowledge of the circumstances of which we must make use never exists in concentrated or integrated form but solely as the dispersed bits of incomplete and frequently contradictory knowledge which all the separate individuals possess. The economic problem of society is thus not merely a problem of how to allocate “given” resources…it is a problem of the utilization of knowledge which is not given to anyone in its totality.”
The popularity of the Internet proves Hayek's point that the problem is the “utilization of knowledge”, and computer-based networks are the best alternative we have to solve that problem. What one realizes about manmade networks is the efficiency a network provides. Networks provide connectivity, communication, operations and management, all in a self-organizing mechanism for [pursuing] information.
Herbert Simon, another Nobel Laureate economist, writing on hierarchies [networks], cites three reasons why networks are so common:
1. Networks facilitate the formation of complex systems, as explained by Metcalfe's Law
2. Networks have direct channels of communication (connectivity)
3. Networks are naturally redundant (lower transaction cost)
[Note: Metcalfe’s Law states that the value of a network grows with the square of its number of nodes: every additional node increases the value of the network. Large real-world networks are also scale-free, following a power-law degree distribution. As the noted network scientist Albert Barabási explains it, “This feature [power law] was found to be a consequence of two generic mechanisms: (i) networks expand continuously by the addition of new vertices [nodes], and (ii) new vertices attach preferentially to sites that are already well connected…large networks [are] governed by robust self-organizing phenomena that go beyond the particulars of the individual systems.”]
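Barabási's two mechanisms, growth and preferential attachment, are simple enough to sketch directly. The following minimal simulation (parameters are illustrative assumptions, not drawn from Barabási's papers) grows a network one node at a time, with each newcomer linking to an existing node with probability proportional to that node's degree; a few heavily connected hubs emerge without any central coordination:

```python
# A minimal sketch of growth plus preferential attachment: new nodes link to
# existing nodes with probability proportional to current degree, so
# well-connected hubs emerge naturally (the self-organizing phenomenon).
import random

def grow_network(n_nodes: int, seed: int = 42) -> dict[int, int]:
    """Return a node -> degree map after growing a network one node at a time."""
    random.seed(seed)
    degree = {0: 1, 1: 1}   # start from a single linked pair of nodes
    targets = [0, 1]        # each entry is one "end" of an edge, so sampling
                            # from this list is sampling proportional to degree
    for new_node in range(2, n_nodes):
        partner = random.choice(targets)   # preferential attachment step
        degree[new_node] = 1
        degree[partner] += 1
        targets.extend([new_node, partner])
    return degree

degrees = grow_network(1000)
hubs = sorted(degrees.values(), reverse=True)[:5]
print("top-5 degrees:", hubs)   # a few large hubs; most nodes stay small
```

The average degree here is just under 2 (each new node adds one edge), yet the largest hubs reach degrees far above that: the heavy-tailed, power-law-like distribution the note describes.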
In hindsight, one realizes that the computer made possible low-cost digital networks that enabled humans to self-organize to find and create information at a low marginal [transaction] cost. These low transaction costs spawned an enormous proliferation of information, which is now doubling every eighteen months according to McKinsey and many other analysts.
A new digital model of wealth creation emerged based on networked information and capital rather than the traditional inputs of land and labor. A study by venture capital firm NfX makes this point about wealth creation abundantly clear.
“Our three-year study … shows that nfx [networks] are responsible for 70% of the value [stock market capitalization] created by tech companies since the Internet became a thing in 1994.”
The Bank for International Settlements (BIS) makes even clearer the point about the new wealth creation model being principally from networked information and capital.
“The gross market value of all [derivatives] contracts [in 2017]…approximately $12.7 trillion…The total notional amounts outstanding for contracts in the derivatives market is an estimated $542.4 trillion.”
To put this number in perspective, currency and interest rate swaps (derivatives) totaled less than $1 trillion in 1980 according to ISDA. This astronomical growth in derivatives is attributable to:
· The value creation from networks (NfX) that created the capital
· The improved networks and the widespread availability of telecommunications that made it easy to find trading partners anywhere in the world
· The deregulation of U.S. financial markets in the 1980s, that permitted the self-organizing feature of networks to flourish and expand the number of nodes (financial intermediaries)
· The fact that financial markets have always been based on forward-looking information
[Note: Derivatives, such as options, are financial instruments that allow for the hedging of risk (e.g. price changes) in physical assets such as commodities and currencies. Effectively, the derivative allows information (a price outlook) to be traded rather than the physical asset. Gross market value is the price to buy the derivative contracts. Total notional amount is the value of the total amount of physical assets controlled by the contracts.]
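The relationship between the two BIS figures quoted above can be made concrete with simple arithmetic (using $1 trillion as the ISDA upper bound for 1980):

```python
# A worked illustration of the BIS and ISDA figures quoted above: gross market
# value (the price to buy the contracts) is a small fraction of the notional
# amount (the value of the underlying assets the contracts control).

GROSS_MARKET_VALUE = 12.7e12      # USD, 2017 (BIS)
NOTIONAL_OUTSTANDING = 542.4e12   # USD, 2017 (BIS)
NOTIONAL_1980 = 1.0e12            # USD, upper bound ("less than $1 trillion", ISDA)

leverage = NOTIONAL_OUTSTANDING / GROSS_MARKET_VALUE   # ~43x
growth = NOTIONAL_OUTSTANDING / NOTIONAL_1980          # ~542x over 37 years

print(f"each dollar of contract value controls about ${leverage:.0f} of assets")
print(f"notional outstanding grew at least {growth:.0f}x between 1980 and 2017")
```

In other words, roughly $43 of physical assets is controlled per dollar of contract value, which is why trading the information (the price outlook) rather than the asset itself scales so dramatically.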
However, this wealth creation on the back of networks came at a price. The volume of information created was unparalleled in human history and not all of the information was useful. Nevertheless, it had to be stored (temporarily), organized and processed ideally for the purposes of analysis and prediction. This unmet need spawned cloud computing, with AWS emerging in 2006. By 2017 market demand supported 90 AWS services, including computing, storage, networking, database, analytics applications and developer tools. The popularity of AWS and the other major cloud service providers, such as GCP and Azure, was built on the simple concept of Coase’s transaction cost. AWS offered servers and server farms that they operated and maintained, scalable data storage and application software. They priced the service based on transaction cost and usage. Computing became a marginal cost where scale, operational integrity and security were in the hands of unquestionably qualified professionals (AWS, GCP, Microsoft). Effectively, cloud computing and artificial intelligence (AI) combined to create the new information infrastructure business models.
The provision of cloud computing was not only a natural outgrowth of the proliferating Internet, but it also supported the reemergence of AI as a viable tool for data analysis. AI had floundered almost from its inception in the 1960s in the mistaken belief that AI needed better computers for processing. By the beginning of the 21st century many academic and corporate scientists and engineers had realized that what AI really needed was more data to train the algorithms. Cloud computing (and its data storage) was one of the technologies that made AI affordable and widely available at only marginal cost. Proof of the popularity of the cloud-AI business model is the growth of AWS. By 2019 AWS offered 165 services, 75 more than in 2017, many of which simplify the use of AI in data analysis and prediction. AWS had annual revenue of $7.7 billion in 2018 and operating income of $2.2 billion. In 2019 AWS revenue reached $35.0 billion with operating income of $3.54 billion, further evidence of the new wealth creation model of information and capital. GCP and Azure show similar statistics, as they too simplify the use of AI in order to grow their cloud businesses.
Business models built on cloud computing and AI may still be in their infancy, although Google (1996) may be the poster child for this type of business model. Companies like Dropbox were early adopters. Some may argue that cloud computing-AI may be the new redefined infrastructure Carlota Perez talks about in each industrial revolution. However, another candidate infrastructure is already starting to emerge built using the competing concepts of open source software and edge computing.
If we return to Metcalfe’s Law, we should note Barabási’s comment.
“…large networks [are] governed by robust self-organizing phenomena that go beyond the particulars of the individual systems”
All natural and manmade systems are self-organizing. Such systems are also non-hierarchical, non-linear, adaptive and emergent. We call such systems “complex adaptive systems” (CAS).
With the advent of cloud computing, we eliminated the need for large nodes with expensive information processing power and eliminated the investment barrier to achieve a desired user level of computing. According to network science, reducing the influence and control of large nodes happens naturally in large well-organized networks. Cloud computing accelerated the trend toward self-organizing networks.
One of the groups that has embraced the benefits of self-organizing networks is the open source developer community. Almost from the advent of computing in the 1960s, academics banded together to write and share code. Well-known examples of open source software are the Netscape browser, the Linux operating system and GitHub. This philosophy even extended to physical products such as the One Laptop per Child computer developed at the MIT Media Lab. Open source software eventually became the code base for commercial products. Red Hat Linux may be the best-known example, and its $34 billion acquisition by IBM highlights the current trend toward commercial products built on the back of open source code. The acquisition or IPO of Cloudera, MongoDB and GitHub, amongst many “open source” companies, further documents the current popularity of “open source” companies. Another supporting data point is Microsoft, the patron saint of proprietary software: in 2018 it made 60,000 of its software patents freely available to the open source community.
The current popularity of open source software is explained by the prominent venture capital firm Andreessen Horowitz.
“The history of open source highlights that its rise is due to a virtuous cycle of technology and business innovation. On the technical side, open source is the best way to create software because it speeds product feedback and innovation, improves software reliability, scales support, drives adoption, and pools technical talent.”
Large, well-organized networks support “product feedback”, “scales support”, “drives adoption” and “pools technical talent”. All of these characteristics document the technical innovation that is a hallmark of open source software, made possible now through well-connected networks. Cloud computing and SaaS further support the open source approach by respectively providing low-cost computing resources (and AI tools) and a successful business model for commercialization.
However, both cloud computing and SaaS run contrary to the spirit of open source. Open source would never want to see centralized economic power, what some call “choke points”, as might be said of Google or Amazon. This point becomes even more concerning when we consider the amounts of personal data captured and monetized by Google or Facebook.
In response to this concern, we now see emerging what ZDNet describes as a move:
“that leverages the portability of micro data centers (µDC) and small form-factor servers to reduce the distances between the processing point and the consumption point of functionality in the network.”
In much of the world, such an approach reduces the reliance on high-bandwidth fiber-optic links and massive investment in real estate and buildings to house server farms. This “reversal” to local computing also better supports the development of local talent by providing a hands-on experience that they will never get at Khan Academy or edX. Perhaps most importantly, such an approach shifts data ownership to the local users, which facilitates the bottom-up, non-hierarchical approaches that CAS theory predicts. Rarely will the remote data owner have the power to dictate data availability, usage or commercialization strategy.
One of the best examples of this “shift to local” is edge computing and its big brother the Internet of Things (IoT). Karim Arabi, in an IEEE DAC 2014 keynote speech, defined edge computing as “all computing outside the cloud happening at the edge of the network, and more specifically in applications where real-time processing of data is required”. The importance of real-time information is made obvious by the evolution of “products” from physical items to “customer experience” and soon “personalized experience” created digitally. The defining focus on “experience” is made possible by real-time data coupled with AI processing. Well-known examples of real-time data captured at the edge include weather data, AUV (autonomous unmanned vehicle) data and facial recognition data. Fast Company describes the ultimate edge computing device well: “…mobile devices have become vending machines for services delivered to your phone via the cloud”. Properly managed, edge computing allows not only control of data but the creation of value locally. If software was “eating the world”, now data is eating the world and the world wants to capture a bigger share of the value through edge computing.
However, the question becomes: how do a research center in remote India, a startup in Des Moines, Iowa and an open source project based at Cambridge University, all working on the same problem, create and capture value? We know that capital and information are now the basis for value creation. NfX told us we should probably use a network business model to create value. We know that networks facilitate creativity, collaboration and the exchange of information, and that networks are best formed as self-organizing, non-hierarchical organizations according to CAS. Ronald Coase would probably say that such a business model could reduce administrative costs. Clayton Christensen would say that corporations adopt products and services that reduce administrative costs.
Maybe we should think of partners networked together to solve computing-intensive problems as the new alternative to the traditional “monolithic” corporation. I think neither Coase nor Christensen would be surprised by this idea. Computing has changed the way we think about value creation and business models. Now may be the time to redefine the corporate “organization” and perfect the self-organizing network alternative.
It may help to show some examples where this self-organizing model has worked. The One Laptop per Child (OLPC) project that I worked on for 3+ years and Vaxine Pty Ltd. might be early models. Beginning in 2005 OLPC cobbled together an open source community of coders and component developers, project managers in sixty countries and the commercial Taiwanese partner Quanta Computer to manufacture the laptops. $35 million in corporate donations provided the initial capital. The result: in eight years three million children received free laptop computers that allowed them to pursue self-directed learning on the Internet.
Vaxine is an Australian developer of vaccines that has put 14 vaccines into clinical trials since 2002 with a staff of only about 150. Vaxine has an international network of partners, in France, India, the U.S. and Australia to name a few locations, that includes research centers, universities, corporates and government agencies (which provide most of the funding). Vaxine exemplifies the creativity, innovation and value creation of collaborative self-organizing, non-hierarchical networks where expertise, deliverables and economic returns are understood and mutually agreed. In the case of both OLPC and Vaxine, the collaboration fostered by the self-organizing network is the source of the innovation needed to tackle complex problems. [Note: the speed with which the COVID-19 vaccine was developed by others is due in part to a self-organizing approach that overcame many of the typical regulatory hurdles in developing new medicines.]
As AI solves more and more simple and complicated problems, humans will be left with only the complex, impossible problems to solve. Nature advocates for a CAS approach to solve complex problems. Perhaps the international networks of organizations such as OLPC and Vaxine, self-organized and non-hierarchical, are evidence of such a “natural” business model for organizations. Open source and edge computing technologies just ease the transition to these new business models.
Simon, H.A., “The Architecture of Complexity”, Proceedings of the American Philosophical Society (1962)