Intelligence as a Complex System: Lessons from Physics
Introduction
In the rapidly evolving field of artificial intelligence, there’s a growing need to understand the nuances and complexities of intelligence itself. By drawing analogies from successful physical models of complex systems, we can gain valuable insights into the nature of intelligence and the challenges we face in replicating it. This essay explores the parallels between intelligence and other complex phenomena in physics, highlighting why simplistic approaches to AI may fall short of true artificial general intelligence (AGI).
The Complexity Spectrum
Just as physical phenomena exhibit varying degrees of complexity, intelligence exists on a spectrum. Consider the following examples:
- Turbulence in fluid dynamics:
- Simple: Rayleigh–Bénard convection near the transition point.
- Complex: Plasma behavior around supermassive black holes (Reynolds number \(\sim 10^{20}\)).
- Intelligence tasks:
- Simple: Grade school math problems.
- Complex: Developing groundbreaking scientific theories (e.g., the Ising model in ferromagnetism).
This spectrum illustrates that, like turbulence, we may not yet know if there’s an upper limit to intelligence. It also suggests that for many tasks a simplified or “compressed” representation might suffice, which explains why some believe AGI has been achieved based on performance in limited domains. The complexity of turbulence is well characterized by the Reynolds number, but studies of scaling laws in LLMs leave a lot to be desired: claims of emergence are often made on simple datasets with fixed task complexity and fixed input and output lengths. See “Are Emergent Abilities of Large Language Models a Mirage?” for a refreshing take on this.
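For readers less familiar with fluid dynamics, the Reynolds number quoted above is simply the ratio of inertial to viscous forces,

\[
\mathrm{Re} = \frac{U L}{\nu},
\]

where \(U\) is a characteristic velocity, \(L\) a characteristic length scale, and \(\nu\) the kinematic viscosity. Flows typically become turbulent once \(\mathrm{Re}\) reaches a few thousand (around \(2 \times 10^{3}\) for pipe flow), so a value of \(10^{20}\) is extraordinarily deep into the turbulent regime. No comparably compact number yet exists for quantifying the complexity of a reasoning task.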
The Danger of Hype: Reductionist Thinking
Historically, scientists have fallen into the trap of reductionism – the belief that complex phenomena can be fully explained by understanding their fundamental components. Paul Dirac’s 1929 statement exemplifies this:
“The underlying physical laws necessary for the mathematical theory of a large part of physics and the whole of chemistry are thus completely known.”
However, just as a unified theory of fundamental forces wouldn’t explain emergent phenomena in physics, a single AI model is unlikely to capture the full spectrum of intelligence. This reductionist thinking can lead to overestimating the capabilities of current AI models, which excel in specific tasks but struggle with generalization and complex reasoning.
Key Aspects of Intelligence as a Complex System
Dynamic nature: Intelligence is inherently non-equilibrium, dynamic, nonlinear, and high-dimensional. A static language model, no matter how advanced, cannot fully capture these aspects.
Hierarchical complexity: Like physical systems, intelligence requires flexible frameworks that allow for hierarchical representations. In physics, we have:
- N-particle descriptions (e.g., molecular dynamics)
- Kinetic descriptions (e.g., Boltzmann, Vlasov equations)
- Fluid descriptions (e.g., Navier-Stokes equations)
- Mean field descriptions (e.g., filtered turbulent fields)
AI research needs analogous frameworks to capture different levels of cognitive processes. However, one might argue that, just as synthetic turbulence models use stochastic forcing to good effect for modeling unobserved physics across scales, GPT-like models with an analogous stochastic training process might be able to capture higher-level cognitive processes.
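To make the analogy a little more concrete, here is a minimal, self-contained sketch of the synthetic-turbulence idea in Python: random Fourier modes with a prescribed Kolmogorov-like \(k^{-5/3}\) spectrum stand in for physics that is never simulated from first principles. The grid size, wavenumber range, and amplitudes are illustrative choices, not a reference implementation of any particular model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "synthetic turbulence": superpose random Fourier modes whose
# amplitudes follow a Kolmogorov-like k^(-5/3) energy spectrum.
N = 1024                                   # grid points (illustrative)
x = np.linspace(0, 2 * np.pi, N, endpoint=False)
k = np.arange(1, 65)                       # resolved wavenumbers (illustrative)
E = k ** (-5.0 / 3.0)                      # model energy spectrum E(k)

amp = np.sqrt(2.0 * E)                     # mode amplitudes consistent with E(k)
phase = rng.uniform(0, 2 * np.pi, k.size)  # random phases: the stochastic ingredient

# The "velocity" field is modeled statistically rather than computed
# from the governing equations.
u = (amp[:, None] * np.cos(k[:, None] * x[None, :] + phase[:, None])).sum(axis=0)

print(f"mean = {u.mean():.3e}, rms = {u.std():.3e}")
```

The analogy, then, is that a GPT-like model trained and sampled stochastically reproduces the statistics of its training distribution rather than any particular trajectory of thought; whether that suffices for higher-level cognition is exactly the open question.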
Meta-frameworks for problem-solving: Current AI models lack robust strategies for approaching intractable problems. In physics, when experimental data is scarce or the system under study is extremely complex, breakthroughs have often come from constructing simplified models that capture the essence of the phenomenon (e.g., the Ising model for ferromagnetism).
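For concreteness, the Ising model mentioned above reduces ferromagnetism to binary spins \(s_i \in \{-1, +1\}\) on a lattice with nearest-neighbor coupling \(J\) and external field \(h\):

\[
H = -J \sum_{\langle i, j \rangle} s_i s_j - h \sum_i s_i .
\]

This drastically simplified Hamiltonian is nonetheless enough to capture the ferromagnetic phase transition in two or more dimensions; the meta-strategy is to deliberately discard detail until only the essential mechanism remains.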
Data limitations: Current AI models encode almost no inductive biases and learn everything from data, an approach with several limitations:
- Complex reasoning tasks appear only in the long tail of web-scale data, which makes complex reasoning difficult to learn.
- Detailed reasoning steps are rarely available, since humans typically write down only the final answer.
- Equal weighting of data samples is sub-optimal since only a fraction represent high-quality research and content.
- Examples from long-term scientific development are randomly distributed in the data and not systematically organized (e.g., the evolution of theories over decades).
The case of equal weighting is particularly problematic since many articles and books are written by people without adequate expertise and make arguments unsupported by rigorous theoretical calculations or experimental results.
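As a toy illustration of the last two points, and emphatically not a description of how any production model is trained, one could imagine sampling training documents in proportion to a quality score instead of uniformly. The documents, scores, and scoring mechanism below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical corpus with made-up quality scores (e.g., from a heuristic
# or a classifier); none of this reflects a real training pipeline.
docs = [
    "peer-reviewed derivation",
    "forum speculation",
    "textbook chapter",
    "low-effort listicle",
]
quality = np.array([0.9, 0.2, 0.8, 0.1])

# Uniform sampling: every document is equally likely.
uniform_p = np.full(len(docs), 1.0 / len(docs))

# Quality-weighted sampling: rigorous material is up-weighted.
weighted_p = quality / quality.sum()

batch = rng.choice(len(docs), size=8, p=weighted_p)
print("uniform probabilities:", uniform_p)
print("weighted batch:", [docs[i] for i in batch])
```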
The Unique Pace of AI Development
Despite these challenges, AI research progresses at an unprecedented rate compared to other complex fields thanks to:
- Immediate access to state-of-the-art models through APIs and open-source libraries.
- Enhanced tooling for data ingestion and generation.
- The ability to build upon existing work rapidly.
Contrast this with research in turbulence or computational fluid dynamics, where reproducing results can take years because of:
- Complex, million-line codebases in C, C++, or Fortran.
- Limited access to high-performance computing.
- Lack of open data-sharing practices and of the detailed model descriptions needed for reproducibility.
As an example, consider the difficulty of generating \(1000\) time steps of a turbulent flow simulation, and contrast it with generating \(1000\) tokens from a language model through a chat interface, an API, or a locally hosted model.
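To make the contrast tangible: the few lines below, using the Hugging Face transformers library with an arbitrarily chosen small model, sample roughly a thousand tokens on a laptop in minutes, whereas a thousand time steps of a resolved turbulence simulation typically means a large parallel solver, mesh generation, and queueing for HPC time.

```python
# Sampling ~1000 tokens from a small local language model.
# "distilgpt2" and the prompt are arbitrary illustrative choices.
from transformers import pipeline, set_seed

set_seed(0)
generator = pipeline("text-generation", model="distilgpt2")
out = generator("Turbulence is", max_new_tokens=1000, do_sample=True)
print(out[0]["generated_text"][:300])
```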
Conclusion
The very idea of an artificial “intelligence singularity” (or its opposite) is problematic from a scientific perspective: in physics, singularities are an indication that our theory is invalid at those scales. The fact that we so readily discuss “singularities” in the context of intelligence might indicate fundamental limitations in our current models of cognition and AI.
While the rapid progress in AI is exciting, we must approach claims of AGI with caution. By viewing intelligence through the lens of complex systems, we gain a more nuanced understanding of the challenges ahead. Just as physicists continue to grapple with phenomena like turbulence, AI researchers must embrace the multifaceted nature of intelligence, developing new frameworks and approaches to capture its full complexity.