A team of theoretical physicists working with Microsoft today published an intriguing preprint research paper describing the universe as a self-learning system of evolving laws.
In other words, we live in a computer that learns.
The big idea: Bostrom’s simulation argument has been a hot topic in scientific circles lately. We recently published “What if you live in a simulation, but there is no computer” to propose a different theory, but Microsoft pulled a cosmic “hold my beer” with this article.
The paper, titled “The Autodidactic Universe” and published today on arXiv, runs 80 pages and makes a pretty good surface-level argument for a novel, nuanced theory of everything.
Here’s my take: based on my interpretation of this paper, the universe either would exist or it wouldn’t. The fact that it does exist tells us how that worked out. Whatever invented it (the laws) set the stage for everything that would happen next.
The paper argues that the laws governing the universe form an evolutionary learning system. In other words: the universe is a computer, perpetuated not in a fixed state but by a series of laws that change over time.
How does it work? That’s the hard part. The researchers explain the universe as a learning system by invoking machine learning. Just as we can teach machines to perform functions that unfold over time, that is, to learn, the laws of the universe are essentially algorithms that operate in the form of learning operations.
According to the researchers:
For example, when we see structures that resemble deep learning architectures emerge in simple autodidactic systems, could we imagine that the operative matrix architecture in which our universe evolves its laws itself evolved from an autodidactic system that arose from the most minimal possible starting conditions?
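As a loose illustration of that analogy (my own sketch, not anything from the paper), one can imagine a “law” as a single tunable parameter that a system adjusts from experience, much the way gradient descent tunes a model’s weights:

```python
# Purely illustrative analogy (mine, not the paper's): a "law" modeled
# as one tunable parameter that the system adjusts from experience,
# the way gradient descent tunes a model's weights.

def run_universe(g, steps=10):
    """Toy dynamics: a single 'law' g governs how a state decays."""
    state = 1.0
    for _ in range(steps):
        state *= (1.0 - g)  # evolution governed by the law g
    return state

def learn_law(target, g=0.2, lr=0.02, epochs=200):
    """Nudge the law g so the outcome approaches a target,
    estimating the gradient by finite differences."""
    eps = 1e-5
    for _ in range(epochs):
        loss = (run_universe(g) - target) ** 2
        loss_eps = (run_universe(g + eps) - target) ** 2
        g -= lr * (loss_eps - loss) / eps  # the law itself changes over time
    return g

g_final = learn_law(target=0.5)  # the learned law settles near 0.067
```

Here the “physics” (the decay rate `g`) is not fixed: it drifts toward whatever value makes the system’s outcome meet a constraint, which is the flavor of the paper’s claim, stripped of all its actual rigor.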
It’s poetic when you think about it. We intuit the laws of physics as we observe them, so it makes sense that the original law of physics would be incredibly simple, self-perpetuating, and capable of learning and evolving.
Perhaps the universe didn’t start with a big bang, but with a simple interaction between particles. The researchers allude to this humble origin by noting that information architectures typically amplify the causal powers of rather small collections of particles.
What does that mean? If you ask me, the game is rigged. Scientists describe the constantly evolving laws of the universe as irreversible:
One implication of evolving laws being real is that the evolution is likely to be unidirectional; otherwise it would be common for laws to revert to previous states, perhaps even more likely than finding a new state. This is because a new state is not random but must meet certain constraints, while the immediate past state has already met constraints.
A reversible but evolving system would frequently explore its immediate past at random. When we see an evolving system that exhibits periods of stability, it probably evolves unidirectionally.
To illustrate these points, the researchers invoke the image of a forensics expert trying to reconstruct how a given program arrived at its results. In one example, the expert can simply check the magnetic markings left on the hard drive. In this sense, the program’s results are reversible: there’s a recorded history of their execution.
However, if the same expert tried to determine a program’s results by examining the CPU, arguably the component most responsible for running it, the task would be far more difficult. There’s no intentional internal record of what a CPU does.
You’d have to study how every particle that interacted with the CPU’s logic gates changed during operation in order to paint a historical picture of a computer program by internally observing the CPU at work.
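The hard-drive-versus-CPU contrast can be made concrete with a toy sketch (my own illustration, not the paper’s): the same computation run with and without a persistent record of its steps.

```python
# Toy illustration (mine, not the paper's): the same computation run
# with and without a persistent record of its intermediate steps.

def compute_with_log(values):
    """'Hard drive' case: every intermediate result is recorded."""
    log, total = [], 0
    for v in values:
        total += v
        log.append(total)  # the magnetic markings, so to speak
    return total, log

def compute_without_log(values):
    """'CPU' case: only the final state survives; the history is gone."""
    total = 0
    for v in values:
        total += v
    return total

result, history = compute_with_log([3, 1, 4])   # -> 8, [3, 4, 8]
bare_result = compute_without_log([3, 1, 4])    # -> 8, nothing else
```

From `history` you can reconstruct every step of the run; from `bare_result` alone you cannot, which is the position the paper says we’re in with respect to the universe.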
The consequences: If the universe operates on a set of laws that started out simple but are self-taught (self-learning), and can therefore evolve over time, it might be impossible for humans to ever unify physics.
Accordingly, the rules governing concepts such as relativity may have had different operational consequences 13.8 billion years ago than they will 100 trillion years from now. And that would make “physics” a moving target.
These are, of course, all speculations based on theoretical physics. Surely the researchers don’t literally mean that the universe is a computer, do they?
Part of the theory seems to suggest that the universe is a learning computer, as the laws currently constraining it weren’t set in stone to begin with.
We can’t reverse-engineer the universe’s processes because there’s no internally verifiable record of them – unless there’s a cosmic hard drive floating around somewhere in space.
Finally: Our scientists are stuck chasing last year’s physics models as the self-taught universe, having pulled itself up by its own bootstraps, continues evolving its laws through eternity.
This is a preprint paper, so don’t consider it canon just yet, but it passes muster on initial inspection. It all comes off a little “I just got back from the dispensary and had a few thoughts,” but the researchers do a lot of legwork describing the kinds of algorithms and neural network systems that such a universe would create, and itself consist of.
Ultimately, the team describes this work as “small steps” towards a broader theory.
Published on April 9, 2021 – 20:20 UTC