
The people behind the greatest leaps in physics - Einstein, Newton, Heisenberg - all had the uncanny ability to see the fundamentals: the deepest, underlying facts about the world. From simple statements about reality they built up their incredible theories. What if we all had a recipe book for doing exactly this? Well, one might be just around the corner.

------Intro Sequence-----

Pretty much all of physics can be boiled down to the following:

Step 1. Describe some aspect of the universe with numbers - like the temperature, pressure, etc. of a gas, or the position (r), velocity (v), etc. of a particle

Step 2. Come up with a set of equations that predict how those numbers change over time - e.g. the laws of thermodynamics or Newton’s laws of motion

Step 3. Based on some initial state - a starting set of these numbers - predict how the system will evolve at all future times

Step 4. Profit, by building heat engines, airplanes, skyscrapers, and our entire modern world
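To make the recipe concrete, here's a minimal sketch of steps 1 through 3 in Python for about the simplest system there is: a particle falling under Newton's laws. The numbers (height, timestep) are purely illustrative.

```python
# A minimal sketch of the mechanistic recipe: numbers in, dynamical law,
# numbers out. The "system" is a falling apple under Newton's second law.
# All values (height, timestep) are illustrative, not from the text.

G_ACCEL = 9.81  # m/s^2, gravitational acceleration near Earth's surface

def step(state, dt):
    """Advance the state (position r, velocity v) by one timestep,
    using a = F/m = -g (Euler integration)."""
    r, v = state
    return (r + v * dt, v - G_ACCEL * dt)

def evolve(initial_state, t_final, dt=0.001):
    """Step 3: from an initial state, predict the state at a later time."""
    state, t = initial_state, 0.0
    while t < t_final and state[0] > 0:  # stop when the apple hits the ground
        state = step(state, dt)
        t += dt
    return state

# Input state: apple at rest, 3 m up in the tree.
print(evolve((3.0, 0.0), t_final=1.0))  # output state ~ (0 m, -7.7 m/s)
```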

In short, physics works by applying dynamical laws to some input state in order to predict an output state. We consider those laws to be “true” when these predictions match reality. This mechanistic approach to describing the world has been wildly successful. In fact it’s hard to imagine another way to do physics. Except that, in many ways, the mechanistic approach may have reached its limit.

It’s nice to have a set of equations for each separate aspect of the world - but if these theories are really true then they should ultimately come together into some master theory - and they do, to an extent. You can derive all of thermodynamics and Newtonian mechanics and electromagnetism and so on from two master theories: quantum mechanics and Einstein’s general relativity. But the master theory that unifies those last two has so far eluded us.

For nearly a century, our greatest minds have been trying to come up with dynamical laws that describe all of nature - a mechanistic theory of everything. And they’ve failed. Some are starting to wonder if we need to rethink how we do physics at the fundamental level. One such approach is constructor theory, developed by David Deutsch and Chiara Marletto at Oxford University since 2012.

In the mechanistic approach the fundamentals are the mathematical descriptions of how a process occurs. In constructor theory, the fundamentals are simpler - they are binary facts about whether or not a particular process is possible. It uses theories like general relativity and quantum mechanics, along with more fundamental conservation laws, or principles, to rule out impossible output states given an input state of a system. The transformations between the input and output states are dubbed tasks, and if a task is possible, then there exists a system known as a constructor which can perform the task.

Constructor theory is inspired by information theory and the theory of quantum computation. As David Deutsch says, if a quantum computer can, in principle, simulate any process in physics, then all of physics can be expressed in terms of the theory of quantum computation. This harks back to the work of John von Neumann, who came up with the concept of the universal constructor - basically a generalization of a universal computer - a system that can perform any computation OR physical task, including creating a copy of itself. Deutsch realized that this notion could be used as a way to describe how nature works: we can break up reality in terms of constructors that cause changes in the world via “tasks” - transformations that, for a given constructor, are either possible or impossible. Ultimately, constructor theory is about something called a counterfactual, which here is a sort of meta-fact: the fact of whether a given task is possible or impossible for a given constructor. And, in fact, the counterfactuals are more important than the constructors. As Marletto puts it, it’s the science of can and can’t, which is also the title of her pop-sci book on the subject.
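To pin down the vocabulary, here's one toy way to render tasks, constructors, and counterfactuals as data structures. This is my own illustration of the concepts, not Deutsch and Marletto's actual formalism, and the kettle example is an invented stand-in.

```python
# Toy constructor-theory vocabulary: a Task is an input->output
# transformation; the counterfactual is whether some constructor can
# perform it. Illustrative only, not the published formalism.

from dataclasses import dataclass

@dataclass(frozen=True)
class Task:
    """A transformation between an input state and an output state."""
    input_state: str
    output_state: str

@dataclass
class Constructor:
    """A system that can perform certain tasks and, crucially, retains
    the ability to perform them again afterwards."""
    name: str
    performable: frozenset  # the set of Tasks this constructor can cause

def is_possible(task, constructors):
    """The counterfactual: a task is possible if and only if some
    constructor exists that can perform it."""
    return any(task in c.performable for c in constructors)

boil = Task("cold water", "hot water")
kettle = Constructor("kettle", frozenset({boil}))
print(is_possible(boil, [kettle]))                  # True
print(is_possible(Task("lead", "gold"), [kettle]))  # False
```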

The power of constructor theory is that it allows us to explore physics without having to solve the detailed equations of motion. On the one hand that allows us to make far more general statements about how a system can behave. I’ll come back to an example of that. But it also potentially allows us to understand aspects of nature where we don’t even know the dynamical laws. For example, understanding the union of quantum mechanics and general relativity.

Before we get to how constructor theory is going to explain everything, let’s look at a simple example to define some of these concepts. The mechanistic philosophy that dominates physics really started with Isaac Newton, so it’s appropriate to start with the falling apple that apocryphally inspired a lot of this. In Newton’s picture, the apple has a current state: stationary and up in the tree. We use that state as the input, apply the laws of gravity and motion, and we find the output state: the apple falls to the ground.

In the constructor theory view, we ask what tasks are possible and impossible for the apple given its input state. Is it possible for the apple to stay where it is, hovering in the air? It’s not. The laws of general relativity forbid this task: as a free-falling body the apple must follow a geodesic through spacetime, which results in it falling towards the Earth. What about the task of the apple transmuting into gold? Again, this task is impossible, since it’s ruled out by the laws of quantum mechanics and the principle of conservation of energy. In fact, the only possible task available to our apple is the one which results in the output state that we calculated earlier.
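If you wanted to mimic that reasoning in code, it might look something like the sketch below: enumerate candidate tasks for the apple and filter them through stand-in "principles". The predicates here are crude placeholders for the real laws, just to show the shape of the argument.

```python
# The apple example recast constructor-theoretically: rather than
# integrating equations of motion, rule candidate tasks in or out
# using general principles. The predicates are crude stand-ins.

def conserves_energy(task):
    # Transmuting an apple into gold would violate conservation of
    # energy (among much else), so flag that task as impossible.
    return task != "apple -> gold"

def allowed_by_free_fall(task):
    # A free-falling body follows a geodesic; hovering in place isn't one.
    return task != "apple -> hovering apple"

PRINCIPLES = [conserves_energy, allowed_by_free_fall]

candidate_tasks = [
    "apple -> hovering apple",
    "apple -> gold",
    "apple -> apple on the ground",
]

possible = [t for t in candidate_tasks
            if all(principle(t) for principle in PRINCIPLES)]
print(possible)  # ['apple -> apple on the ground']
```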

This just serves to give an idea of the perspective shift in constructor theory. Let’s look at a case where this is actually useful. David Deutsch gives the example of the perpetual motion machine of the first kind - a device from which infinite energy can be extracted. For example, a wheel powered by falling water that also pumps that same water back up to the top while at the same time driving an electric generator. We can rule out the possibility of this with a mechanistic explanation, talking about the torque induced by the generator to slow the wheel, which stops it pumping water fast enough to maintain the same power.

But that’s not very general. We’d need a separate mechanistic approach to rule out every different type of perpetual motion machine. Or we could take a shortcut. The law of conservation of energy says that it’s impossible to create energy from nothing, and the second law of thermodynamics tells us that it’s impossible for a non-isolated system to keep running forever. By applying general rules about what is possible or impossible - by applying counterfactual statements - we can rule out a much larger space of impossible processes than in the mechanistic approach.
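The counterfactual version is easy to mechanize: instead of modeling the torques and water flows of each design separately, audit the energy ledger of a full cycle. A toy sketch, with made-up numbers:

```python
# One counterfactual statement rules out a whole family of designs.
# A cycle that returns to its initial state while outputting more
# energy than it took in is forbidden by conservation of energy,
# whatever its internal mechanism. Numbers below are invented.

def violates_energy_conservation(energy_in, energy_out):
    """Impossible task: net energy from nothing over a closed cycle."""
    return energy_out > energy_in

designs = {
    "water wheel + pump + generator": (100.0, 120.0),  # claims 20 J free
    "overbalanced wheel":             (0.0,   5.0),
    "honest water wheel":             (100.0, 80.0),   # loses 20 J to friction
}

for name, (e_in, e_out) in designs.items():
    verdict = "impossible" if violates_energy_conservation(e_in, e_out) \
              else "not ruled out"
    print(f"{name}: {verdict}")
```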

You’d be justified in thinking that there’s nothing new here. We already use counterfactuals regularly in physics. There’s a whole family of rules and laws and theorems that say what can and can’t be. We cobble them together into chains of deductive reasoning to carve out our physical laws - they are the sculpture that’s left after we carve off all the “can’ts” from the marble block of all possible mathematics. But there’s something a little ad-hoc about the way these rules are applied. An aim of constructor theory is to formalize our statements about what is possible and impossible into a sort of “algebra of possibility”.
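As a hint of what such an algebra might look like, here's a toy composition rule - if task A: x→y and task B: y→z are both possible, infer that x→z is possible - applied until no new statements follow. The rule and the examples are my own illustration, not the theory's actual axioms.

```python
# Toy "algebra of possibility": statements of possibility compose, so
# new counterfactuals follow from old ones without solving any
# equations of motion. Illustrative only.

possible_tasks = {("ice", "water"), ("water", "steam")}

def close_under_serial_composition(tasks):
    """If A: x->y and B: y->z are both possible, infer that the serial
    composition x->z is possible too. Iterate to a fixed point."""
    tasks = set(tasks)
    changed = True
    while changed:
        changed = False
        for (a_in, a_out) in list(tasks):
            for (b_in, b_out) in list(tasks):
                if a_out == b_in and (a_in, b_out) not in tasks:
                    tasks.add((a_in, b_out))
                    changed = True
    return tasks

closure = close_under_serial_composition(possible_tasks)
print(("ice", "steam") in closure)  # True: derived, never simulated
```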

At its heart, constructor theory is based on information theory and uses related tools like set theory in its formalism. It focuses on what can and can’t be done with information. This means that constructor theory can be applied even when we don’t know the dynamical laws of a system - when we don’t have a full mechanistic theory.

For example, Chiara Marletto has used constructor theory to describe a scenario for testing whether gravity is quantum in nature - and we certainly don’t have a mechanistic theory of quantum gravity. This approach relies on defining systems of information - called information media - and how systems of quantum information - superinformation media - must differ from regular information media, which would be purely classical. In particular, for systems of quantum information there’s a task that is possible which is impossible for classical systems: entanglement between the information elements - the qubits, in the quantum case.
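That quantum-only task is easy to exhibit directly. In the sketch below (plain numpy, not Marletto's information-media formalism), a CNOT gate acting on the product state |+⟩|0⟩ produces a Bell pair, and the entanglement shows up as one full bit of entropy in either qubit's reduced state:

```python
# The entangling task in miniature: CNOT on |+>|0> yields a Bell state.
# Entanglement is quantified by the von Neumann entropy of one qubit's
# reduced density matrix: 0 bits for product states, 1 for a Bell pair.

import numpy as np

ket0 = np.array([1.0, 0.0])
plus = np.array([1.0, 1.0]) / np.sqrt(2)

CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = CNOT @ np.kron(plus, ket0)  # (|00> + |11>) / sqrt(2)

# Reduced density matrix of qubit A: trace out qubit B.
psi = state.reshape(2, 2)
rho_a = psi @ psi.conj().T

eigvals = np.linalg.eigvalsh(rho_a)
entropy = -sum(p * np.log2(p) for p in eigvals if p > 1e-12)
print(f"entanglement entropy = {entropy:.3f} bits")  # 1.000
```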

The thought experiment goes as follows. You have two qubits, which could in principle be entangled with each other. That means their states could be correlated in a way that allows them to have an apparently non-local effect on each other. That entanglement has to be initially induced by a local interaction - the qubits must come into contact. Or it could be induced by a chain of contact - one qubit to another to another to our final qubit. From the definitions of what an information medium is, Marletto argues that this chain of quantum elements is equivalent to a quantum field. And she argues that only a "superinformation medium" - aka a quantum field - could mediate the entanglement of two spatially separated qubits.

Therefore, if we could design an experiment showing that gravity can induce entanglement between separated qubits, then gravity must have quantum properties. This is cool because it gives us an experimental test of quantum gravity that has absolutely no dependence on a particular theory of quantum gravity. It doesn’t need the dynamical laws of such a theory, or even of quantum mechanics or general relativity as they currently stand. At the very least, it pares down the facts we need to assume about the mechanistic theory in order to make the argument - in this case, just the way informational elements interact, say, by entanglement.
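For a rough feel of how such a test works, here's a back-of-the-envelope sketch in the spirit of the Bose-Marletto-Vedral proposal: two masses, each in a superposition of two positions, pick up branch-dependent phases from their mutual gravitational attraction, and those phases leave the pair entangled. Every mass, distance, and time below is invented for illustration.

```python
# Gravitationally induced entanglement, back of the envelope: each of
# the four position branches accumulates a phase from Newtonian
# gravity, phi = G m^2 t / (hbar d). If the phases don't factorize,
# the two-"qubit" state is entangled. All parameters are invented.

import numpy as np

G, HBAR = 6.674e-11, 1.055e-34
m = 1e-14  # kg, a mesoscopic test mass (illustrative)
t = 2.5    # s, interaction time (illustrative)

# Separations between the branch pairs (L/R positions of each mass), in m
d = {("L", "L"): 250e-6, ("L", "R"): 450e-6,
     ("R", "L"): 150e-6, ("R", "R"): 250e-6}

def phase(dist):
    """Phase accumulated by one branch of the superposition."""
    return G * m**2 * t / (HBAR * dist)

# Equal superposition over the four branches, each tagged with its phase.
branches = [("L", "L"), ("L", "R"), ("R", "L"), ("R", "R")]
amps = np.array([np.exp(1j * phase(d[b])) / 2 for b in branches])

psi = amps.reshape(2, 2)
rho_a = psi @ psi.conj().T
eigs = np.linalg.eigvalsh(rho_a)
entropy = -sum(p * np.log2(p) for p in eigs if p > 1e-12)
print(f"entanglement entropy = {entropy:.4f} bits")  # ~0.01: small but nonzero
```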

David Deutsch has said that constructor theory may be the most fundamental way to describe reality - in which case the rest of physics could be derived from constructor theory once it’s properly developed. That’s a lofty claim, but consider how previous great theories emerged. They came from deductive reasoning - the application of crystal-clear logic - to the most stripped-down facts about the world. Einstein’s general theory of relativity came from asking what were the inevitable implications of simple statements like the equivalence principle and the invariance of the speed of light. Werner Heisenberg came up with the first version of quantum mechanics by stripping away all but the bare facts about the nature of electron energy levels - including abandoning any pre-existing dynamical laws.

[Figure: the Bohr model, with the nucleus and electrons in set orbitals, beside Heisenberg’s representation of the very same orbitals, now replaced by probability distributions indicating where the electrons are likely to be found.]

Deutsch has called such efforts “antecedents” of constructor theory, which sounds presumptuous, but it gets to the spirit of the effort: to formalize the process of applying logical deduction from the barest facts - the counterfactuals - of what is possible and what is impossible within this physical spacetime.
