We introduce an intuitive algorithmic methodology for enacting automated rewriting of string diagrams within a general double-pushout (DPO) framework, in which the sequence of rewrites is chosen in accordance with the causal structure of the underlying diagrammatic calculus. The combination of the rewriting structure and the causal structure may be elegantly formulated as a weak 2-category equipped with both total and partial monoidal bifunctors, thus providing a categorical semantics for the full multiway evolution causal graph of a generic Wolfram model hypergraph rewriting system. As an illustrative example, we show how a special case of this algorithm enables highly efficient automated simplification of quantum circuits, as represented in the ZX-calculus.

# Why Does the Universe Exist? Some Perspectives from Our Physics Project

## What Is Formal, and What Is Actualized?

Why does the universe exist? Why is there something rather than nothing? These are old and fundamental questions that one might think would be firmly outside the realm of science. But to my surprise I’ve recently realized that our Physics Project may shed light on them, and perhaps even show us the way to answers.

We can view the ultimate goal of our Physics Project as being to find an abstract representation of what our universe does. But even if we find such a representation, the question still remains of why that representation is actualized: why what it represents is “actually happening”, with the actual stuff our universe is “made of”.

It’s one thing to say that we have a rule or program that can reproduce a representation of what our universe is doing. But it seems very different to say that the rule or program is “actually being run” and is “actually generating” the “physical reality” of our universe.

# The Wolfram Physics Project: A One-Year Update

## How’s It Going?

When we launched the Wolfram Physics Project a year ago today, I was fairly certain that—to my great surprise—we’d finally found a path to a truly fundamental theory of physics, and it was beautiful. A year later it’s looking even better. We’ve been steadily understanding more and more about the structure and implications of our models—and they continue to fit beautifully with what we already know about physics, particularly connecting with some of the most elegant existing approaches, strengthening and extending them, and involving the communities that have developed them.

And if fundamental physics wasn’t enough, it’s also become clear that our models and formalism can be applied even beyond physics—suggesting major new approaches to several other fields, as well as allowing ideas and intuition from those fields to be brought to bear on understanding physics.

Needless to say, there is much hard work still to be done. But a year into the process I’m completely certain that we’re “climbing the right mountain”. And the view from where we are so far is already quite spectacular.

# ZX-Calculus and Extended Wolfram Model Systems II: Fast Diagrammatic Reasoning with an Application to Quantum Circuit Simplification

This article presents a novel algorithmic methodology for performing automated diagrammatic deductions over combinatorial structures, using a combination of modified equational theorem-proving techniques and the extended Wolfram model hypergraph rewriting formalism developed by the authors in previous work. We focus especially upon the application of this new algorithm to the problem of automated circuit simplification in quantum information theory, using Wolfram model multiway operator systems combined with the ZX-calculus formalism for enacting fast diagrammatic reasoning over linear transformations between qubits. We show how to construct a generalization of the deductive inference rules for Knuth-Bendix completion in which equation matches are selected on the basis of causal edge density in the associated multiway system, before proceeding to demonstrate how to embed the higher-order logic of the ZX-calculus rules within this first-order equational framework. After showing explicitly how the (hyper)graph rewritings of both Wolfram model systems and the ZX-calculus can be effectively realized within this formalism, we exhibit comparisons of time complexity vs. proof complexity for this new algorithmic approach when simplifying randomly generated Clifford circuits down to pseudo-normal form, and when reducing the number of T-gates in randomly generated non-Clifford circuits, with circuit sizes ranging up to 3000 gates. The method performs favorably in comparison with existing circuit simplification frameworks, and the causal edge density optimization yields an approximately quadratic speedup. Finally, we present a worked example of an automated proof of correctness for a simple quantum teleportation protocol, in order to demonstrate more clearly the internal operations of the theorem-proving procedure.

# What Is Consciousness? Some New Perspectives from Our Physics Project

## “What about Consciousness?”

For years I’ve batted it away. I’ll be talking about my discoveries in the computational universe, and computational irreducibility, and my Principle of Computational Equivalence, and people will ask “So what does this mean about consciousness?” And I’ll say “that’s a slippery topic”. And I’ll start talking about the sequence: life, intelligence, consciousness.

I’ll ask “What is the abstract definition of life?” We know about the case of life on Earth, with all its RNA and proteins and other implementation details. But how do we generalize? What is life generally? And I’ll argue that it’s really just computational sophistication, which the Principle of Computational Equivalence says happens all over the place. Then I’ll talk about intelligence. And I’ll argue it’s the same kind of thing. We know the case of human intelligence. But if we generalize, it’s just computational sophistication—and it’s ubiquitous. And so it’s perfectly reasonable to say that “the weather has a mind of its own”; it just happens to be a mind whose details and “purposes” aren’t aligned with our existing human experience.

# After 100 Years, Can We Finally Crack Post’s Problem of Tag? A Story of Computational Irreducibility, and More

## “[Despite] Considerable Effort… [It Proved] Intractable”

In the early years of the twentieth century it looked as if—if only the right approach could be found—all of mathematics might somehow systematically be solved. In 1910 Whitehead and Russell had published their monumental *Principia Mathematica* showing (rather awkwardly) how all sorts of mathematics could be represented in terms of logic. But Emil Post wanted to go further. In what seems now like a rather modern idea (with certain similarities to the core structure of the Wolfram Language, and very much like the string multiway systems in our Physics Project), he wanted to represent the logic expressions of *Principia Mathematica* as strings of characters, and then have possible operations correspond to transformations on these strings.

In the summer of 1920 it was all going rather well, and Emil Post as a freshly minted math PhD from Columbia arrived in Princeton to take up a prestigious fellowship. But there was one final problem. Having converted everything to string transformations, Post needed to have a theory of what such transformations could do.
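
The string transformations at issue can be made concrete with a small sketch. The Python below (an illustration added here, not Post's own notation) simulates a tag system: at each step it reads the first symbol of the string, appends that symbol's associated word to the end, and deletes a fixed number of symbols from the front. The rules shown (0 → 00, 1 → 1101, deleting three symbols per step) are those of Post's original problem of tag.

```python
# Tag system: read the first symbol, append its associated word,
# then delete the first v symbols. Halt when the string is shorter than v.
def tag_step(s, rules, v=3):
    if len(s) < v:
        return None  # the system halts
    return s[v:] + rules[s[0]]

def evolve(s, rules, v=3, max_steps=50):
    """Return the sequence of strings produced, up to max_steps steps."""
    history = [s]
    for _ in range(max_steps):
        s = tag_step(s, rules, v)
        if s is None:
            break
        history.append(s)
    return history

# Post's original 3-tag rules
rules = {"0": "00", "1": "1101"}
print(evolve("10010", rules, max_steps=10))
```

The question Post found intractable is exactly the behavior of iterating `evolve`: whether a given initial string eventually halts, cycles, or grows without bound.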

# Hypergraph Discretization of the Cauchy Problem in General Relativity via Wolfram Model Evolution

Although the traditional form of the Einstein field equations is intrinsically four-dimensional, the field of numerical general relativity focuses on the reformulation of these equations as a 3 + 1-dimensional Cauchy problem, in which Cauchy initial data is specified on a three-dimensional spatial hypersurface, and then evolved forwards in time. The Wolfram model offers an inherently discrete formulation of the Einstein field equations as an a priori Cauchy problem, in which Cauchy initial data is specified on a single spatial hypergraph, and then evolved by means of hypergraph substitution rules, with the resulting causal network corresponding to the conformal structure of spacetime. This article introduces a new numerical general relativity code based upon the conformal and covariant Z4 (CCZ4) formulation with constraint-violation damping (with the option to reduce to the standard BSSN formalism if desired), with Cauchy data defined over hypergraphs; the code incorporates an unstructured generalization of the adaptive mesh refinement technique proposed by Berger and Colella, in which the topology of the hypergraph is refined or coarsened based upon local conformal curvature terms. We validate this code numerically against a variety of standard spacetimes, including Schwarzschild black holes, Kerr black holes, maximally extended Schwarzschild black holes, and binary black hole mergers (both rotating and non-rotating), and explicitly illustrate the relationship between the discrete hypergraph topology and the continuous Riemannian geometry that is being approximated. Finally, we compare the results produced by this code to the results obtained by means of pure Wolfram model evolution (without the underlying PDE system), using a hypergraph substitution rule that provably satisfies the Einstein field equations in the continuum limit, and show that the two sets of discrete spacetimes converge to the same limiting geometries.

# Multiway Turing Machines


*Over the years I’ve studied the simplest ordinary Turing machines quite a bit, but I’ve barely looked at multiway Turing machines (also known as nondeterministic Turing machines or NDTMs). Recently, though, I realized that multiway Turing machines can be thought of as “maximally minimal” models both of concurrent computing and of the way we think about quantum mechanics in our Physics Project. So now this piece is my attempt to “do the obvious explorations” of multiway Turing machines. And as I’ve found so often in the computational universe, even cases with some of the very simplest possible rules yield some significant surprises….*

## Ordinary vs. Multiway Turing Machines

An ordinary Turing machine has a rule such as

```wolfram
RulePlot[TuringMachine[2506]]
```

that specifies a unique successor for each configuration of the system (here shown going down the page starting from an initial condition consisting of a blank tape):

```wolfram
RulePlot[TuringMachine[2506], {{1, 6}, Table[0, 10]}, 20, Mesh -> True, Frame -> False]
```
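
A multiway Turing machine, by contrast, can assign several successors to a single configuration. The Python sketch below (an illustration standing in for the Wolfram Language used in this piece; the rule is invented for demonstration, not machine 2506) tracks the set of all configurations reachable at each step:

```python
# Multiway (nondeterministic) Turing machine sketch: the rule maps a
# (state, scanned symbol) pair to a LIST of (new state, write symbol, move)
# outcomes, so one configuration can have several successors.
# This rule is an illustrative invention, not Wolfram machine 2506.
rule = {
    (1, 0): [(1, 1, +1), (2, 0, -1)],  # two outcomes: multiway branching
    (1, 1): [(2, 1, +1)],
    (2, 0): [(1, 0, -1)],
    (2, 1): [(1, 1, +1)],
}

def successors(state, head, tape, rule):
    """All configurations reachable from (state, head, tape) in one step."""
    out = []
    for new_state, write, move in rule.get((state, tape[head]), []):
        new_tape = list(tape)
        new_tape[head] = write
        out.append((new_state, head + move, tuple(new_tape)))
    return out

def multiway_evolve(init, rule, steps):
    """Return the set of reachable configurations after each successive step."""
    frontier = {init}
    layers = [frontier]
    for _ in range(steps):
        frontier = {
            succ
            for (state, head, tape) in frontier
            if 0 <= head < len(tape)  # drop branches that run off the finite tape
            for succ in successors(state, head, tape, rule)
        }
        layers.append(frontier)
    return layers

layers = multiway_evolve((1, 3, (0,) * 7), rule, 3)
print([len(layer) for layer in layers])  # distinct configurations per step
```

Where an ordinary machine traces out a single path of configurations, here the layer sizes grow as the branches of the multiway evolution proliferate.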

# Growth Functions, Rates and Classes of String-Based Multiway Systems

In the context of the Wolfram Physics Project, a certain class of abstract rewrite systems known as “multiway systems” has played an important role in discrete models of spacetime and quantum mechanics. However, as abstract mathematical entities, these rewrite systems are interesting in their own right. This paper undertakes the effort to establish computational properties of multiway systems. Specifically, we investigate growth rates and growth classes of string-based multiway systems. After introducing the concepts of “growth functions”, “growth rates” and “growth classes” to quantify a system’s state-space growth over “time” (successive steps of evolution) at different levels of precision, we use them to show that multiway systems can, in a specific sense, grow slower than all computable functions while never exceeding the growth rate of exponential functions. In addition, we begin developing a classification scheme for multiway systems based on their growth class. Furthermore, we find that multiway growth functions are not trivially regular but instead “computationally diverse”, meaning that they are capable of computing or approximating various commonly encountered mathematical functions. We discuss several implications of these properties as well as their physical relevance, and we present and exemplify methods for explicitly constructing multiway systems to yield desired growth functions.
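
As a rough illustration of the central definition, the Python sketch below computes one natural version of a growth function — the number of distinct states a string multiway system has reached within t steps. The rewrite rules and the exact counting convention are illustrative assumptions made here; the paper's formal definitions may differ in detail.

```python
# Growth function sketch for a string multiway system: at each step, apply
# every rule at every match position of every frontier string, and record
# how many distinct states have been seen so far. Rules and conventions are
# illustrative, not taken from the paper.
def apply_all(s, rules):
    """Every string obtainable from s by one rewrite at any match position."""
    out = set()
    for lhs, rhs in rules:
        start = s.find(lhs)
        while start != -1:
            out.add(s[:start] + rhs + s[start + len(lhs):])
            start = s.find(lhs, start + 1)
    return out

def growth_function(init, rules, t_max):
    """growth[t] = number of distinct states reached within t steps."""
    seen = {init}
    frontier = {init}
    growth = [len(seen)]
    for _ in range(t_max):
        frontier = {u for s in frontier for u in apply_all(s, rules)} - seen
        seen |= frontier
        growth.append(len(seen))
    return growth

print(growth_function("A", [("A", "AB")], 4))               # linear growth
print(growth_function("AB", [("A", "AB"), ("B", "BA")], 4))  # faster growth
```

Comparing rule sets this way is the simplest version of the paper's program: different rules place the same counting procedure into very different growth classes.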

# Combinators: A Centennial View

*See also:*

**Combinators and the Story of Computation**

*Watch the livestreamed event:*

**Combinators: A 100-Year Celebration**

## Ultimate Symbolic Abstraction

Before Turing machines, before lambda calculus—even before Gödel’s theorem—there were combinators. They were the very first abstract examples ever to be constructed of what we now know as universal computation—and they were first presented on December 7, 1920. In an alternative version of history our whole computing infrastructure might have been built on them. But as it is, for a century, they have remained for the most part a kind of curiosity—and a pinnacle of abstraction, and obscurity.