Thanks for posting this, I hadn't seen it before! That was an interesting read.
It confirms my general view of Stephen Wolfram: he's a genius, an excellent writer, and he constantly oversells his work.
What he explains in this essay is that rewrite rules for hypergraphs can produce structures sharing the essential properties of space-time, which is traditionally postulated as the starting point of physical theory. It is certainly not obvious that this is possible, and actually demonstrating it is harder still. So for anyone interested in possible computational frameworks for explaining the universe, this is very important work.
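To give a flavor of what "rewrite rules for hypergraphs" means, here is a deliberately tiny sketch in Python (not Wolfram Language, and not one of his actual rules, just a made-up rule in the same spirit): a hypergraph is a collection of hyperedges, and evolution consists of repeatedly replacing every edge matching a pattern with new edges, creating new nodes as you go.

```python
# Toy hypergraph rewriting: a hypergraph is a list of hyperedges (tuples of
# node ids). The made-up rule replaces each ternary edge (x, y, z) with two
# new edges sharing a freshly created node, so the structure keeps growing.
from itertools import count

_fresh = count(1)  # source of new node ids (an assumption of this sketch)

def rewrite_step(edges):
    """Apply the toy rule to every hyperedge and return the new hypergraph."""
    out = []
    for (x, y, z) in edges:
        w = next(_fresh)                      # brand-new node
        out.extend([(x, y, w), (y, w, z)])    # toy rule: one edge -> two
    return out

graph = [(0, 0, 0)]            # start from a single self-edge
for step in range(1, 6):
    graph = rewrite_step(graph)
    print(f"step {step}: {len(graph)} hyperedges")
```

In the essay, geometric properties such as effective dimension are then read off from the grown graph, for instance from how the number of nodes within n connection-steps of a point scales with n.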
On the other hand, he clearly says that he has not (yet?) found a set of rewrite rules compatible with all the details of our universe as we know it, which in turn implies that at this stage he does not have something one could call a scientific model, that is, something that can be used to make testable predictions.
There is a clear tradition of such work in the corner of theoretical physics that looks at foundations. String theory is famous for attracting lots of enthusiastic contributors in the absence of any testable prediction after decades of intense research.
Time will tell if anything more concrete will come out of this. Wolfram repeatedly points to the biggest obstacle on the way to a computational foundation for physics: computational irreducibility. What this means is, basically, that exploring the consequences of a computational model may require too much time for any practical use, because one would be simulating the universe on a computer that is necessarily much slower than the universe itself. He seems optimistic that he can escape that problem, but it isn't clear to me why.
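For a concrete feel of computational irreducibility, here is a standard toy, elementary cellular automaton rule 30, which Wolfram himself uses as a stock example (sketched in Python; the width and step counts are arbitrary choices of mine): as far as anyone knows, the only way to learn the centre cell after n steps is to actually run all n steps, so the cost grows with n no matter how clever you are.

```python
# Rule 30: each cell becomes left XOR (centre OR right). No known shortcut
# predicts the centre column without simulating every intermediate step.
def rule30_step(cells):
    """One synchronous update on a list of 0/1 cells (periodic boundary)."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ (cells[i] | cells[(i + 1) % n])
            for i in range(n)]

def centre_cell_after(steps, width=201):
    """Simulate `steps` updates from a single 1 in the middle: O(steps * width)."""
    cells = [0] * width
    cells[width // 2] = 1
    for _ in range(steps):
        cells = rule30_step(cells)
    return cells[width // 2]

print([centre_cell_after(t) for t in range(1, 11)])
```

A model of the universe built from rules like this would face the same issue at a vastly larger scale, which is why any predictive use depends on finding parts of the behaviour that are computationally reducible.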