I’ve been thinking recently that maybe there’s an elegant way of describing Time simply as difficulty. Mathematicians and computer scientists have a charmingly naive term, “hardness,” for describing the relative knottiness of a computational problem. If asked to explain it to a layperson, they will often say something like: “Well, suppose you had a perfect computer. A hard problem takes it more time to solve than an easy one.” Consider the “traveling salesman” puzzle: given n cities, how do you figure out the shortest route by which the salesman can visit all of them? It’s really easy with three or even five cities, but with a few hundred, no computer in the universe could check every possible route in less than billions of years, by which time the salesman would long ago have moldered into atoms. Problems like this are called “NP-hard.” There are even more difficult problems still: undecidable ones, which ask, for instance, whether a given problem has any algorithm that can solve it at all. For those, no procedure, however ingenious, is guaranteed ever to deliver an answer.
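To make that explosion of difficulty concrete, here is a minimal sketch of the brute-force search space; the benchmark of one trillion route-checks per second is an invented figure, generous by any real machine’s standard:

```python
import math

# Brute force must examine (n - 1)! / 2 distinct tours:
# fix the starting city, and count each loop once per direction.
def tour_count(n_cities: int) -> int:
    return math.factorial(n_cities - 1) // 2

# Invented benchmark: a machine checking one trillion tours per second.
CHECKS_PER_SECOND = 1e12
SECONDS_PER_YEAR = 3.156e7

for n in (5, 10, 20, 50):
    tours = tour_count(n)
    years = tours / CHECKS_PER_SECOND / SECONDS_PER_YEAR
    print(f"{n:>3} cities: {tours:.3e} tours, ~{years:.3e} years to try them all")
```

At five cities there are only a dozen tours; at fifty there are roughly 10^62, and even the invented trillion-a-second machine would need on the order of 10^43 years, which is the whole force of the salesman’s predicament.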
If we describe the universe as a computational system (and the fact that all science expresses its conclusions in numerical terms strongly suggests that science “votes with its feet” for that hypothesis), then we can see all entities in the universe as the workings of problem-solving algorithms. The ones that are easily solved have been solved already and have stopped; they constitute the eternal constants of physics, true at every instant through all time, like the instantaneous coexistence of the probabilistic quantum world that David Bohm described. The ones that are a bit harder but still solvable are the deterministic processes in nature that Newtonian science describes. The ones left over constitute the whole world of change and becoming, ranging from chemical reactions through self-cloning living organisms to ourselves, arranged in a nice pyramid of emergent temporal features as described by the great philosopher J. T. Fraser.
If time were a river, there would be nothing to mark its passage unless some parts of the river flowed faster than others. “Hardness” gives us a nice way of measuring which parts flow faster and which slower.
Here’s a nice thought-experiment to illustrate the idea. Physicists usually take the increase of entropy (thermal disorder) as a reliable marker of the passage of time, and thermal disorder reaches our senses as heat. Physics even supplies a bridge between computation and heat: Landauer’s principle holds that erasing a single bit of information must dissipate a small but irreducible minimum of energy as heat. If I am right, a large amount of local computation should be the same thing as a large amount of local time, and a large amount of time should be correlated with an increase of heat. If you are using a laptop, and it’s actually on your lap, you can feel the heat of computation generating time on your thighs.
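For readers who want the arithmetic, here is a minimal back-of-the-envelope sketch of that bridge; room temperature is assumed, and the erasure rate is an invented workload:

```python
import math

# Landauer's principle: erasing one bit of information dissipates
# at least k * T * ln(2) joules of heat, where k is Boltzmann's constant.
k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # assumed room temperature, kelvin

joules_per_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {joules_per_bit:.3e} J per erased bit")

# Invented workload: a machine erasing 1e15 bits every second.
bits_per_second = 1e15
min_watts = joules_per_bit * bits_per_second
print(f"Minimum dissipation at that rate: {min_watts:.3e} W")
```

The floor works out to mere microwatts, while a real laptop draws tens of watts: actual computation runs far above the theoretical minimum, which is why the warmth on your thighs is so unmistakable.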