notes-science-physics-quantumPhysicsAndReleaseConsumeOrdering

i don't have an exact analogy here, just some related ideas.

we know from quantum erasure's effect on quantum computation that quantum computing is fundamentally about information, specifically about what is 'known' (which i here define as: what could affect events) at each point of possible observation.

consider two measurement events A and B outside of one another's light cone, which yield measured values a and b. We'll also call the people conducting the measurement A and B.

we know from relativity that different observers may observe A and B in different orderings.

we know from the Bell experiment that if A and B are measuring an entangled object (and they are freely choosing which of the ortho 'attributes' to measure), one of these events can affect the other, in the sense that if one had yielded a different measurement, then the other would have yielded a different measurement. (i wonder if the symmetry in this concept of causation is related to the 'symmetry' of the unitarity requirement of the quantum computation matrices?)

note the observation by my friend RF that this notion of 'affects the other' does not imply a linear ordering in time, and that indeed as we noted earlier, by relativity we cannot have a linear ordering.

so we have two events for which there is no ordering. but the universe is okay with that, because it's not like measurer B can get the results of measurement A and make use of that information to affect decisions about which ortho attribute to measure. It's true that later, when measurers A and B get together, they can put the information about which ortho attributes they had chosen together and realize that there was some surprising causation between them. but by that time it's too late for that to affect their choices that determined events A and B.

so now consider 'relaxed' memory ordering in C inter-thread communication. The idea is that the hardware (and compiler) is allowed to reorder reads and writes in any way, so long as the order of values read from any single atomic variable is preserved (note that no ordering structure across different variables is required to be preserved):

http://en.cppreference.com/w/c/atomic/memory_order (we're using the example under the heading 'Relaxed ordering', but the same example applies to 'Release-Consume ordering'):

" For example, with x and y initially zero,

Thread 1: r1 = atomic_load_explicit(y, memory_order_relaxed); atomic_store_explicit(x, r1, memory_order_relaxed); Thread 2: r2 = atomic_load_explicit(x, memory_order_relaxed); atomic_store_explicit(y, 42, memory_order_relaxed);

is allowed to produce r1 == r2 == 42. "

So, imagine threads 1 and 2 are people.

From 2's perspective, first 2 receives a letter from 1 with the contents '42'. Then 2 sends 1 a letter with the contents '42'. But 2 would have sent that same letter regardless of what 2 had received.

From 1's perspective, first 1 receives a letter from 2 with the contents '42'. Then 1 sends 2 a letter with the contents '42'.

The paradox is that, from 2's point of view, the cause of the letter they received earlier is an action they took later.

This would seem like it would allow a kill-your-grandfather paradox. But it does not; release-consume ordering allows this ONLY because the letter that 2 sent was not dependent on (caused by) the letter they received earlier.

Another metaphor for what happens from 2's point of view: First 2 receives a letter from 1, but 2 does not open the envelope. Then 2 sends 1 a letter with the contents '42'. Then at some point later 2 opens the envelope and finds that the contents of 1's letter were '42'.

So now you see the similarity with quantum physics. Not opening the letter seems similar to not observing/measuring the entangled object, and also somewhat similar to the fact that A and B can't pass information via their choice of ortho 'attribute' to measure; therefore 'it doesn't matter' which event occurred first.

In other words, much like a computer executing a multithreaded C program running with Release-Consume ordering, the universe allows itself to leave events unordered as long as there is not a linear causation relation from one to the other.

In fact, one might say that, at the time the object became entangled, its state was chosen based on what A and B did in the future. This would then be a close match to the metaphor in which Thread 1 received a letter whose contents were determined by Thread 1's own future actions, provided that Thread 1 didn't peek inside it until after they acted. In the physics case, however, the 'action' that has to occur 'before peeking' is actually selecting 'which way to peek', e.g. whether to open the envelope by tearing it or with scissors. So there's no chance that you could 'peek' before the action.

Following the release-consume metaphor, we might say that the underlying computational architecture responsible for computing (simulating) the laws of physics defers the computation of the state of the entangled particle until it is needed, i.e. when it is measured. This is like choosing non-locality in Bell's theorem, because this result must be 'instantly' propagated between events A and B. We might try to say this is no problem: under the computing-architecture metaphor, the difficulty of FTL information transmission is only a within-simulation problem, and the computer doing the simulating can take as long as it likes in between producing the results of computations for two events that are close in time within-simulation. However, that misses the point, as in that case the whole underlying computational architecture is unknowable and irrelevant. I would prefer to say: assume that the 'underlying computational architecture' isn't so underlying, and that it does seem like it has a problem with instantaneous information transmission. But it has a lot of time to compute the answer, because A and B are so far apart.

Even if this is the best way to think about entanglement, to me it leads to the question: if events in time can be unordered provided there is no linear causation relation, and if time is so like space, then why do the locations of events in space have a total ordering? Shouldn't they be unordered in the same way? I'm guessing that they are, somehow, in general relativity.

Another question: imagine that the rules for these things are like the rules for any computer with a limited ability to simultaneously transmit information, and that unordered events are preferred by the universe because they give it more freedom to compute, e.g. as if it had limited computational power and wanted to optimize. Then why is it that when you don't peek inside a quantum computer, it can do things faster? This seems like just the sort of case that a computer would optimize by doing less work, but instead it appears to do more work. So it seems that the notion of what is being optimized is wrong here, and that what the universe hates the most, besides transmitting information really fast, is quantum state collapse. Ideas for metaphors for that: the universe doesn't like having to generate random numbers; state collapse involves branching universes, which is a heck of an expensive copy-on-write; when the state collapses, the universe has to synchronize information about which way it collapsed among a bunch of particles, and this synchronization is more expensive for it than just doing unobserved quantum evolution.