Gravity is unlikely to be the cause of quantum collapse, experiment suggests (sciencemag.org)
127 points by tdhttt on Sept 7, 2020 | hide | past | favorite | 95 comments


Gravity was never a leading explanation for collapse; it was always a fringe idea.

Gravity is local and apparent collapse is non-local; it doesn’t even pass the smell test.


The first paragraph seems to be pretty much all misleading or wrong:

It’s one of the oddest tenets of quantum theory: a particle can be in two places at once—yet we only ever see it here or there. Textbooks state that the act of observing the particle “collapses” it, such that it appears at random in only one of its two locations. But physicists quarrel over why that would happen, if indeed it does. Now, one of the most plausible mechanisms for quantum collapse—gravity—has suffered a setback.


As soon as it says "here or there" the writer is clearly trying to make it binary again. It isn't "here or there" but a probability distribution over those two "places." There's an infinite number of possible "locations," but we narrow it down to a large number of decimal places because otherwise we wouldn't be able to make use of it at all. If things were as written here, you could do all quantum calculations with a straight-up coin toss (although some might argue that you can with an adequately tiny coin, but that only gets at the most basic notion of "spin" and only becomes an accurate analogy if the coin is flipped randomly in zero-g).

Our higher level concept of gravity (at least in the classical sense) has been known to break down to the point that it doesn't apply to quantum mechanics for quite a long time now. The writer should just google "quantum gravity" to discover what a complicated subject that becomes.


Indeed.

The weird thing is that the writer of that article has apparently written a popular science book about quantum mechanics too: https://us.macmillan.com/books/9780374536619


Never underestimate the damage an editor can do to a correctly written piece of prose.


Collapse is only mysterious to people who never learned thermofield theory, or non-equilibrium quantum dynamics. Unfortunately this is also a sizable fraction of all physicists. Thus we have articles like this :/

Basically all the work in getting large scale quantum computers to work lies in these fields. They will continue to grow.


I am not a physicist but isn't it an instance of "if you think you understand quantum mechanics, you don't understand quantum mechanics"?

The measurement problem, which deals with collapse, is still unsolved. It is also the reason there are so many weird interpretations of quantum mechanics and why even top physicists can't agree on one. In practice, those same physicists tend to sweep the problem under the rug and instead focus on the equations, which, to be fair, have done a lot more for science and technology than attempts to solve the measurement problem.


There is no measurement problem. It so happens that most measurement apparatuses are macroscopic and at a temperature much higher than the quantum gap of the system being probed. Measurements in QM imply a rotation between systems, and mixture with an incoherent thermal state means a quantum superposition will lose coherence. Full stop. This is all understood in full; grab any text on non-equilibrium quantum dynamics. One can even simulate the collapse process in full detail and obtain superior agreement with experiments.
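A minimal sketch of that loss of coherence (my toy example, not from any of the texts mentioned): a single qubit repeatedly hit by a phase-damping channel, the simplest stand-in for contact with an incoherent environment. Populations survive; the off-diagonal coherence that encodes the superposition decays away.

```python
# Phase-damping (dephasing) channel on one qubit:
# rho -> (1 - p) rho + p Z rho Z.
# Diagonal entries (populations) are untouched, while each
# off-diagonal coherence shrinks by a factor (1 - 2p) per step.
def dephase(rho, p):
    z = [1.0, -1.0]  # diagonal of the Pauli Z matrix
    return [[(1 - p) * rho[i][j] + p * z[i] * rho[i][j] * z[j]
             for j in range(2)] for i in range(2)]

rho = [[0.5, 0.5], [0.5, 0.5]]  # the pure state |+><+|
for _ in range(50):
    rho = dephase(rho, 0.1)

print(rho[0][0], abs(rho[0][1]))  # populations stay 0.5; coherence ~ 0
```

After 50 steps the off-diagonal element has shrunk by 0.8^50, so the initially pure |+> state is, for all practical purposes, a classical mixture.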

If the measurement apparatus is coherent, measurements can be performed without collapse. This was well thought out even within Wigner's and Einstein's lifetimes; c.f. the Elitzur-Vaidman bomb tester.


This is certainly my view as a physicist, although I'm not the greatest physicist by any measure. QM can give us a perfectly clear picture of "collapse", although it's not a trivial matter and it's hard to explain to laypeople. Especially when physics fans have these ideas about ~conscious observers collapsing the wavefunction~, or something.


It depends on what one calls the measurement problem.

This solves the "consistency/small problem", i.e. treating the macroscopic apparatus as boolean is justified.

It doesn't resolve the "outcome problem", i.e. which outcome is selected. Of course if you accept the world is not deterministic this isn't really a problem.


One could argue even classical mechanics isn't deterministic as we think of it, because of chaos, which has fascinating connections with QM. W. Hoover (of Nosé-Hoover thermostat fame) did some great work with reversible thermostats exploring the instability of Newton's equations of motion.
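The instability in question is easy to see numerically. A toy illustration (my example, using the textbook logistic map at r = 4 rather than Hoover's thermostats): two trajectories starting a trillionth apart diverge to order-one separation within a few dozen steps.

```python
# Sensitivity to initial conditions in the chaotic logistic map x -> 4x(1-x).
# The per-step stretching averages a factor of 2, so a 1e-12 difference
# in initial conditions is amplified to order one in about 40 iterations.
x, y = 0.3, 0.3 + 1e-12
gap = 0.0
for _ in range(60):
    x = 4 * x * (1 - x)
    y = 4 * y * (1 - y)
    gap = max(gap, abs(x - y))
print(gap)  # order-one separation despite near-identical starts
```

Any fixed measurement precision on the initial condition is exhausted after a few dozen steps, which is the sense in which the classical dynamics stops being predictable in practice.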


I think that's different. Chaos still uses classical probability and the randomness is just ignorance of underlying initial conditions. This is very different from QM.


You'd be surprised. You should read about many-body localization and the eigenstate thermalization hypothesis.


I have. They still don't make Quantum Theory and Chaos similar. Rather for some systems QM can motivate ergodicity as well as classical chaos can. However that doesn't mean Quantum Probability and Classical Probability are alike simply because they give similar behaviour for certain systems for one specific limit. Their representation theory is completely different.

Chaotic systems don't have a Kochen-Specker or PBR theorem.


What do I measure of a quantum system if nothing collapses?

Honest question.


Measurement is a transfer of state information from the subsystem being measured to a measurement device, caused by an interaction (e.g. a quantum bus is a measurement). Just as in the classical situation, further dynamics can depend on the final state of the measurement device. The only difference is that all measurement states will be involved in the quantum measurement device, and all possibilities of dynamics conditioned on the measurement are explored. At any point in the future the whole chain of events can be collapsed, if the measurement device interacts with a classical incoherent object.


The first quote dates to the 1960s before decoherence was understood.


Decoherence doesn't really solve these issues. It gives you an approximately diagonal density matrix for the macroscopic degrees of freedom, but: (a) Not exactly diagonal (b) There isn't a unique decomposition of the macroscopic density matrix even after decoherence thus it cannot be taken as simply ignorance of some set of macrostates.

You need something stronger, namely superselection or irreversibility.


That's correct, but there are numerous good methods to propagate irreversible dynamics in QM. Many of these are exact in the limit of a non-interacting bath of bilinearly coupled oscillators, which is sufficient to describe measurement collapse. There's no mystery or inexact prescription to such a simulation of a collapse process. It's just complicated.


> There's no mystery or inexact prescription to such a simulation of a collapse process. It's just complicated.

Then there are a lot of Nobel prize winning physicists who would love to be enlightened about how simple the mystery actually is.


You are really missing the point - the details of how a measurement happens with specific instruments is not what the measurement problem is about.

The issue is linear evolution means the measurement of a superposition leads to a superposition of measurement devices. If the quantum state is real that gives you many worlds.

If you are suggesting there is nonlinear evolution, well a) it must be non-local and b) the theoretical research suggests it would be inconsistent. QM is a very rigid theory - “an island in theory space”. It isn’t easy to slightly modify.


> The issue is linear evolution means the measurement of a superposition leads to a superposition of measurement devices. If the quantum state is real that gives you many worlds.

And that's problematic because? Because it explains away the whole measurement problem, there is nothing to explain, it's an artifact of a macroscopic observer's point of view?

It's like saying that SR/GR with their space-time continuum being real is problematic, so let's keep to the (post-)Newtonian point of view, but oh no, it now has all these weird amendments and additional terms, and when you try to extend it to the whole of the Universe, it breaks down or gives really weird stuff. Well, duh, of course it does: if one tries to stretch an owl over a globe, it simply won't fit.


Well, you don't need nonlinear evolution to get what alpineidyll3 is saying. It's sufficient for the algebra of macro-observables to be commutative. This allows the evolution to be linear and have no interference terms.

The "QM is an island in theory space" idea isn't strictly true either. QM is one among an entire family of probability theories. It's only rigid when considered purely from the point of view of probability theories based on vectors in Hilbert space. Considered as part of operational probabilistic theories (OPTs) in general, there's nothing that makes it difficult to modify.


Is it confirmed that decoherence is the answer to collapse? I strongly like the idea of decoherence, but IIRC it still needs a few decades to settle.


>Collapse is only mysterious to people who never learned thermofield theory, or non-equilibrium quantum dynamics.

Oh yeah? Then what is the physical explanation for collapse?


What is the physical evidence that collapse actually happens?

News flash: nobody has ever produced any.

On the contrary, we have many lines of evidence that an observer described by quantum mechanics should, upon observing a quantum experiment, be thrown into a superposition of observers, each of which appears to have observed collapse. The notion is utterly repugnant to our biases, so many reject the idea out of hand.

But as we create ever more complex but controlled systems, we can perform ever more elaborate experiments verifying that quantum mechanics works exactly as predicted. At some point if we take seriously the idea that the most successful scientific theory of all time is an accurate description of ourselves, then we have to accept that perhaps there is no collapse after all.


I happen to agree with you that the observer, being a quantum system, must get entangled with the observed system, but that still doesn't explain probabilities. If you prepare a system - say sqrt(1/3) spin-down + sqrt(2/3) spin-up - and then observe it, repeatedly, your subjective experience is that you saw spin-down 1/3 of the time and spin-up 2/3 of the time. I don't understand how purely unitary evolution can explain this. Does it?
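The tension is easy to state concretely. In this sketch (my example) the Born rule is inserted by hand, squaring the amplitudes to get probabilities; nothing in the unitary dynamics itself supplies the sampling step below, which is exactly the gap the comment is pointing at.

```python
import math
import random

# Reproducing the subjective statistics for sqrt(1/3)|down> + sqrt(2/3)|up>.
# The Born rule (probability = |amplitude|^2) is put in by hand here;
# unitary evolution alone never performs this sampling step.
amp_down, amp_up = math.sqrt(1/3), math.sqrt(2/3)
p_down = amp_down ** 2  # Born rule: 1/3

random.seed(0)
n = 100_000
downs = sum(1 for _ in range(n) if random.random() < p_down)
print(downs / n)  # close to 1/3
```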


> I don't understand how purely unitary evolution can explain this. Does it?

What's the alternative? Assuming unitary evolution and some fairly common-sense axioms about how we'd expect subjective experience to behave (things like: we never experience being in a branch that has amplitude zero; if we experience being in a given branch then we continue to be in that branch), the Born probabilities are the only model anyone's ever come up with for how our subjective experience should go. So what's there to explain?


The alternative is non-MWI theories, which typically introduce the Born rule via new axioms.

Regarding what's to explain, it's quantum randomness (which distills the Born rule objection). Our subjective experience is that we see spin-down 1/3rd of the time, and our theories say the result is otherwise impossible to predict, even in principle. But a deterministic theory cannot produce a random outcome, even a subjective one.


> Our subjective experience is that we see spin-down 1/3rd of the time, and our theories say the result is otherwise impossible to predict, even in principle. But a deterministic theory cannot produce a random outcome, even a subjective one.

Whyever not? What else would you expect the subjective experience of being in a state like sqrt(1/3)|x> + sqrt(2/3)|y> to be like?


There's a continuous infinity of alternate basis expansions. There's no reason to think you'd experience along that particular basis.


What would you expect "experiencing along those other bases" to look like? If you expand along a different basis you just get something like: half a chance of experiencing (sqrt(1/3)|x> + sqrt(2/3)|y>), and half a chance of experiencing (sqrt(1/3)|x> + sqrt(2/3)|y>), so it amounts to the same thing.


Going from your last post.

> the structure of the wavefunction is that it divides cleanly into those two branches, and that's true in any basis.

It's not. It only has this Schmidt decomposition in one basis. In other bases there will be cross terms among the basis elements. What you're doing is privileging Schmidt bases as ones that give experiences. In another basis with states w,z say the state will be: |w>|w> + |z>|z> + |w>|z> + |z>|w>

So you won't be able to give this clean "experience" reading unless you posit we can't experience in things like the w,z basis here and only in Schmidt bases, but then you run into the problem that for real macroscopic systems they won't admit a Schmidt basis.

This seems like the kind of a "vague" Many Worlds where one doesn't look any deeper than pretending a macro-device is a qubit (e.g. no thermal states etc) and looking at one basis. There's a reason properly developed MWI is nothing like this such as the Spacetime State realism of Wallace and Timpson.

Why one would believe in quantum state realism at all is a separate question.

>Of course you can

No you can't, it's a direct consequence of the Kochen-Specker theorem. If the device is treated quantum mechanically and it enters an entangled state of the form you gave then you cannot perform conditioning as the Kochen-Specker theorem, via the non-uniqueness of Hilbert space orthogonal decompositions, prevents an unambiguous formulation of Bayes's law. I can link to papers proving this if you wish.

The fact that we do experiments where we can condition is, in light of this theorem, a demonstration that our measurement devices do not enter into the kind of CHSH states you're giving.


> It's not. It only has this Schmidt decomposition in one basis. In other bases there will be cross terms among the basis elements. What you're doing is privileging Schmidt bases as ones that give experiences. In another basis with states w,z say the state will be: |w>|w> + |z>|z> + |w>|z> + |z>|w>

The state's evolution will be completely equivalent to (a linear superposition of) the evolution of |x>|x> and |y>|y>. That's a physically observable fact that's independent of your choice of basis (it's less obvious in the |z>/|w> basis, but it's still true).

Any physically valid concept of "experience" would have to behave the same way. If your state is equivalent to a linear superposition of "experiencing x" and "experiencing y" then it can be characterised completely in terms of "experiencing x" and "experiencing y", and that's not dependent on your choice of basis (though it may be easier to see in one basis or another).

> No you can't, it's a direct consequence of the Kochen-Specker theorem. If the device is treated quantum mechanically and it enters an entangled state of the form you gave then you cannot perform conditioning as the Kochen-Specker theorem, via the non-uniqueness of Hilbert space orthogonal decompositions, prevents an unambiguous formulation of Bayes's law. I can link to papers proving this if you wish.

> The fact that we do experiments where we can condition is, in light of this theorem, a demonstration that our measurement devices do not enter into the kind of CHSH states you're giving.

I don't know what you're trying to claim here. All the available evidence is that measurement devices, being ordinary physical objects, follow the laws of quantum mechanics, and that includes conditioning behaving as entanglement; if you've got evidence that that's not the case then a Nobel prize awaits. Non-uniqueness is a red herring, because choice of basis does not and cannot change experimental predictions; the basis exists only in the map, not the territory.


Continuing from the longer post below.

The device has to have its contextual observable algebra develop a non-trivial center, not just be entangled, as is mentioned in section 5 of the paper I linked. It's well known that entanglement alone isn't enough, which again is why mere entanglement has been called "pre-measurement" since the 1980s.

Note how this involves hard mathematics, not vague talk about "obvious features of subjective experience". I'll also note that this is a general feature of discussions about this stuff among non-physicists online, especially programming communities like this one, the knowledge is stuck in the late 1970s.


> The state's evolution will be completely equivalent to (a linear superposition of) the evolution of |x>|x> and |y>|y>. That's a physically observable fact that's independent of your choice of basis (it's less obvious in the |z>/|w> basis, but it's still true).

Of course the state can be written in the form |xx> + |yy>. I never denied that. The point is that it can be written in other forms. So it's equally correct to say we'd "experience" |zz> + |ww> + |zw> + |wz> as to say we'd experience |xx> + |yy> so there's no reason to say we'd "obviously" experience only the latter. Your argument is just "that expansion is always available", but since other expansions are also always available I don't see what the force of this argument is.

Even worse, in QFT there isn't an expansion of the form |xx> + |yy> available, due to the Reeh-Schlieder theorem, so your whole construction is moot anyway. Again, where is this paper deriving the Born rule from unitarity and basic facts about subjective experience?

> I don't know what you're trying to claim here. All the available evidence is that measurement devices, being ordinary physical objects, follow the laws of quantum mechanics, and that includes conditioning behaving as entanglement.

I'm claiming a consequence of a well known theorem from Quantum Probability. See section 4.2 of this paper https://arxiv.org/abs/1310.1484

Quantum states without superselection (e.g. the entangled states of the form you are considering) leave Bayesian conditioning undefined. As the paper mentions this is a direct consequence of the Kochen-Specker theorem via non-unique orthogonal expansion. It's not a red herring but a rigorously proved theorem.

I don't know what the "Nobel prize" remark is about as it is well known that entanglement doesn't give well-defined conditioning. That's why entanglement with the device alone is called "pre-measurement" in most papers in measurement theory following terminology introduced by Zurek in the early 80s. A good example of the issues with pre-measurement alone is here https://arxiv.org/abs/2003.07464. You can't just treat the device as simply entering some CHSH or GHZ style entangled state and think that solves everything about measurement. It doesn't via the theorem I gave in the paper above (and other issues).


> Of course the state can be written in the form |xx> + |yy>. I never denied that. The point is that it can be written in other forms. So it's equally correct to say we'd "experience" |zz> + |ww> + |zw> + |wz> as to say we'd experience |xx> + |yy> so there's no reason to say we'd "obviously" experience only the latter. Your argument is just "that expansion is always available", but since other expansions are also always available I don't see what the force of this argument is.

If there's a simple description of the wavefunction that's valid then there should be a correspondingly simple description of our experiences that's valid. The fact that there's also a more complicated valid description of the wavefunction is neither here nor there. It's like looking at a basket of 4 apples and asking why your experience doesn't correspond to there being 6 - 2 apples.

> Quantum states without superselection (e.g. the entangled states of the form you are considering) leave Bayesian conditioning undefined. As the paper mentions this is a direct consequence of the Kochen-Specker theorem via non-unique orthogonal expansion. It's not a red herring but a rigorously proved theorem.

Ok, I take your point, saying that we can just condition is overly flippant: if there are cross terms (i.e. entanglement) then classical conditional probability doesn't always accurately describe the behaviour of a system, and of course that's true for a system that includes experimenters inside it. But if we treat an experimenter's conditioning as creating entanglement, like any other QM interaction, and treat the subsequent evolution of the system quantum-mechanically, then there's no problem.

> A good example of the issues with pre-measurement alone is here https://arxiv.org/abs/2003.07464. You can't just treat the device as simply entering some CHSH or GHZ style entangled state and think that solves everything about measurement. It doesn't via the theorem I gave in the paper above (and other issues).

That paper amounts to nothing more than redefining "outcome" as something that cannot be in a superposition, and then using this to argue that it makes their unfounded notion of decoherence physically meaningful. If we assume that experimenters are physical systems that can undergo superpositions like any other, then of course Bell-style "no hidden variables" results apply when those variables are the outcomes of experiments. Big whoop. (Would you find the following argument convincing: "Pre-measuring the polarisation of the photon might have one of two possible results, so it doesn't have an outcome according to any reasonable notion of "outcome". Therefore if any observer has measured a photon's polarisation, a physically meaningful process of decoherence must have occurred"? Put like that it's hopefully obvious that this is nothing more than asserting the primacy of the Copenhagen interpretation).

> Note how this involves hard mathematics, not vague talk about "obvious features of subjective experience". I'll also note that this is a general feature of discussions about this stuff among non-physicists online, especially programming communities like this one, the knowledge is stuck in the late 1970s.

Look, I'm not a big fan of credentialism, but I do have a master's in this from a reputable institution. If working physics has found a compelling argument that there's something mysterious about measurement or experience, then that knowledge hasn't made its way as far as even taught postgrad courses, let alone the wider public, and the blame for that has to rest with the physicists. (I rather suspect that there's no such argument that has reached any significant consensus among working physicists, and that the "late 1970s" view in the public sphere reflects that).


Those are the same states so I'm not sure what you mean.

The point is that there is no reason to select out any particular basis over another. You can't just retreat into "well this is the only basis I can experience" because the human sensory apparatus would be able to select out a range of bases in a full unitary account and also the ambiguity of basis decomposition means you can't perform conditioning which we do all the time in experiments.


> Those are the same states so I'm not sure what you mean.

I mean that if you decompose along a different basis than experiencing x/experiencing y, you just get an ensemble of states each of which is a superposition of experiencing x and experiencing y. So you end up with the same thing.

It's like looking at an entangled state (because that's exactly what it is) - if we have a two-particle state like 1/sqrt(2)(|x>|x> + |y>|y>), that behaves like the first particle being in |x> and experiencing the other particle being in |x>, or being in |y> and experiencing the other particle being in |y>, and it might look like that's an artifact of this particular basis decomposition, but it actually isn't - the structure of the wavefunction is that it divides cleanly into those two branches, and that's true in any basis.
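That invariance is easy to check numerically for this particular state. A small sketch (mine, assuming a real rotation of the single-qubit basis): expand (|xx> + |yy>)/sqrt(2) in a rotated basis and verify the cross terms vanish, so the two-branch structure survives the basis change. Note this is special to the maximally entangled case, not a generic feature of entangled states.

```python
import math

# Rotated single-qubit basis: w = cos(t)|x> + sin(t)|y>,
#                             z = -sin(t)|x> + cos(t)|y>.
t = 0.7
c, s = math.cos(t), math.sin(t)
w = [c, s]
z = [-s, c]

# Amplitudes psi[i][j] of the Bell state (|xx> + |yy>)/sqrt(2) on |ij>.
bell = [[1 / math.sqrt(2), 0.0], [0.0, 1 / math.sqrt(2)]]

def coeff(u, v):
    # Amplitude <u v|bell> of the product basis vector |u>|v> (real basis).
    return sum(u[i] * v[j] * bell[i][j] for i in range(2) for j in range(2))

print(coeff(w, w), coeff(z, z))  # both 1/sqrt(2): the two clean branches
print(coeff(w, z), coeff(z, w))  # both ~0: no cross terms in this state
```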

> You can't just retreat into "well this is the only basis I can experience" because the human sensory apparatus would be able to select out a range of bases in a full unitary account

A system that's freely interacting will become entangled; whatever we consider ourself is constantly interacting with the rest of ourself, almost by definition.

> also the ambiguity of basis decomposition means you can't perform conditioning which we do all the time in experiments.

Of course you can, and it works exactly the way you'd expect - we already do experiments where some isolated apparatus inside the experiment does something if it detects one thing and something else if it detects something else. Choice of basis is a tool for understanding the wavefunction, not a physically real thing.


See my reply above. You're just declaring we only experience Schmidt bases for no particular reason. Where are you getting this "clear connection" between experience and the decomposition in one particular basis. Do you have a reference?


Derivations like that don't work; you've just declared it by fiat, but there's no such proof that is known to work.


There's no proof that it's the only possible way - but no-one's ever been able to come up with a concrete alternative.


There are other alternatives such as the derivation of quantum theory within the GPT framework and many other axiomatic derivations.

I've never seen the Born rule derived from unitary evolution and axioms for how subjective experience should work, so I don't even see this as one of the ways.


To paraphrase you, how do we get from probability amplitude to observed frequencies if there is no collapse?

This is where we have to invoke philosophy, specifically how consciousness interacts with time. The common-sense thinking is that our soul is tied to our body and travels forward through time with it. Another way of thinking is that the soul is tied to a given position in space-time-probability. It does not travel. You today are not the same as you tomorrow or yesterday. The you that observes spin up is not the same you as the one that observes spin down. Your soul perceives reality from a randomly chosen vantage point among all the possibilities that contain a compatible body. If we condition on those bodies belonging to experimenters who have observed frequencies, then we get the distribution.

This is one possibility anyway.


No it can't. There have been many attempts and they don't work. The Born rule is independent of unitary evolution. The closest one can get is to declare that the quantum state is fundamentally a statistical object (i.e. the only information in it is observation probabilities) and then with certain assumptions about the size of the state space you can show that the Born rule is the only possible rule for connecting the state to statistics consistent with the unitary dynamics.

So under the assumption that the state encodes probabilities, state space assumptions and consistency with unitary evolution you get the Born rule. However this is not the same as the Born rule arising dynamically from unitary evolution alone.


Isn't your subjective experience just one probabilistic eigenvalue of a particular combination of operators corresponding to your observation? How does unitary evolution break down here?


It's not unitary evolution breaking down, just that the Born rule isn't a consequence of unitary evolution. They're separate independent hypotheses. In most derivations of QM from an axiomatic basis they're consequences of separate combinations of axioms.


Thanks. Do you by chance have a good source for a gentle introduction to axiomatic QM? Undergrad level is fine; I've taken basic QM and worked through Griffiths's intro book on my own, and I've had a lot of math.

I'd love to read more but my google results aren't turning up a good definitive introduction.


Well in the most common family of interpretations "collapse" isn't an actual physical process, just Bayesian updating. So you wouldn't expect to find physical evidence of it in that sense.

It's true that from the perspective of an external superobserver the quantum state evolves to contain a term for each possible observation state of the observer. However, since all interference observables turn out to be non-physical for macroscopic systems, we get a superselection rule, and so the probabilities for different macrostates are classical probabilities and thus reflect simple ignorance of the observer's post-measurement state.

There's very little motivation for reading the quantum state "ontically" in the way you are doing.


But this doesn't answer the question. If you claim that all of these possible observers 'exist', how does this have a physical meaning?

This is what I never understood about MWI, in what physical sense can the many worlds be said to exist? Where are they in our universe? What direction would we have to travel to find them? Do they exert gravity on us? If not, then how can we claim that they exist in a physical sense?


No, you're thinking of MWI all wrong. Your conception of the universe you exist in as being non-quantum is fundamentally flawed. The universe, with its superposition of all the possible observers, exists in Hilbert space. Sean Carroll has even started to put together a model for how spacetime could emerge from that Hilbert space.

So the universes all exist in the same place, since they are the same universe. Your idea of what an observation is, is just an eigenvalue of that corresponding operator.


An object that moves far enough away from us is said to leave the observable universe, because with the continual expansion of space, it or anything it interacts with would have to travel faster than light back toward us in order to have any effect on us. Should we say that objects that leave the observable universe continue to exist? Should we amend our theories to include a new fall-off effect separate from gravity that says things stop existing when they exit our observable universe?


Existence isn't based on something affecting our world, obviously - that's just absurdly self-centered.

But anyway, the other worlds do affect our world; that's why we get interference patterns in double slit experiments.


> "Existence isn't based on something affecting our world, obviously - that's just absurdly self-centered."

This is very unfair. This is a niche field with contested interpretations, don't make people feel stupid for asking fair questions.

It's obvious what the other person meant: what does 'our world' and 'other worlds' mean, and how do you know it's not just a figment of your imagination, as a scientific theory must be falsifiable -> i.e. measurable and provable / disprovable somehow.

You should at least point people to reading material before making fun of them.


I wasn’t making fun of anyone but your “obvious“ reading seems wrong.

I’m pretty sure he suggested that something only exists physically if it has some measurable effect on us.


In physical terms, we do generally define existence that way - for example, we say that time and space didn't exist 'before' the big bang, because there was nothing that could have a position or change. I was thinking of the same notion of existence and how it can be applied to MWI - essentially existence in the physical sense must mean that something is measurable, that it has some effect on the world (perhaps in the past or in the future).


I don't think we do define existence that way. Say you and your friend both go to opposite ends of the visible universe; due to inflation you'll never be able to communicate again.

I suspect most people would say their friend continues to exist. This is very analogous to the many worlds situation.


Thinking about the extreme distances and time spans that entails makes it difficult, and of course relativity has its own "unreasonable" results. Still, they do exist in your past, and they also can assign coordinates in space-time to your current position, even though they are outside your light-cone. On the other hand, you can't meaningfully speak of them existing "now" in relativity, as there is no consistent definition of what "now" means for observers that are space-like separated.

I guess the best answer about MWI is that the other versions of these particles continue to exist at different coordinates in Hilbert space, and that they do interact with each other in observable ways, such as the interference patterns in double-slit experiments.


Speaking as a barely informed enthusiast, we can say they exist in the Occam’s Razor sense that the maths is much less complicated when we assume they do.

I think there’s also an experimental setup, whose name I forget, but which is essentially nested Schrödinger's cat setups: Alice is in a box, Bob is in a box which contains Alice’s box, Carol is outside; Alice goes into superposition of |Alice+> and |Alice->, Bob opens the box and Carol can now demonstrate that Bob is in a superposition of |observing Alice+> and |observing Alice-> instead of the combination of 100%|observing> and a superposition of |Alice+> and |Alice->.


The maths is the same whether we interpret it as many worlds, wave function collapse, or something else.


Well of course. If they were different — or at least if they gave different conclusions — we could rule some of them out.


Different worlds don't exist in extra space or dimension. They are orthogonal quantum states of the whole universe.


Wave function collapse is not an experimentally verified or observed physical phenomenon (so far). It's postulated by some interpretations of QM.

Apparent wave function collapse can be explained using quantum decoherence.


Yes, but at least it's an in-principle testable fringe theory. (And by Penrose!)


For the past several decades Penrose has been a fountain of nothing but frivolous fringe theories. Physicists are like investments: past performance is not indicative of future returns ;P


As long as they are testable, it is good enough.


I agree. There are a (very) few experiments that show the effect of gravity in a system with strong quantum effects.

For example, you can split a beam of neutrons, direct each half through a different path at a different height, and then recombine them and look at the interference pattern. The idea is that the split creates a superposition and each half sits at a different gravitational potential; changing the orientation of the apparatus produces different interference patterns. (The details are in Sakurai's "Modern Quantum Mechanics", pp. 127-129, with data from the experiment of Colella, Overhauser, and Werner (1975).)
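For a rough sense of the size of the effect, the standard phase-shift formula for that setup, Δφ = 2π m² g A λ sin(α) / h² (equivalent to Sakurai's form with A the enclosed area), gives tens of radians for thermal neutrons. The area and wavelength below are round illustrative numbers I'm assuming, not the exact 1975 parameters:

```python
# Rough estimate of the gravitationally induced phase shift in a neutron
# interferometer (Colella-Overhauser-Werner-style setup). The enclosed
# area and neutron wavelength are assumed round numbers, not the exact
# 1975 experimental values.
import math

m_n = 1.675e-27       # neutron mass, kg
g = 9.81              # gravitational acceleration, m/s^2
h = 6.626e-34         # Planck constant, J*s
A = 1.0e-3            # enclosed interferometer area, m^2 (~10 cm^2, assumed)
wavelength = 1.4e-10  # thermal neutron de Broglie wavelength, m (assumed)
alpha = math.pi / 2   # tilt angle: interferometer plane fully vertical

# Phase difference between the two paths; tilting the apparatus (alpha)
# sweeps this phase and shifts the interference fringes.
dphi = 2 * math.pi * m_n**2 * g * A * wavelength * math.sin(alpha) / h**2
print(f"phase shift ~ {dphi:.0f} rad")
```

With these numbers the phase comes out around 55 radians - many full fringes, which is why a 1975 tabletop experiment could see it at all.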

I don't understand why the old experiment was not enough to falsify this theory.


>Gravity was never a leading explanation for collapse; it was always a fringe idea.

Why not? Gravity is (probably) mediated by a particle, and therefore all matter will interact with all other matter ... so why shouldn't gravity therefore cause a collapse?


(Nobody is sure about the graviton, but IMHO it is a good bet.)

Electromagnetism is also mediated by particles (photons) and the quantum states can survive a lot of electromagnetic interactions without collapsing. One of my favorites https://en.wikipedia.org/wiki/Stern%E2%80%93Gerlach_experime...


My second sentence?


What percentage of practicing physicists actually think there is something "missing" in the quantum formalism as it relates to "wavefunction collapse"?

I'm not a physicist, but from the few years of QM I took in college, my take is that there is nothing special about "measurement"; it's just a label we apply to certain states becoming entangled. As long as you don't believe there is anything magical about humans or other "conscious" observers, then there doesn't seem to be anything to figure out about collapse.


> then there doesn't seem to be anything to figure out about collapse.

Then why is quantum state evolution seemingly continuous and unitary some of the time, and sharply discontinuous at other times?


It seems continuous and unitary when the experimenter takes a lot of care to create a simple and very isolated system. When it's brought into contact with the mess that is the rest of the world it changes character very quickly, and interactions with the rest of the world completely swamp the internal dynamics.


Measurement is not just the same as entanglement. If you try that you get paradoxes and you don't match actual observations.

If you treat a measurement as entanglement then by the Kochen-Specker theorem you can't condition on the measurement outcome. However we seem to be able to do this in actual experiments.

Thus measurement is not entanglement alone, but also the elimination of other bases.


I interpret KS backward from what you're saying. From my understanding, KS says that there is no sense in which the experiment involves an objective revelation of hidden state.

After the experiment, the experimenter did not "learn" or "reveal" some objective fact about reality, ie that the true state was UP rather than DOWN. Instead, after the experiment the experimenter becomes entangled with the UP/DOWN system in such a way that the experimenter measured both UP and DOWN, but all observables relating to the experimenter are either wholly consistent with UP or wholly consistent with DOWN.


The lack of noncontextual hidden variables is the main implication of the Kochen-Specker theorem, from which the inability to condition follows. It's not "backward" from the conclusion of no noncontextual hidden state, it's just another consequence thereof.

So the Kochen-Specker theorem says there was no pre-existent noncontextual state for the particle. However that doesn't in any way imply the particle was both UP and DOWN or that the device measured both UP and DOWN. Especially for the device as the Kochen-Specker theorem is proved in a context where observable outcomes are assumed to be single-valued.

However in real practice we can condition on the states of our devices following measurement, hence they don't seem to be susceptible to a Kochen-Specker result when viewed as the system for some "super"-observational device. Which they would be if they simply entered an entangled state. Thus the assumption of measurement as simple entanglement does not match actual observed reality. This is a point made in many texts such as those of Schlosshauer, Omnès, Peres and at a very rigorous level in the theory of C* algebras and Category theory by Fröhlich and Landsman respectively.


In the lab we do measure things and we do observe wavefunction collapse. There is no satisfying explanation why "us becoming entangled with the quantum system" or whatever explanation one chooses looks like collapse to us.


I've always been content to know that after measurement, the quantum system that includes both me and the device has a certain quantum state which is a superposition of all the measurement eigenstates, but by construction that state couldn't be measured in such a way that I would have ever experienced an inconsistency.



Nice. It's an actual experimental result.


I don't understand why we would think wave-function collapse would release radiation.


Lazy evaluation.



It does seem like some aspects of QM point to reality being a simulation that tries to avoid the expense of unnecessary calculations.


There's no work QM is saving: it doesn't delay computation until you physically look at something, and it doesn't get to skip the work for things you don't look at.

* Things that are in superposition would require more computation for all their many alternate versions (most of which aren't meaningfully interacted with), not less, and the extra work for managing superpositions continues forever if MWI is right. All particles are constantly entering new superpositions, not just individual particles in fancy lab experiments. Only the very simplest of particle interactions (like two individual hydrogen atoms interacting) are feasible to simulate with full quantum mechanics on classical computers.

* "Observation" happens through any physical interaction at all, including particle collisions. Outside of some individual particles inside carefully controlled experiments, most particles end up transitively interacting with most other particles near them.

If the universe were at all optimized for the simulation cost of human experience, we probably wouldn't expect there to be trillions of galaxies with hundreds of millions of stars each, for the smallest particles to be on the scale of billion-billionths compared to humans, or for QM to work anything like it does.
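To put a number on "more computation, not less": a classical simulation of n entangled two-level systems needs 2^n complex amplitudes just to store the state. A minimal sketch, assuming complex128 (16-byte) amplitudes:

```python
# Memory needed to store the full quantum state of n two-level systems
# (qubits) on a classical computer: 2**n complex amplitudes at 16 bytes
# each (complex128 assumed). Superpositions make classical simulation
# exponentially harder, not easier.
def state_vector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 50, 300):
    print(n, state_vector_bytes(n))
```

10 qubits fit in ~16 KB, 50 qubits already need ~18 petabytes, and 300 qubits would need more amplitudes than there are atoms in the observable universe - nowhere near macroscopic object scales.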


That time-steps for atomic interactions need to be picoseconds or smaller, while interesting biological reactions (e.g., protein folding) take minutes or longer, is also a PITA.


QM-as-RNG for a simulated universe has been one of my favorite "for fun" pet theories over the years. Finding whether the electron went through slit A or slit B during the double-slit interference tests just feels a bit too much like running a Monte Carlo sim to me. I know it's just my human brain really trying to find an explanation that makes rational sense but it's a fun thought experiment.

I mean, quantum fluctuations during inflationary expansion are (in my understanding) what's responsible for the tiny differences in mass distribution that led to the eventual formation of gas clouds/stars/galaxies. The negative gravity during expansion worked to keep the matter in an extremely low-entropy (highly ordered) state, but those unfathomably small quantum fluctuations were blown up by the same unfathomably massive proportions as everything else.

Universe needed that RNG.


Quantum computers are asymptotically more powerful than classical computers. This means that our world is actually harder to simulate (rather than easier as you seem to suggest) than a classical world would be.


indeed :)


Every measurement (gravitational, electric, magnetic, etc.) causes a collapse of the probabilities. This has nothing to do with quantum mechanics; it is simply statistics/math, not physics.

A simple example: as long as one doesn't look at a coin, the probability that it shows heads or tails is 50% each. After the measurement it "collapses" to 100% for one of the options.

But the "collapse" is only a mathematical "collapse", not a physical one.

Physics only limits how precise and fast your measurement apparatus can be.
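A minimal sketch of the classical-coin analogy described above (purely illustrative: the code models our ignorance about an already-definite outcome, which is exactly the claim being made):

```python
import random

# Classical "collapse" as conditioning: the coin already shows a definite
# face; only our probability assignment changes when we look at it.
def flip_and_look(rng: random.Random):
    coin = rng.choice(["heads", "tails"])             # definite, but unknown to us
    p_heads_before = 0.5                              # our belief before looking
    p_heads_after = 1.0 if coin == "heads" else 0.0   # updated belief after looking
    return p_heads_before, p_heads_after

before, after = flip_and_look(random.Random(0))
print(before, after)
```

Whether this ignorance-based picture carries over to quantum states is, of course, exactly what the rest of the thread (Kochen-Specker, decoherence) is arguing about.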


> "Now, one of the most plausible mechanisms for quantum collapse—gravity—has suffered a setback."

No, it hasn't: stop trying to inject drama where there is none.


[flagged]


Which model do you prefer? Both special and general relativity are clearly much better models of reality than Newtonian physics, even if they break down under extreme conditions.

You're right that string theory and supersymmetry are dubious, but it's not like they're anywhere close to being accepted.


> Tesla was right about Einstein. He was a fuzzy hair crackpot and his mathematical model for the Universe is riddled with contradictions yet the media continually pushes this guy. Why?

Why?

Because it works for a lot of stuff. Example:

We wouldn't have working GPS without it.

But let me admit that if you have a better model I'm all for that.
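For a sense of why GPS is the standard example, here's a back-of-envelope estimate of the relativistic clock corrections; the orbital radius and constants below are my own assumed round numbers, not taken from any GPS spec:

```python
# Back-of-envelope estimate of the relativistic rate offsets of a GPS
# satellite clock relative to a ground clock. Orbital radius and Earth
# parameters are assumed round numbers.
import math

GM = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
c = 2.99792458e8      # speed of light, m/s
R_earth = 6.371e6     # mean Earth radius, m
r_gps = 2.6561e7      # GPS orbital radius (~20,200 km altitude), m
day = 86400.0         # seconds per day

v = math.sqrt(GM / r_gps)                # circular orbital speed, ~3.9 km/s
sr_per_day = -(v**2 / (2 * c**2)) * day  # special relativity: moving clock runs slow
gr_per_day = (GM / c**2) * (1 / R_earth - 1 / r_gps) * day  # general relativity: higher clock runs fast

net = sr_per_day + gr_per_day            # net offset per day, seconds
print(f"SR: {sr_per_day * 1e6:+.1f} us/day, "
      f"GR: {gr_per_day * 1e6:+.1f} us/day, "
      f"net: {net * 1e6:+.1f} us/day")
```

The net offset comes out around +38 microseconds per day; left uncorrected, that corresponds to roughly 38e-6 s × c ≈ 11 km of accumulated ranging error per day, so GPS would be useless within hours without relativity.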


I read it as "Gravity is unlucky".


Why in the hell did I get downvoted for this comment?

The world is full of freaks.



