Took me a minute to wrap my head around it, since that explanation isn't worded very clearly, but then I got it.
That happens because time is a factor in how light from different parts of the object will reach the observer. Light from its far side takes longer and in that time the object continues to move. You can see behind the object, because its rear end moves out of the way of the light coming from itself during the travel time of that light.
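A small numeric way to see the "rotation": for the standard textbook setup of a unit cube moving perpendicular to the line of sight (a simplified check, not a full ray trace), the observer's snapshot contains the length-contracted front face plus the rear face offset by the light delay, and those two widths match the projection of a cube simply rotated by theta = arcsin(v/c):

```python
import math

# Unit cube of side s = 1 moving at speed beta = v/c, perpendicular to the
# line of sight. Compare what a snapshot contains with a rotated cube.

def snapshot(beta):
    front = math.sqrt(1 - beta**2)  # length-contracted front face
    rear = beta                     # rear face offset: light crosses the cube
                                    # in time s/c, cube moves v * s/c aside
    return front, rear

def rotated(beta):
    theta = math.asin(beta)
    # projected widths of the front and side faces of a cube turned by theta
    return math.cos(theta), math.sin(theta)

for beta in (0.5, 0.9):
    print(snapshot(beta), rotated(beta))  # matching pairs, up to rounding
```

The pairs agree because cos(arcsin(beta)) = sqrt(1 - beta^2): the contraction plus the light-delay offset together look exactly like a rotation.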
How does the rear end move out of the way? Wouldn't it be blocking the light? It's not like the object could move out of the way faster than the speed the light is traveling at (that would be FTL)
That's amazing. I've watched lots of videos about length contraction and I don't think any of them ever mentioned this (the shape of an object moving at near light speed won't change to an observer; it will just appear as if it had rotated, instead of being "squeezed" as every video about this seems to imply)!
You rather showed the opposite. While interesting, this video only explains length contraction, not the Terrell–Penrose effect. In this video, the passing spaceship would appear to be rotated to the observer, not just contracted, as, to quote Penrose via Wikipedia, "the light from the trailing part reaches the observer from behind the [spaceship], which it can do since the [spaceship] is continuously moving out of its way"
This nascent series on YouTube, Hypercubist Math, sets out to make four dimensions intuitive to our three-dimensions-accustomed brains. The baseline is just basic calculus, which the inaugural video provides in context:
I actually think the opposite is true. The way I've heard it phrased and explained that makes the most sense to me is "everything moves through spacetime at the same rate" - it's basically the clock speed of the universe. It's just that if you move faster in a space dimension, then your relative movement in the time dimension slows down.
It only seems weird to us because our senses and minds evolved in an environment where things we can perceive never differ by relativistic speeds.
While I do like that intuitive explanation, it falls short of describing other aspects of the universe.
Like how the energy required for an object with mass to approach the speed of light in the spatial dimensions goes to infinity, even though it's already traveling at that speed through spacetime.
Sure, one simple sentence is not going to explain the universe. But, at least from the simple relativity side of things, essentially everything falls out of (that is, it's a consequence of) that simple sentence. I.e. starting from that you can derive other consequences. E.g. "how the energy required for an object with mass to approach the speed of light in spatial dimensions goes to infinity" is actually a direct consequence of that statement: every amount of energy you push into an object with mass causes it to accelerate, but due to the essential "clock speed of the universe", that acceleration is less and less as you approach the speed of light, and thus it takes an infinite amount of energy to reach the speed of light. Another way to think of it is that if it took anything less than an infinite amount of energy to reach the speed of light, then the speed of light couldn't be the universal speed limit, because you could add more energy that would accelerate it further.
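A quick numeric illustration of that divergence, using the standard relativistic kinetic energy E_k = (gamma - 1) m c^2, expressed in units of the rest energy m c^2:

```python
import math

def lorentz_gamma(beta):
    """Lorentz factor for speed beta = v/c."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

def kinetic_energy_ratio(beta):
    """Relativistic kinetic energy in units of the rest energy m*c^2:
    E_k = (gamma - 1) * m * c^2."""
    return lorentz_gamma(beta) - 1.0

for beta in (0.5, 0.9, 0.99, 0.999, 0.999999):
    print(f"v = {beta}c  ->  E_k = {kinetic_energy_ratio(beta):.1f} m c^2")
```

Each additional "nine" of speed costs far more energy than the last, and the required energy grows without bound as v approaches c.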
On the other hand, my understanding is that quantum mechanics is another beast entirely, and one of the biggest problems in physics, and to developing a "theory of everything", is to unify quantum mechanics with general relativity.
Although no single sentence will explain everything, it still handles that first point: increasing the rotation of the momentum vector in spacetime increases mass. The rest follows, since the more mass, the more energy is required to accelerate further.
But if you are interested, a significant amount of the basics of quantum mechanics follow directly from Fourier transforms -- which unfortunately are harder to self-study than spacetime rotations.
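As a taste of that connection, here is a small pure-Python sketch (no particular QM result, just the transform itself): a Gaussian that is narrow in position comes out wide after a discrete Fourier transform, and the product of the two widths stays roughly constant; that is the position-momentum uncertainty relation in miniature.

```python
import cmath, math

# Take a Gaussian of width sigma, compute its discrete Fourier transform
# with a plain O(N^2) DFT, and measure the RMS width of both. Narrow in x
# means wide in k, and the product of the widths stays roughly constant:
# the uncertainty principle is a property of the Fourier transform.

N = 256          # number of samples
L = 40.0         # spatial window, x in [-L/2, L/2)
dx = L / N

def rms_width(values, coords):
    """RMS width of |values|^2 over the given coordinates."""
    w = [abs(v) ** 2 for v in values]
    total = sum(w)
    mean = sum(c * p for c, p in zip(coords, w)) / total
    var = sum((c - mean) ** 2 * p for c, p in zip(coords, w)) / total
    return math.sqrt(var)

def widths(sigma):
    xs = [(i - N / 2) * dx for i in range(N)]
    f = [math.exp(-x * x / (2 * sigma * sigma)) for x in xs]
    F = [sum(f[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
         for k in range(N)]
    # angular frequencies, with the upper half aliased to negative k
    ks = [2 * math.pi * (k if k < N / 2 else k - N) / L for k in range(N)]
    return rms_width(f, xs), rms_width(F, ks)

for sigma in (0.5, 1.0, 2.0):
    sx, sk = widths(sigma)
    print(f"sigma={sigma}: width_x={sx:.3f}  width_k={sk:.3f}  product={sx*sk:.3f}")
```

The product lands near 0.5 for every sigma, the minimum allowed by the uncertainty relation (a Gaussian saturates it).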
I came up with a variant of this that extends the motion vector into “matter” dimensions. Then the logical consequence is that the more matter you have, the less of the unit vector is available for movement in time == time dilation due to mass.
Similarly, only vectors with zero length in the matter direction can have unit length in the space/time direction. This is the “only massless particles move at the speed of light” rule.
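A toy numeric sketch of this unit-vector picture (an informal analogy as described above, not standard four-velocity algebra; the component names are made up for illustration):

```python
import math

# The vector (v_matter, v_space, v_time) is constrained to unit length:
#   v_matter^2 + v_space^2 + v_time^2 = 1

def time_component(v_matter, v_space):
    """Whatever isn't spent on 'matter' or movement through space
    is left over for movement through time."""
    remainder = 1.0 - v_matter ** 2 - v_space ** 2
    if remainder < 0:
        raise ValueError("vector would exceed unit length")
    return math.sqrt(remainder)

# Massless and moving at the speed limit: the whole vector is spatial,
# so there is no movement through time at all.
print(time_component(0.0, 1.0))

# At rest in space but with some 'matter' component: time runs slower
# than for a matter-free observer, echoing time dilation due to mass.
print(time_component(0.3, 0.0))
```

In this picture the "only massless particles move at the speed of light" rule is just the statement that v_space = 1 forces both other components to zero.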
FWIW there are other answers in that Stack Exchange thread that, in my opinion, give a better description of the situation; the primary objection in that particular answer is the definition of "move". Fair enough, but I think it's still a helpful description for laypeople who are not fully versed in the math.
Perhaps nitpicky, and I know you were joking, but I think this is the wrong way to think about it. It's not that the other behavior is undefined; it's that you essentially have all of these functions that use "C" in their definitions, and then you have "#define C ..." in a header file somewhere.
Nah, it's real simple. SR just comes about because you want to keep chemistry working the same on a rocket doing 99% of the speed of light as it does at rest.
Working out all the implications becomes very complex.
But then you probably wouldn't have life to observe it if the simple rules didn't have complex emergent behaviors.
A limit to the speed of causality makes physics so much simpler. Without it you'd need to factor in the interaction of every particle with every other particle in the universe.
Don't you need to do that anyway, because of gravity?
Wouldn't the causality speed limit just cause those gravity interactions to arrive with a time delay rather than being instantaneous?
Which means that to simulate the universe you essentially have to keep a history of how gravity is propagating, which requires keeping more information than if interactions were instantaneous?
In a sense perhaps this applies to light too: since it has a finite velocity, you now have to keep track of how all the photons individually propagate through spacetime, whereas if light traveled instantly this would not be necessary?
EDIT: The advantage I see in a speed limit is that you should be able to compute what happens in a point of spacetime based only on the information that is around that point (which still might have come from any or all other particles in the universe, mind you). For me, this emphasizes how important locality must be and it basically converts the popular "spooky action at a distance" claims into nonsense to me.
I guess that's why I'm a fan of the Many Worlds Interpretation.
Since they have a speed limit you have to keep track of all gravity waves associated with all particles of the universe throughout all time and space.
So all particles still interact with all other particles, all the time, it's just that they do it with a time delay.
If there wasn't a speed limit it would be much simpler because all gravity interactions would be instantaneous and you wouldn't have to keep track of gravity waves.
If light acted instantaneously you would have to calculate the effect of its rays everywhere all at once, which I think is quite expensive given how vast space is.
However, since the speed of light is minuscule compared to the size of the universe, you can ignore all but the most local interactions, and just schedule a computation sometime in the future when you know that the light vector will interact with something.
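The "schedule a computation for later" idea can be sketched as a tiny event queue (units and distances here are rough, invented values purely for illustration, with c = 1 light-minute per minute):

```python
import heapq

# Instead of evaluating light against everything instantly, each emission
# pushes a future event onto a priority queue, timestamped by distance / c.

C = 1.0       # light-minutes per minute
events = []   # min-heap of (arrival_time, source, target)

def emit(time, source_pos, source_name, targets):
    """Schedule the arrival of light at each target for the future."""
    for name, pos in targets.items():
        delay = abs(pos - source_pos) / C
        heapq.heappush(events, (time + delay, source_name, name))

# rough distances from the sun, in light-minutes
targets = {"earth": 8.3, "mars": 12.5}
emit(0.0, 0.0, "sun", targets)

while events:
    t, src, dst = heapq.heappop(events)
    print(f"t={t:.1f} min: light from {src} reaches {dst}")
```

Nothing needs to be computed between emission and arrival; the queue is exactly the "history of propagation" memory cost mentioned earlier in the thread.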
While instant calculations would perhaps make for a simpler system conceptually, the speed limit and locality principle ensures that less processing power is needed (at the cost of a lot of memory).
As someone already mentioned, you would have to account for all the interactions with light -> everywhere at once.
With an effective speed limit in space-time, you can "localize" the computation to the spaces light has reached. And who knows, maybe light can't travel forever; it might just disappear after crossing some distance we still haven't measured (how we'd do that, who knows).
Giving yet more evidence for the "grand simulation" theory. "The universe" is just a group of simulated worlds connected by interacting photon particles (light).
Right, that's what I meant when I said that the advantage of a speed limit is locality. It allows you to compute the next state of a point in spacetime based only on the points around it.
But my point was that this also makes the universe more complex than an alternative fictional universe where information can be accessed instantaneously across any distance (which still allows for distributed computation, if synchronization or lazy computation is possible).
A possible alternative, but what would play the role of locality in such a universe? And how would the universe store the infinite "light matter" in its "memory", since a light beam being instantaneous means there is no limit to where it can reach, and it will grow with distance traveled (which is infinite)?
Processes that are outside the scope of what we can sense seem like too cheap an explanation.
We actually don't know and can't really know what simulates the universe.
But we can deduce from various cues that it is being "simulated".
The double-slit experiment is one, the experience of deja vu another, dreams that partly manifest in reality after some time, the apparent speed limit of light, out-of-body experiences, the fact that we are the only top intelligent lifeform in this part of the galaxy, etc...
All signs of processes and memory "bugging out". Except the last one, that one seems to be by design.
You're conflating a lot of things that ought not be conflated.
Where are mesoscale "bugs" of cells or the like?
Or is it only on the level of quantum effects and errors of the brain, wherein you're overlaying quantum jargon onto another incomprehensibly complex function?
What’s unique about gravity in what you’re talking about? The Coulomb force also applies between every pair of electrically charged particles, right? And with the same inverse-square dependence on distance?
Tangential question: is the speed of causality coincidentally the same as the speed of light? Or are they the same because of some underlying principle that inherently links them?
Correct, the speed of light is actually the speed of causality; we just named it after light because light was the first thing we discovered traveling at c, but many other things in the universe do too, because of the same underlying principle. That is why gravitational waves also travel at c: if you removed the sun instantaneously from the solar system, the Earth would continue to orbit for about 8 minutes, as that is how long light (and any change in gravity) takes to get from the sun to the Earth.
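The "8 minutes" figure is easy to check: it is just the mean Sun-Earth distance (one astronomical unit) divided by c.

```python
# Back-of-the-envelope check of the "8 minutes" figure: the time for light
# (or a change in gravity) to cross the Sun-Earth distance.

C = 299_792_458          # speed of light, m/s (exact, by definition)
AU = 149_597_870_700     # mean Sun-Earth distance, m (IAU definition)

delay_s = AU / C
print(f"{delay_s:.0f} s  =  {delay_s / 60:.1f} minutes")  # about 8.3 minutes
```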
Not an expert, but can't resist chiming in anyway... One thing to think about is: what exactly is causality? There'll be tons of different definitions, but they'll all have one thing in common: events that cause "later" events, and/or events that depend on "earlier" events.
And in a physics sense what is an event? An interaction between two things, right? And since there doesn't exist any force that can interact instantaneously across distance, the speed limit of causality is equal to the speed of our fastest forces.
If we discovered some scifi-esque Tachyon particle that traveled at 2C, we could no longer say the speed of light is the speed of causality.
I'm not an expert either, but I like to think that the speed of causality is the speed at which a piece of information (e.g. a particle, a gravity wave, etc) is traveling through space.
So the speed of light / gravity is essentially the maximum speed of causality, because nothing can travel faster than that.
EDIT: Or you can think in terms of how information propagates through spacetime. In this point of view, the speed of causality is always the speed of light, for everything, including particles with mass.
Yeah that's what I was trying to work towards. Basically that causality is an abstraction, or at least a "higher level" idea. And if we look at the components of it, we can see that interaction between two things is a central part of it. And an interaction between two things in our universe has a maximum bound of the speed of light (and gravity and so on). The speed of causality is just the speed of the fastest thing.
It's the reverse. Light and gravity travel at c because they (at least light) are mediated by particles / systems that have no mass.
Mass is what slows the speed of causality for certain particles. For example, while the photons I emit may travel at c, the massive particles that make up 'me' cannot.
They are the same, because there’s no such thing as the “speed of light”.
Theoretically as I understand it, everything moves at exactly the same speed through space-time, whether it is light, the Earth, etc.
At non-relativistic speeds, this means moving along the time axis at approximately one second per second, with the rest of the movement in space. At relativistic speeds, higher proportions of the “speed” of an object are along the spatial axes.
As the energy requirements for moving massive objects through space at relativistic speeds are huge, we can only really observe this phenomenon with light, which has no mass, and therefore does not need huge amounts of energy to move through space.
As a result, we call 186k miles per second the “speed of light” when actually it is just the maximum speed anything can travel through space, and due to light being massless, it happens to be the speed that light travels through space too.
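This "same speed through spacetime" claim can be checked numerically: the Minkowski norm of the four-velocity, sqrt((gamma*c)^2 - (gamma*v)^2), comes out to c no matter how fast the object moves through space.

```python
import math

C = 299_792_458.0  # speed of light, m/s

def spacetime_speed(v):
    """Minkowski norm of the four-velocity (gamma*c, gamma*v)."""
    gamma = 1.0 / math.sqrt(1.0 - (v / C) ** 2)
    return math.sqrt((gamma * C) ** 2 - (gamma * v) ** 2)

# standing still, a car on the highway, half of c, 99% of c
for v in (0.0, 30.0, 0.5 * C, 0.99 * C):
    print(f"v = {v:14.1f} m/s  ->  spacetime speed = {spacetime_speed(v):,.0f} m/s")
```

Every line prints the same number: as motion through space increases, motion through time (the gamma*c component's excess over the spatial part) shrinks to compensate.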
This is my understanding as well. All massless particles must travel at the speed of light (they cannot be slowed down) and moreover, they must travel at the speed of light in all reference frames.
Whereas massive particles can remain at rest. There's no such thing as an unmoving photon.
My point is subtly different, I think. The point I’m trying to make is that if you’re “at rest” in space, all of your movement is through time. If you are “at rest” in time, all of your movement is through space.
As far as I understand it, having mass is basically a result of moving more slowly through space.
I'm not a physicist, but AFAIU the speed of travel/causality for light is only the maximum speed of causality because photons have a mass of literally zero.
If photons had non-zero mass they could only travel slower than the maximum speed of causality (which would probably be called speed of gravity rather than speed of light, in this alternative universe).
Frankly, it may very well be KISS: the other options were so much more complex. Or maybe they decided that if we set the speed of light to a constant to keep things simple, there would be so many unforeseen edge cases because of it. The devil is in the details, perhaps?
That should be "all of physics in 6 lines, two flawed overly simplistic arguments and one crackpot theory (and 18 particles and 27 constants buried in the last two items)".
Our senses are evolved to maximise our fitness function within our immediate reality. There's a view that our senses don't reflect truth so much as evolutionary fitness, which involves both compromises and biases.[1]
Our evolutionary environment for the most part has excluded relativistic effects.
Though that raises the interesting question of what sense perceptions of an organism evolving under such circumstances might be.
________________________________
Notes:
1. Donald Hoffman is the principal proponent of this that I'm aware of: <https://www.quantamagazine.org/the-evolutionary-argument-aga...>. I'm not entirely sold on the hard-line version of his argument; it seems to me that there's a general tendency for adherence to truth to be more parsimonious than outright fabulation, in which the nonessential inaccuracies of the sensing system incur additional costs.
How, when going at relativistic speeds, you start to appear rotated to observers even if you are moving straight - you can even see behind the object!