Quantum Level Collapse

Started by james03, December 16, 2024, 03:43:28 PM

Previous topic - Next topic

cgraye

Quote from: james03 on January 16, 2025, 08:37:28 AMThese are engineering solutions, arrived at by ignoring QM.  For the sake of argument, let's assume Barandes is correct.  There's no waves, no superposition, no entanglement, and no uncertainty principle.  Theory would align with observation and practice, and we would advance quicker and likely make some major breakthroughs.

All engineering happens by ignoring what physics tells you is impossible, making it anyway, and then letting the physicists come in later and explain why physics doesn't tell you it is impossible after all.  (At least, this is how us engineers would describe it!)

The only thing I would say in response to this is that nothing about the current formulation of quantum mechanics requires you to think about waves, superposition, or entanglement as physically real.  And you can be sure that experimental particle physicists, for example, aren't thinking about electrons as waves (or, in accordance with the more general quantum field theory, excitations in fields) when slamming them into each other in particle accelerators.  But the uncertainty principle is going to be there however you formulate quantum mechanics, stopping you from doing or knowing some things.

QuoteThen the first problem is solved: Space is not expanding, or better, nothing is expanding.  Relative velocities of 2c should be possible, assuming the objects go in opposite directions from a common starting point.  Of course they could never observe each other as they'd be outrunning the photons.

Next up: Nothing is warped.


I still don't understand what you mean when you say things like "nothing is expanding".  You have this idea that "space" is "nothing", but "space" in this context just means "the distance between galaxies", and that is expanding.

Relative velocities of 2c are not possible, and those experimental physicists who smash particles together in colliders will be the first to tell you that - no matter how much energy you put into accelerating them, you will never get a relative velocity greater than c.  You can get as close as you like to c, but massive particles will never get there.  But relative velocities are local - the relative velocity between two objects passing right by each other.  That doesn't apply to relative velocities of distant objects when gravity is involved.  There isn't even a unique way to define the relative velocities of distant objects in general relativity.

james03

QuoteThe only thing I would say in response to this is that nothing about the current formulation of quantum mechanics requires you to think about waves, superposition, or entanglement as physically real.

True. That's Barandes's position.


QuoteBut the uncertainty principle is going to be there however you formulate quantum mechanics, stopping you from doing or knowing some things.

Here we disagree.  It is not a statement about the precision or repeatability of a test/observation, i.e. nowhere is the accuracy of the instrument referenced.  It is supposedly a fundamental property of reality.  Since it is derived from the wave function, it is just a mathematical artifact and should be ignored.

QuoteI still don't understand what you mean when you say things like "nothing is expanding".  You have this idea that "space" is "nothing", but "space" in this context just means "the distance between galaxies", and that is expanding.

This is the fundamental belief of modern cosmology.  Yes, the observed metric between the objects is expanding.  That's because they are moving apart and have a relative velocity between them with opposite directions.  While the cosmologists believe that this velocity might be present in some measure, they hold that the bulk of the redshift is due to the actual "space", whatever the heck that is, "spreading".

QuoteRelative velocities of 2c are not possible, and those experimental physicists who smash particles together in colliders will be the first to tell you that - no matter how how much energy you put into accelerating them, you will never get a relative velocity greater than c.

C is a maximum for a photon from the point of origination. 

Now the scientists who fire off particles at high speeds, let's say 0.7c, will never admit that the particles collided at a speed of 1.4c.  Instead they'll talk about "energies".  However you can use the mass of the particle and the relative velocity between particles of say 1.4c, and get the identical result.  With regards to photons,  Planck's constant is just a way to hide the mass of a photon. 
"But he that doth not believe, is already judged: because he believeth not in the name of the only begotten Son of God (Jn 3:18)."

"All sorrow leads to the foot of the Cross.  Weep for your sins."

"Although He should kill me, I will trust in Him"

james03

Consider red shift to just be the "slowing down" of the photons due to the relative velocity between the source and the observer.  If you slowed the photon to 200 ft./sec., the redshift would be massive.  Note that if you set your point of reference as the emitter, and gave all the velocity to the observer, you would still observe the photon traveling at c and the observer receding at (c - 200) ft/sec.  This is Galilean relativity.

It is interesting that someone invented the concept of "tired light" that "lost energy" to explain the observations.  It's just another way to say the velocity of the photon in reference to the observer is < c. 
"But he that doth not believe, is already judged: because he believeth not in the name of the only begotten Son of God (Jn 3:18)."

"All sorrow leads to the foot of the Cross.  Weep for your sins."

"Although He should kill me, I will trust in Him"

cgraye

Quote from: james03 on January 18, 2025, 12:29:51 PMHere we disagree.  It is not a statement about the precision or repeatability of a test/observation, i.e. nowhere is the accuracy of the instrument referenced.  It is supposedly a fundamental property of reality.  Since it is derived from the wave function, it is just a mathematical artifact and should be ignored.

But the wavefunction encodes things about reality, even if it itself is not physical.  On that model, position and momentum operators (for example) do not commute, leading to that particular interpretation of uncertainty.  Formulating things as Barandes does doesn't eliminate it.  He gives an account of the physical meaning of uncertainty starting on page 18 of the paper I linked to back on page 3.
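To make the non-commutation concrete, here is a minimal numerical sketch (my own illustration, not from Barandes's paper; it uses the standard truncated harmonic-oscillator matrices for x and p in units where hbar = m = omega = 1):

```python
import numpy as np

# Truncated ladder (annihilation) operator for a harmonic oscillator.
N = 20
a = np.diag(np.sqrt(np.arange(1, N)), k=1)

# Position and momentum expressed through a and its adjoint.
x = (a + a.T) / np.sqrt(2)
p = 1j * (a.T - a) / np.sqrt(2)

# The commutator [x, p] should equal i * identity (with hbar = 1),
# apart from the last diagonal entry, which is a truncation artifact.
comm = x @ p - p @ x
print(np.allclose(np.diag(comm)[:-1], 1j))   # True
```

Because [x, p] is not zero, no state can be a simultaneous eigenstate of both, and the Robertson inequality gives the familiar uncertainty bound without any reference to instrument accuracy.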

QuoteThis is the fundamental belief of modern cosmology.  Yes, the observed metric between the objects is expanding.  That's because they are moving apart and have a relative velocity between them with opposite directions.  While the cosmologists believe that this velocity might be present in some measure, the bulk of the redshift is due to the actual "space", whatever the heck that is, is "spreading".

For my money, this is the biggest misconception in physics.  I know you will see this stupid explanation everywhere in popular media and on the Internet, but cosmologists are not saying that the motion of objects through space and the expansion of space are different physical phenomena.  Those things are not invariants and thus have no distinct physical meanings.  The "expansion of space" means nothing more than that distant galaxies are moving apart from one another.  Yes, this is generally expressed as the galaxies remaining in place relative to the metric and the metric scaling up.  But this is just a convenient choice of coordinates.  You could just as easily have a metric that doesn't scale and have the galaxies moving through it.  How you classify the redshift is just a consequence of your choice of coordinates.

QuoteC is a maximum for a photon from the point of origination. 
Now the scientists who fire off particles at high speeds, let's say 0.7c, will never admit that the particles collided at a speed of 1.4c.  Instead they'll talk about "energies".  However you can use the mass of the particle and the relative velocity between particles of say 1.4c, and get the identical result.  With regards to photons,  Planck's constant is just a way to hide the mass of a photon. 

The particles can collide at a speed of 1.4c from the perspective of an observer watching them come in from opposite sides.  Photons have no observed mass.  If they have a mass, it is extremely small, and that wouldn't really change anything except that they will be traveling at slightly less than c.
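A quick sketch of that arithmetic, in units where c = 1 (the function name is my own, for illustration):

```python
def add_velocities(u, v, c=1.0):
    """Relativistic composition of collinear velocities: w = (u + v) / (1 + uv/c^2)."""
    return (u + v) / (1.0 + u * v / c**2)

# Two particles approach at 0.7c each in the lab frame.  Their coordinate
# closing speed in that frame really is 1.4c, but the velocity of one
# particle measured from the rest frame of the other stays below c.
closing_speed = 0.7 + 0.7                  # 1.4c: distance between them closes at this rate
relative_speed = add_velocities(0.7, 0.7)  # ~0.9396c
print(closing_speed, round(relative_speed, 4))
```

Nothing here contradicts relativity: no single object moves faster than c in any frame, even though the separation between two objects can shrink faster than c as measured by a third observer.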

james03

QuoteBut the wavefunction encodes things about reality, even if it itself is not physical.  On that model, position and momentum operators (for example) do not commute, leading to that particular interpretation of uncertainty.  Formulating things as Barandes does doesn't eliminate it.  He gives an account of the physical meaning of uncertainty starting on page 18 of the paper I linked to back on page 3.

I'm falling back to my experience with Laplace transforms.  We never tried to interpret anything in whatever space it took place, we just wanted the answer.  And like QM, "encoding" takes place with the real world 4D model of reality.  So if we screwed that up, the Laplace transforms would be screwed up.

Barandes creates a model of Environment, Subject, and Observer/Measurement, and considers the observer as part of his stochastic model.  Theoretical QM does not.  So Barandes's approach will lead to adding observation variance to his stochastic model, which will be incorporated into Hilbert Space, and thus you will get a realistic uncertainty.  It also shows that as technology improves, uncertainty will improve.  If people in the 50's had used his approach, they never would have predicted the MASER would fail.

QM does not treat uncertainty as dependent on the technology, and instead ends up with an absolute uncertainty based on N-dimensional imaginary space.  This is incorrect.

QuoteFor my money, this is the biggest misconception in physics.

It seems like a lot of theoretical physicists take this view.  If you don't take this approach, how is Einsteinian relativity different from Galilean relativity?

QuoteIf they have a mass, it is extremely small, and that wouldn't really change anything except that they will be traveling at slightly less than c.

My view is that photons have mass and their speed from the frame of the emitter is c by definition.  I think eventually we will be able to mechanistically figure out where c comes from, and it will depend on the properties of electrons and the mass of the photon.

We will also end up ditching Maxwell, or thinking about his calcs as describing discrete pulses.
"But he that doth not believe, is already judged: because he believeth not in the name of the only begotten Son of God (Jn 3:18)."

"All sorrow leads to the foot of the Cross.  Weep for your sins."

"Although He should kill me, I will trust in Him"

cgraye

Quote from: james03 on January 20, 2025, 12:29:33 PMBarandes creates a model of Environment, Subject, and Observer/Measurement, and considers the observer as part of his stochastic model.  Theoretical QM does not.  So Barandes's approach will lead to adding observation variance to his stochastic model, which will be incorporated into Hilbert Space, and thus you will get a realistic uncertainty.  It also shows that as technology improves, uncertainty will improve.  If people in the 50's had used his approach, they never would have predicted the MASER would fail.

QM does not treat uncertainty as dependent on the technology, and instead ends up with an absolute uncertainty based on N-dimensional imaginary space.  This is incorrect.

How does any of that open the way for beating the uncertainty principle with technology, though?  I don't see anywhere Barandes addresses this.  There are still things in his formulation that are fundamentally unobservable.

QuoteIt seems like a lot of theoretical physicists take this view.  If you don't take this approach, how is Einsteinian relativity different from Galilean relativity?

Because the former obeys the Lorentz transformation and the latter obeys the Galilean transformation.  Einstein will still have a local speed limit c, different observers measuring distance and time differently, velocities that do not simply add, etc.  The two do not predict the same things (though Newtonian physics can still predict an expanding universe).

james03

QuoteHow does any of that open the way for beating the uncertainty principle with technology, though?  I don't see anywhere Barandes addresses this.  There are still things in his formulation that are fundamentally unobservable.

There is always variance and pseudo-randomness.  We'll probably never reach absolute zero due to all of the neutrinos flying around.

The problem is the ontology.  Heisenberg derived his "hard limit" in the imaginary world of n-dimensional Hilbert Space.

I agree that there will always be uncertainty, however the math used to calculate it must incorporate the ontology of the real world.

QuoteBecause the former obeys the Lorentz transformation

You get something like a Lorentz transformation with Galilean relativity.  If L is wavelength, e is emitter and o is observer, then from the observer's frame you get something like Lo = Le/(1 - (Ve/c)).  As the emitter recedes at a velocity closer and closer to c, the wavelength becomes infinite.

However with Galileo, the ontology is correct.  It has nothing to do with curvature or expansion of "nothing".  Lorentz had an ontology based on aether, and Einstein has no ontology.

Normally this is not a problem, as coordinate systems, etc... are only immaterial/spiritual.  They are not materially real.

But there is a problem, and that is with a Michell dark body, which most people call a black hole.  If you believe in warped nothing, and that c is constant in all reference frames, then in order for a photon never to reach you, you will need infinite curvature.  Infinite curvature is reached as you approach a "point".  Thus black holes are "point masses".  That's a problem, because then you end up with division by zero.

Whereas a Michell dark body just needs sufficient density such that the escape velocity exceeds the speed of the photon, and this density is finite.
"But he that doth not believe, is already judged: because he believeth not in the name of the only begotten Son of God (Jn 3:18)."

"All sorrow leads to the foot of the Cross.  Weep for your sins."

"Although He should kill me, I will trust in Him"

cgraye

Quote from: james03 on January 22, 2025, 10:59:09 AMThere is always variance and pseudo-randomness.  We'll probably never reach absolute zero due to all of the neutrinos flying around.

The problem is the ontology.  Heisenberg derived his "hard limit" in the imaginary world of n-dimensional Hilbert Space.

I agree that there will always be uncertainty, however the math used to calculate it must incorporate the ontology of the real world.

But here you're going back to sweeping all the important issues under the rug of experimental uncertainty, and by "ontology" you seem to mean "classical physics".  But whatever is going on here, even in Barandes' stochastic formulation, is not and cannot be classical physics.  There are simply different phenomena in play here, no matter what the formulation or the ultimate answer.

QuoteYou get something like a Lorentz transformation with Galilean relativity.  If L is wavelength, e is emitter and o is observer, then from the observer's frame you get something like Lo = Le/(1 - (Ve/c)).  As the emitter recedes at a velocity closer and closer to c, the wavelength becomes infinite.

However with Galileo, the ontology is correct.  It has nothing to do with curvature or expansion of "nothing".  Lorentz had an ontology based on aether, and Einstein has no ontology.

The ontology of the aether?  Something which is either totally undetectable and/or has impossible mechanical properties?  That's not much of an ontology.  Now maybe there is such an aether.  But why would it appear in your theory if you can never interact with it?  You can put it in, but it will just drop out immediately.  Likewise in General Relativity, if you don't like working in curved spaces, you can frame the entire thing in a flat background.  But since you can never observe this, and it doesn't affect anything you can measure, it just drops out of your equations immediately.

QuoteNormally this is not a problem, as coordinate systems, etc... are only immaterial/spiritual.  They are not materially real.

But there is a problem, and that is with a Michell dark body, which most people call a black hole.  If you believe in warped nothing, and that c is constant in all reference frames, then in order for a photon never to reach you, you will need infinite curvature.  Infinite curvature is reached as you approach a "point".  Thus black holes are "point masses".  That's a problem, because then you end up with division by zero.

Whereas a Michell dark body just needs sufficient density such that the gravity is greater than the escape velocity of the photon, and this density is finite.

No one actually thinks that there can really be points of infinite density.  But since we cannot observe those points, we don't have any idea what actually happens there.  And this isn't surprising at all, because other forces that General Relativity does not attempt to describe are in play at that scale.  But General Relativity does predict effects we can observe that classical mechanics does not (e.g. the observed values of gravitational lensing, the observed orbit of Mercury, the effect of the Earth's rotation on satellites in its gravity, etc.).  There isn't any kind of dubious ontology here, or even a different ontology than classical physics.  It's just that the relationships between quantities that we can measure are more complex than we would know from the scales at which we live our everyday lives.

james03

QuoteThe ontology of the aether?  Something which is either totally undetectable and/or has impossible mechanical properties?  That's not much of an ontology.  Now maybe there is such an aether.  But why would it appear in your theory of you can never interact with it?

Poor writing on my part.  LORENTZ's ontology is based on aether.  "My" ontology is based on Galilean relativity with regards to cosmology.  Einstein doesn't have an ontology.  He grabbed Lorentz's formula but ditched the aether.

And if you don't have expanding "space", you are going to have a problem explaining red shift, since the velocity of light with reference to the target/observer is required to be c.  If you reject constant c in all reference frames, then red shift is easy to explain, as I showed in the formula.
"But he that doth not believe, is already judged: because he believeth not in the name of the only begotten Son of God (Jn 3:18)."

"All sorrow leads to the foot of the Cross.  Weep for your sins."

"Although He should kill me, I will trust in Him"

cgraye

Quote from: james03 on January 24, 2025, 11:23:10 AMPoor writing on my part.  LORENTZ's ontology is based on aether.  "My" ontology is based on Galilean relativity with regards to cosmology.  Einstein doesn't have an ontology.  He grabbed Lorentz's formula but ditched the aether.

Even if that holds, it doesn't hold at non-cosmological scales, which is why I brought up more local examples.  The symmetry present in standard cosmology allows for some convenient and intuitive things, such as a time coordinate that corresponds to proper time for all observers.  But we need to be careful not to read anything ontological into that, because it doesn't work in more normal, local systems where that symmetry is not present.

QuoteAnd if you don't have expanding "space", you are going to have a problem with explaining red shift, since the velocity of light with reference to the target/observer is required to be "c".  If you reject constant "C" in all reference frames, then red shift is easy to explain, as I showed in the formula.

Explaining the redshift without the expansion of space is not only not a problem, it's far more natural to both our ordinary intuition and within the framework of General Relativity - the explanation is the Doppler effect, just as it is for closer galaxies at non-cosmological distances.  Things are moving away from us, so the light emitted from them is Doppler shifted.  Again, whether you want to classify redshift as kinematic, gravitational, or cosmological is entirely down to what coordinates you choose - these are not distinct physical phenomena.
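For concreteness, here is a small sketch of the relativistic longitudinal Doppler relation (my own illustrative code; beta is v/c):

```python
import math

def redshift_from_velocity(beta):
    """Relativistic longitudinal Doppler: 1 + z = sqrt((1 + beta) / (1 - beta))."""
    return math.sqrt((1 + beta) / (1 - beta)) - 1

def velocity_from_redshift(z):
    """Invert the relation: beta = ((1+z)^2 - 1) / ((1+z)^2 + 1)."""
    s = (1 + z) ** 2
    return (s - 1) / (s + 1)

# A source observed at redshift z = 1 corresponds to a recession speed of
# exactly 0.6c under this purely kinematic reading.
print(velocity_from_redshift(1.0))   # 0.6
```

Note that beta stays below 1 for any finite z, so arbitrarily large redshifts never require a recession velocity above c in this kinematic description.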

The idea that the galaxies are not moving but the space in between them is expanding is a useful mathematical model, but it doesn't really make any physical sense to us, so it doesn't make any sense to explain things to non-experts this way.  The explanation that makes sense is the one that matches how we think about the physical phenomena in play - the one where objects are moving away from each other.

And within the framework of General Relativity, the basic underlying idea is that each point in space can be considered locally flat, so it makes the most sense to think of a Doppler shift at each of those points.  Thinking about something like the wavelength of a photon expanding with space doesn't even make sense within the physics of light.  Maxwell's equations have no "expanding space" term or references to the size of the universe.  This is just not a good explanation.

And again, I can't emphasize enough how big of a problem this reification of the idea of "expanding space" is within physics.  It permeates all the popular media, of course, but this is present in undergraduate astronomy textbooks as well, and it leaves students who don't go on to specialize in General Relativity with a wrong idea that they carry with them from that point forward.  Everyone talks about it this way, but it makes no intuitive sense, and it's in tension with the fundamental ideas of General Relativity.

james03

Missed this:

QuoteBut General Relativity does predict effects we can observe that classical mechanics does not (e.g. the observed values of gravitational lensing, the observed orbit of Mercury, the effect of the Earth's rotation on satellites in its gravity, etc.).

The orbits of satellites and the precession of Mercury were solved by a math teacher before Einstein by assuming gravity travels at the speed of light.  Gravitational lensing is the result of a photon traveling at a relative speed < c, and its mass.

This shows the difference in ontology.

Case 1:  Assume the photon travels at c, and also is affected by gravity.  You'd probably get the wrong answer.

Case 2:  Assume the relative velocity of a photon is according to Galilean Relativity, and that it has mass and responds to gravity.  You get the right answer.

Case 3: Assume that "space" is expanding and curved.  You get the right answer.

You either have to reject the view that the photon travels at "c" for both frames of reference, or you have to monkey with "space".

QuoteAnd again, I can't emphasize enough how big of a problem this reification of the idea of "expanding space" is within physics.

You're more of the expert on this, so I take your word.  What about "curved" space?

Is your point that General Relativity is simply "shut up and do the math"?
"But he that doth not believe, is already judged: because he believeth not in the name of the only begotten Son of God (Jn 3:18)."

"All sorrow leads to the foot of the Cross.  Weep for your sins."

"Although He should kill me, I will trust in Him"

cgraye

#86
Quote from: james03 on January 25, 2025, 12:42:08 PMThe orbits of satellites and the precession of Mercury were solved by a math teacher before Einstein by assuming gravity travels at the speed of light. Gravitational lensing is the result of a photon traveling at a relative speed < c, and its mass.

I'm not sure what calculation you're referring to, but simply adding a finite speed of gravity to Newtonian mechanics will not suffice to explain the orbit of Mercury, and it would also introduce violations of the conservation of angular momentum and energy.  The deflection of light by gravitating bodies is predicted by Newtonian gravity if you assume an effective mass of the photon, but at only half the value predicted by General Relativity, which is the observed value.  And none of that predicts that the rotation of gravitating bodies will affect the motion of orbiting satellites, which General Relativity does.
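The two observed values mentioned here can be checked in a few lines (constants rounded; this is an illustrative back-of-envelope estimate, not a full ephemeris calculation):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8       # speed of light, m/s
M = 1.989e30      # solar mass, kg

# GR perihelion advance per orbit: 6*pi*G*M / (c^2 * a * (1 - e^2))
a = 5.791e10      # Mercury semi-major axis, m
e = 0.2056        # Mercury orbital eccentricity
T = 87.969        # Mercury orbital period, days
dphi = 6 * math.pi * G * M / (c**2 * a * (1 - e**2))   # radians per orbit
orbits_per_century = 100 * 365.25 / T
arcsec = dphi * orbits_per_century * (180 / math.pi) * 3600
print(round(arcsec, 1))   # ~43.0 arcsec/century, the anomalous precession

# Light deflection grazing the sun: GR predicts 4GM/(c^2 R),
# twice the Newtonian effective-mass value of 2GM/(c^2 R).
R_sun = 6.957e8   # solar radius, m
defl = 4 * G * M / (c**2 * R_sun) * (180 / math.pi) * 3600
print(round(defl, 2))     # ~1.75 arcsec
```

The factor of two in the deflection is exactly the contribution of the spatial correction terms that a Newtonian treatment lacks.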

QuoteCase 2:  Assume the relative velocity of a photon is according to Galilean Relativity, and that it has mass and responds to gravity.  You get the right answer.

Case 3: Assume that "space" is expanding and curved.  You get the right answer.

You either have to reject the view that the photon travels at "c" for both frames of reference, or you have to monkey with "space".

Even granting your case 2, a problem is that for all we can tell, photons do not have mass and cannot travel slower than c, no matter how hard we try to make them do so.  But on the other hand, "space" expanding does not mean anything other than that objects are moving apart from each other, and that is a phenomenon that we observe every day.  So which scenario requires more controversial assumptions?

QuoteYou're more of the expert on this, so I take your word.  What about "curved" space?

I'm not sure whether you are talking about curved spacetime, which is how gravity is modeled in General Relativity, or the curvature of space (not spacetime) in the standard model of cosmology.  Since observations indicate that space is not curved at cosmological scales, I will assume you are referring to the former.

"Spacetime" is a mathematical abstraction - a 4D manifold that possesses intrinsic curvature.  Since it is just an abstraction, its curvature should not present any metaphysical problems for anyone.  But what is the physical meaning of this curvature, then?  How do you read it out of the abstract model into something that can be measured in the real world?  The physical meaning of spacetime curvature is simply the observed effects of gravity.  More specifically, the geodesic deviation you will observe if you set up some test particles near a gravitating body.

For example, imagine you let some freely floating dust particles loose near one another close to earth.  What will you see them do?  You will see them accelerate away from each other.  Why?  Because gravity is stronger closer to the earth, so the dust particles closer to the earth will move toward it faster than the ones farther away from it.  So is that it?  Well, there is another effect here too - the clocks that travel with each dust particle will tick at different rates.
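In the Newtonian limit, the relative acceleration of those dust particles is just the tidal gradient of g = GM/r^2; a quick illustrative estimate (rounded constants):

```python
G = 6.674e-11     # m^3 kg^-1 s^-2
M = 5.972e24      # earth mass, kg
r = 6.371e6       # earth radius, m

# Differentiating g = GM/r^2 gives dg/dr = -2GM/r^3, so two particles
# separated radially by 1 m accelerate apart at roughly this rate:
tidal_per_meter = 2 * G * M / r**3    # s^-2 (m/s^2 per meter of separation)
print(tidal_per_meter)                # ~3.1e-6
```

Tiny near the earth, which is why you need sensitive experiments (or a much more compact body) to see geodesic deviation directly.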

So in what sense is that "curvature"?  Well, because from special relativity we know that different observers will measure distance and time differently, we need some kind of objective quantity that everyone can agree on.  That is the spacetime interval.  It's similar to calculating the distance between two points in space from their coordinates with the Pythagorean theorem.  It's the same thing, except time is also in there with an opposite sign.  But what if you are in an intrinsically curved space like a sphere?  The Pythagorean theorem doesn't work anymore.  Or rather, it does work, but you need correction factors on each term.  Likewise, in special relativity, the Pythagorean theorem will let any observer take his measurements of the three dimensions of space and time and calculate the spacetime interval.  But when there is gravity, that doesn't work anymore.  Or rather, it does, but you need those correction factors.  It is the need for those correction factors in calculating the objective spacetime interval that is the meaning of spacetime curvature.
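As a concrete example of such a correction factor, here is the Schwarzschild time term for a static clock at the earth's surface (an illustrative estimate with rounded constants):

```python
import math

G = 6.674e-11     # m^3 kg^-1 s^-2
c = 2.998e8       # m/s
M = 5.972e24      # earth mass, kg
r = 6.371e6       # earth radius, m

# Schwarzschild radius of the earth: the scale of the correction.
rs = 2 * G * M / c**2                   # ~9 mm

# Correction factor on the time term of the interval for a static clock:
# dtau = sqrt(1 - rs/r) * dt
factor = math.sqrt(1 - rs / r)

# A surface clock runs slow relative to one far away by about this
# fractional amount per second - the effect GPS must correct for:
print(1 - factor)                       # ~7e-10
```

That dimensionless number is exactly the sort of "correction factor" meant above: without it, the flat-spacetime Pythagorean-style interval calculation gives the wrong answer.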

OK, why all the fuss with General Relativity?  Why not just take Special Relativity and put it together with Newtonian gravity?  Well, you can.  Kind of.  That would be equivalent to only having a correction term on the time component of your spacetime interval.  In fact, that is Newtonian gravity, expressed in this way.  But that's not enough.  You also need correction terms on the three spatial components of your Pythagorean theorem as well.  It's just not obvious because they are usually so small you don't notice their effect.  But in cases with strong enough gravity, you do.  It's the missing correction terms on those spatial components that account for the failure of Newtonian gravity to predict the correct orbit of Mercury, not a finite speed of gravity.

So what's the difference with Newtonian gravity?  What is the physical cause of those spatial correction terms that is not captured in Newtonian gravity?  It's what I explained several pages back - Newtonian gravity considers mass (energy density) the only source of gravity.  But it's not.  Energy flux, momentum density and flux, pressure, and shear stress also contribute, and those spatial correction terms capture their effect.  That is why General Relativity can predict that the rotation of the earth, not just its mass, will affect the orbit of satellites - because that angular momentum of the rotation also contributes to gravity.

QuoteIs your point that General Relativity is simply "shut up and do the math"?

By no means!  To simply shut up and do the math might work for engineers, but if your aim is to understand how the physical world works, you will need to do more than that.  You need to understand how the elements of your mathematical models relate to the things you can observe in the real world.  And you can do that with General Relativity.  But here's the thing... the principles of General Relativity are actually fairly simple, but the math is extremely complicated.  So complicated that to even write the entire thing out in terms of the physical observables would take multiple pages.  And then to actually solve those equations exactly is also incredibly hard - usually impossible if you want nice, closed form solutions.  Fortunately there are some solutions we can use to describe the situations we care about, like planetary orbits.  So if you want to actually do anything with the equations, you can understand the principles, but then you kind of do need to just shut up and do the math.  Or rather shut up and use the math that someone else already did a hundred years ago.  Or shut up and let a computer do the math, because you literally do not have time to do it by hand.

james03

QuoteI'm not sure what calculation you're referring to, but simply adding a finite speed of gravity to Newtonian mechanics will not suffice to explain the orbit of Mercury, and it would also introduce violations of the conservation of angular momentum and energy.

Unfortunately I can't find the cite.  A math teacher solved the precession of Mercury before Einstein.  I'll do some more digging.

QuoteThe deflection of light by gravitating bodies is predicted by Newtonian gravity if you assume an effective mass of the photon, but at only half the value predicted by General Relativity, which is the observed value.

Correct.  If you assume the photon is traveling at "c".  If the photon is traveling at a slower speed, the deflection will be more.  Similar to "bullet drop" and why high speed bullets are described as "flatter".

QuoteEven granting your case 2, a problem is that for all we can tell, photons do not have mass and cannot travel slower than c, no matter how hard we try to make them do so.

For all we can tell based on observation, a photon has mass.

For all we can tell based on observation, the velocity of a photon is not constant.  One example is red shift.

Quote"Spacetime" is a mathematical abstraction - a 4D manifold that possesses intrinsic curvature.  Since it is just an abstraction, its curvature should not present any metaphysical problems for anyone.  But what is the physical meaning of this curvature, then?

We get back to the Barandean breakthrough in QM, where ontology is returned.  Mathematical abstractions are a convenient math engine.  There is zero ontology to be expected.  There is zero physical meaning of curvature because space does not exist physically.  It only exists in the immaterial world.

QuoteTo simply shut up and do the math might work for engineers,

The statement "shut up and do the math" is from Quantum Mechanics, not engineering.

Now I'll throw a bone to Einstein, and that is his stressing that "time" is a fourth dimension, and that if we want to accurately map the material world into the immaterial/spiritual realm of the minds of men, time must be given an equal footing.  This DOES open the door to relativistic effects with regards to time from a purely Galilean view.

In fact, time is more physical than "space".  Space is null and does not exist physically.  However time is deeply involved with physical processes.  In short, time is quasi-physical and "space" is null.


"But he that doth not believe, is already judged: because he believeth not in the name of the only begotten Son of God (Jn 3:18)."

"All sorrow leads to the foot of the Cross.  Weep for your sins."

"Although He should kill me, I will trust in Him"

cgraye

Quote from: james03 on January 29, 2025, 10:09:06 AMUnfortunately I can't find the cite.  A math teacher solved the precession of Mercury before Einstein.  I'll do some more digging.

One possible thing I can think of that this might be about is the value of the effective potential (V) as a function of the radius (r).  If you take the Schwarzschild solution to the Einstein field equations and calculate this, you get:

V(r) = -GMm/r + L^2/(2μr^2) - G(M+m)L^2/(c^2 μ r^3)

where μ = Mm/(M+m) is the reduced mass of the system.

Those first two terms are the same as you would get in Newtonian gravity, but the third is the contribution from the effects of General Relativity not predicted by Newton.  It has the speed of light squared in the denominator, so if you let that go to infinity you would recover the Newtonian solution.  Perhaps someone just figured out a fudge factor, assuming the speed of light is finite, by fitting it to the observations.  Though that would not explain why the term is there, or account for the instability in the system that would result.
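That 1/r^3 term is what drives the perihelion advance.  To first order it works out to Δφ = 6πGM/(c^2 a(1 - e^2)) per orbit, and plugging in standard orbital values for Mercury (assumed here, not from the thread) recovers the famous anomalous ~43 arcseconds per century:

```python
import math

# Standard values for the Sun and Mercury (assumed for illustration)
G_M_SUN = 1.32712440018e20   # GM of the Sun, m^3/s^2
C = 299_792_458.0            # speed of light, m/s
A = 5.7909e10                # Mercury's semi-major axis, m
E = 0.2056                   # Mercury's orbital eccentricity
PERIOD_DAYS = 87.969         # Mercury's orbital period, days

# GR perihelion advance per orbit, in radians
dphi = 6 * math.pi * G_M_SUN / (C**2 * A * (1 - E**2))

# Accumulate over a Julian century and convert to arcseconds
orbits_per_century = 36525 / PERIOD_DAYS
arcsec = dphi * orbits_per_century * (180 / math.pi) * 3600
print(f'{arcsec:.1f} arcsec/century')  # close to the observed anomalous ~43
```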

QuoteCorrect.  If you assume the photon is traveling at "c".  If the photon is traveling at a slower speed, the deflection will be more.  Similar to "bullet drop" and why high speed bullets are described as "flatter".

For all we can tell based on observation, a photon has mass.

For all we can tell based on observation, the velocity of a photon is not constant.  One example is red shift.

That is not an observation, that is a conclusion drawn from other factors.  If photons can travel at speeds other than c, why can't we create that situation more locally?  And for that matter, that is not the only interpretation of the redshift.  Why not conclude that the energy of light coming from distant galaxies was always lower?

QuoteThere is zero physical meaning of curvature because space does not exist physically.  It only exists in the immaterial world.

But objects exist in the material world, and we can measure the distance between them.  And we can measure the passage of time with clocks.  If the Pythagorean theorem does not hold with those measurements, wouldn't you say that curvature has some meaning in the material world?

I think you are getting too hung up on the significance of the term "spacetime".  In General Relativity you could just as well substitute the term "gravitational field".  Would it bother you as much to say that the gravitational field is curved?
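One concrete way to see the point about measurable curvature: in the Schwarzschild geometry, the ruler (proper) distance between two radii is the integral of dr/sqrt(1 - r_s/r), which exceeds the flat-space coordinate difference r2 - r1.  A sketch with standard solar values (all assumed for illustration):

```python
import math

# Assumed standard values
G_M_SUN = 1.32712440018e20     # GM of the Sun, m^3/s^2
C = 299_792_458.0              # speed of light, m/s
R_S = 2 * G_M_SUN / C**2       # Schwarzschild radius of the Sun, ~2953 m
R1, R2 = 6.957e8, 2 * 6.957e8  # from the solar radius out to twice that, m

# Midpoint-rule integration of the proper radial distance ds = dr/sqrt(1 - rs/r)
N = 100_000
dr = (R2 - R1) / N
proper = sum(dr / math.sqrt(1 - R_S / (R1 + (i + 0.5) * dr)) for i in range(N))

coordinate = R2 - R1
print(f'ruler distance exceeds coordinate distance by {proper - coordinate:.0f} m')
```

The roughly kilometer-scale excess over the flat-space answer of ~6.96e8 m is exactly the kind of failure of Euclidean geometry, measurable with physical rulers and clocks, that the paragraph above describes.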

QuoteThe statement "shut up and do the math" is from Quantum Mechanics, not engineering.

But it is done in engineering.  Engineers don't care about what anything is, they just care about having tools they can use to build things.  I don't think that's a good perspective to take for a physicist, whether we are talking about Quantum Mechanics or anything else.

QuoteIn fact, time is more physical than "space".  Space is null and does not exist physically.  However time is deeply involved with physical processes.  In short, time is quasi-physical and "space" is null.

Time is a measure of change.  Material things change, so time is in some sense a real part of the material world.  But material things can also be at different distances from each other.  So why would you say that distance is any less a part of the material world?

james03

QuoteBut objects exist in the material world, and we can measure the distance between them.  And we can measure the passage of time with clocks.  If the Pythagorean theorem does not hold with those measurements, wouldn't you say that curvature has some meaning in the material world?

The Pythagorean theorem is used to precisely solve the Sagnac effect.  People object because it allows the speed of light to vary, but that objection is circular.  And Sagnac guidance systems are an example of locally observing photons traveling at speeds slower than and greater than "c".  You can explain the observed different speeds with various theories, however the only explanation with an ontology is to accept that the speeds really are different.
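Whatever interpretation one attaches to it, the observed Sagnac delay itself is uncontroversial: for counter-propagating beams around a rotating loop of enclosed area A, the arrival-time difference is Δt = 4AΩ/c^2.  A sketch for a small ring interferometer sensing Earth's rotation (all values assumed for illustration):

```python
import math

C = 299_792_458.0          # speed of light, m/s
AREA = 0.0225              # enclosed loop area, m^2 (e.g. a 15 cm square ring)
OMEGA_EARTH = 7.2921e-5    # Earth's sidereal rotation rate, rad/s

# Sagnac time difference between the co- and counter-rotating beams
dt = 4 * AREA * OMEGA_EARTH / C**2

# Corresponding fringe phase shift at a 633 nm (HeNe) wavelength
wavelength = 633e-9
dphi = 2 * math.pi * C * dt / wavelength

print(f'dt = {dt:.2e} s, phase shift = {dphi:.2e} rad')
```

The delay is tiny (~1e-22 s for tabletop areas), which is why practical ring-laser gyros accumulate it as a continuous beat frequency rather than measuring it directly.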

QuoteBut material things can also be at different distances from each other.  So why would you say that distance is any less a part of the material world?

Distance is the immaterial map of the material world.  It doesn't materially exist.  It appears to be similar to a transcendental like "being" and "thing", or perhaps better described as a quale.
"But he that doth not believe, is already judged: because he believeth not in the name of the only begotten Son of God (Jn 3:18)."

"All sorrow leads to the foot of the Cross.  Weep for your sins."

"Although He should kill me, I will trust in Him"