# Speed of light may have changed recently



## matt-browne-sfw (Oct 31, 2007)

This article is already three years old, but I've just stumbled on it... Quite mind-blowing. Could this make FTL travel possible some time in the future? What are your thoughts?

Speed of light may have changed recently - 30 June 2004 - New Scientist

The speed of light, one of the most sacrosanct of the universal physical constants, may have been lower as recently as two billion years ago - and not in some far corner of the universe, but right here on Earth. 


The controversial finding is turning up the heat on an already simmering debate, especially since it is based on re-analysis of old data that has long been used to argue for exactly the opposite: the constancy of the speed of light and other constants. 


A varying speed of light contradicts Einstein's theory of relativity, and would undermine much of traditional physics. But some physicists believe it would elegantly explain puzzling cosmological phenomena such as the nearly uniform temperature of the universe. It might also support string theories that predict extra spatial dimensions. 


The threat to the idea of an invariable speed of light comes from measurements of another parameter called the fine structure constant, or alpha, which dictates the strength of the electromagnetic force. The speed of light is inversely proportional to alpha, and though alpha also depends on two other constants (see graphic), many physicists tend to interpret a change in alpha as a change in the speed of light. "It is a valid simplification," says Victor Flambaum of the University of New South Wales in Sydney. 
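The relationship the article describes can be checked numerically. The fine structure constant is defined as alpha = e^2 / (4·pi·eps0·hbar·c), so with the other constants held fixed, c is inversely proportional to alpha. A minimal sketch using standard (approximate) constant values, not figures from the article:

```python
# Fine structure constant from its definition: alpha = e^2 / (4*pi*eps0*hbar*c).
# Constant values are standard CODATA-style figures, quoted approximately.
import math

e = 1.602176634e-19        # elementary charge, C
eps0 = 8.8541878128e-12    # vacuum permittivity, F/m
hbar = 1.054571817e-34     # reduced Planck constant, J*s
c = 2.99792458e8           # speed of light, m/s

alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(alpha)               # ~0.0072973..., the familiar ~1/137
print(1 / alpha)           # ~137.036
```

Under this interpretation, the change of a few parts in 10^5 in alpha claimed by Webb and Flambaum would correspond to a comparably tiny fractional change in c.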


It was Flambaum, along with John Webb and colleagues, who first seriously challenged alpha's status as a constant in 1998. Then, after exhaustively analysing how the light from distant quasars was absorbed by intervening gas clouds, they claimed in 2001 that alpha had increased by a few parts in 10^5 in the past 12 billion years. 


(...)


----------



## Nik (Oct 31, 2007)

Um, hasn't that been thrown into doubt by both MoND, where gravity *may* drop-off slower than inverse-square over extra-galactic distances, and the recent finding that old supernovas may have been brighter than modern ones...

IIRC, latter is due to more heavy elements present in stars formed in second and subsequent generations 'catalysing' supernova process...

( Is our Sun 2nd or 3rd gen ?? I forget... )


----------



## matt-browne-sfw (Oct 31, 2007)

Nik said:


> Um, hasn't that been thrown into doubt by both MoND, where gravity *may* drop-off slower than inverse-square over extra-galactic distances, and the recent finding that old supernovas may have been brighter than modern ones...
> 
> IIRC, latter is due to more heavy elements present in stars formed in second and subsequent generations 'catalysing' supernova process...
> 
> ( Is our Sun 2nd or 3rd gen ?? I forget... )



I think it's third gen, because it's metal-rich. First is metal-deficient, second metal-poor...


----------



## Rawled Demha (Nov 1, 2007)

nik - i think the inverse square law holds for all point sources and spheres, irrespective of distances. not 100% sure though...


----------



## Delvo (Nov 1, 2007)

The inverse square law works for other things, like radiation intensity at a given distance, sound intensity at a given distance, odor intensity and chemical diffusion at a given distance (without interference by fluid currents like wind), and objects' apparent size when viewed from a given distance. But gravity doesn't follow it; gravity's relationship to distance is a "problem".

Gravity doesn't even follow any other such simple constant relationship to distance either. The constant relationship which was figured out based on planets and moons in this solar system turned out not to apply over longer distances; stars seem to go around within their galaxies too fast (which indicates stronger gravity than expected because the expected gravity wouldn't be enough to keep such fast-moving stars together in galaxies), and galaxies seem to be too attracted to each other. So it seems as if the exponent in the gravity-distance relationship not only isn't a 2, but isn't even a constant; it's a variable... and that's even if there is a single simple equation to predict it at all.
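The "stars go around too fast" point can be sketched with the standard Keplerian prediction v = sqrt(G·M/r), which falls off with radius, whereas measured galactic rotation curves stay roughly flat out to large radii. The galaxy mass below is a made-up illustrative figure, not a measurement:

```python
# Keplerian circular speed around an enclosed mass M: v = sqrt(G*M/r).
# If visible mass were all there is, v should halve every time r quadruples;
# observed rotation curves instead stay roughly flat.
import math

G = 6.674e-11              # gravitational constant, m^3 kg^-1 s^-2
M_sun = 1.989e30           # solar mass, kg
kpc = 3.086e19             # one kiloparsec in metres

def v_circ(M, r):
    """Keplerian circular orbital speed (m/s) at radius r around enclosed mass M."""
    return math.sqrt(G * M / r)

M_visible = 1e11 * M_sun   # toy figure for a galaxy's visible mass
for r_kpc in (5, 10, 20, 40):
    print(r_kpc, "kpc:", round(v_circ(M_visible, r_kpc * kpc) / 1000), "km/s")
```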

MOND (modified Newtonian dynamics) is/was an attempt to find a new equation for the relationship between gravity and distance to account for the fact that it doesn't drop off with long distances as quickly as you'd expect based on short distances. Another attempt at explaining the problem is dark matter: matter that you can't otherwise see because it doesn't create, absorb, or reflect light or any other electromagnetic energy (and thus doesn't have electromagnetic or chemical interactions with normal matter), but does have mass so its gravitational effects can be seen.
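For concreteness, MOND's core move is to modify the acceleration law below a tiny scale a0 ≈ 1.2e-10 m/s^2. With the commonly used "simple" interpolating function mu(x) = x/(1+x), the effective acceleration even has a closed form; this is one textbook variant of MOND, not the only formulation:

```python
# MOND with the "simple" interpolating function mu(x) = x/(1+x):
# the effective acceleration a satisfies a * mu(a/a0) = aN, where aN is the
# Newtonian prediction. With this mu the solution is closed-form:
#   a = aN/2 + sqrt(aN^2/4 + aN*a0)
import math

a0 = 1.2e-10  # Milgrom's acceleration scale, m/s^2

def mond_accel(aN):
    """Effective acceleration for a given Newtonian acceleration aN (m/s^2)."""
    return aN / 2 + math.sqrt(aN**2 / 4 + aN * a0)

print(mond_accel(1e-6) / 1e-6)   # aN >> a0: ratio ~1, ordinary Newtonian regime
print(mond_accel(1e-14))         # aN << a0: ~sqrt(aN*a0), gravity "boosted"
```

In the deep-MOND regime the boosted acceleration gives flat rotation curves without any unseen mass, which is exactly the effect it was built to reproduce.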

Most people consider the final nail in the coffin for MOND and other alternatives to dark matter to have been found in the last year or two in the form of nearly certain evidence of dark matter's existence. Two large visible-matter objects (nebulas or galaxies or a nebula and a galaxy or something like that; I have the name "Bullet Nebula" stuck in my mind here for some reason) collided at an angle that allows us to clearly see that they're two objects moving in two different directions before the collision. Based on the movements of other things around them, we can also see where the "center of gravity" is for the two things, and it's too far off to one side. So now we have gravity that's not just too strong over large distances, but not even converging toward the same center as the center of the visible mass... which indicates that some of the gravity is being created by something else separate from what we can see. And the center of gravity happens to be just where you'd expect it to be if its source were dark matter that accompanied each of the two visible objects before the collision but then kept going past the collision site due to its lack of interaction with other matter.


----------



## Nik (Nov 2, 2007)

Interesting but, IMHO, not yet conclusive due to the complexity of those very-distant twin galaxies' mass-distribution and interactions...

IMHO, even MoND is still far from proven...

Um, the mass distribution in *our* galaxy is still a bit fuzzy. Even within nearest dozen light-years, the numbers at low-end of stellar/sub-stellar population just keep growing. Changes to the mass distribution stats are not so critical near core, where super-massive black-holes and giant stars rule, or even in our well-populated spiral arm, but may make a difference out where space *looks* empty...

Then there's wild-cards, like theorist Itzhak Bars (sp?) who's found that an extra dimension each of time & space ( = 6, else if Branes, 13 !!) makes several persistent physics problems go away: Bye-Bye infuriating Axions, bye-bye QCD's predicted StrongForce asymmetry, bye-bye some exasperating infinities, hello a possible link between 'everything else' and gravity...

Is Reality as we see it merely 4-dimensional shadows of 6-dimensional truth ?? D'uh, I wish I'd done that Higher Topology & Tesseracts course when my brain was young & nimble...

Would be a terrible irony if Mr Bars' elegant math does an end-run on MoND, variable_c etc etc !!


----------



## Rawled Demha (Nov 2, 2007)

a quick query - the inverse square law, is this based on the surface area of a sphere? (4*pi*r^2) i used to think that this explained many inverse square relationships, from light intensity to gravity and electricity because these are all modelled as point sources, which "radiate" or act in all directions. i was recently reminded, however, that newtonian physics is outdated, and was wondering if quantum physics threw this stuff off as well...
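The geometric picture in the question is easy to sketch: if a point source's power P is conserved and spreads over concentric spheres, the intensity at radius r is P divided by the sphere's surface area 4·pi·r^2, hence inverse-square. A minimal illustration with arbitrary numbers:

```python
# Flux conservation through concentric spheres: a point source of power P
# spreads over area 4*pi*r^2, so intensity I(r) = P / (4*pi*r^2).
# Doubling the distance quarters the intensity.
import math

def intensity(P, r):
    """Intensity (W/m^2) at distance r from an isotropic point source of power P."""
    return P / (4 * math.pi * r**2)

P = 100.0                        # watts, arbitrary
print(intensity(P, 1.0))         # ~7.96 W/m^2
print(intensity(P, 2.0))         # exactly one quarter of the above
```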


----------



## StewHotston (Nov 2, 2007)

It isn't based on that. Well, not quite: the relationship is somewhat the other way around.

However, to speak of a shape to the universe is in itself problematic, as how do you discuss the shape of something which encompasses its own existence?

I personally find the idea of multiple 'unseen unmeasurable' dimensions very problematic because they typically only work by introducing multiple new variables that can be fine tuned to make the theory 'fit' the data or 'solve' the problems. People like Bars end up introducing many more problems than they solve using these sorts of methods.

Reality as we see it is just that. Is it objective? Probably but then again even if it's not it is objective enough in my frame of reference to keep me from walking in front of moving cars and sticking my head in a mincer. 

Branes are nice - indeed the pressure to adopt 11/13/17 and 26 dimension versions of reality is loud in some quarters but my increasing conclusion is that these seem to arise from a specific school of physics that itself is finding it harder and harder to explain reality. The Higgs Boson anyone?

The great problem for HEP atm is quantum gravity. So far there's no working/workable answer, just lots of theories which don't/haven't proven themselves in any real empirical manner. 

(as for those theorists who are desperately trying to get us to believe having evidence isn't a hallmark of a decent theory, well we have a word for them - homeopaths...oops sorry astrologers, sorry ummm well you get the idea).


----------



## Nik (Nov 3, 2007)

*quarks etc...*

Um, paradigms do change: Even I've seen quarks go from wry physicist's quip to high-school text-book...

Gravity is still the problem. It is so unprintably weak that studying it in isolation is an experimentalist's nightmare. How do you look for theoretical anomalies if you can't measure it reliably ??

There's some *weird* lab-scale phenomena around eg spinning superconductors, but even minor-order EM effects swamp gravity...

And gravity waves *should* exist, but the long-path optics either have insufficient sensitivity, or the sheer bad luck to have had no significant event to measure...

IIRC, even a modest supernova in our galaxy should rattle the grav-wave lines, light the neutrino detectors and generally brighten astronomers' lives...


----------



## matt-browne-sfw (Nov 8, 2007)

Excellent discussion! I wasn't aware that so many fine scientific minds hang around here at the chronicles network...


----------



## Quokka (Nov 8, 2007)

and just to show that not all minds are fine or scientific 


This is a bit off topic but still probably the best place to post it. I've been wondering: why is the speed of light linked to _E=mc^2_, or is it just a suitably high number?

I suppose what I'm asking is if Einstein had been out with _m_ by a few % either way would we just have arranged measurements differently or would the theory have not fitted as well?


----------



## Rawled Demha (Nov 8, 2007)

i think its just a suitably high number. the idea here is that the energy of an object is directly proportional to its mass. or, E=km, where k is a constant. 

the energy of an object is huge, when you consider all the forms it can take - gravitational, electrical, nuclear. im not sure where we defined k to equal the speed of light squared, but it may have been because there is not much else that can take a larger value than the speed of light. 

i think it may be impossible to verify, not until we have another scientific breakthrough.


----------



## Delvo (Nov 8, 2007)

In physics equations, there's no such thing as just a suitable guess that amounts to merely "really high" or "really low". A number is exactly what it is; try using any other number and you get wrong answers. In fact, Einstein was not the first person to imagine mass-energy equivalence or interchangeability; he was just the one who determined the correct formula when other speculations about the relationship between mass and energy were no more precise than "little mass, lots of energy". But his description of the relationship is the one everybody's heard of now because his was the one that gave specific numbers, which is what made it worth having at all.

A formula like E=mc^2 would be worthless and unusable, and never become famous and widely used, if c had been just a guess... exactly as worthless as a price label on meat saying it's "some number of dollars & cents per pound". If you can't really determine the actual price based on weight or the actual weight based on price, then the formula isn't really a formula at all.

The reason it has to be _c_ and not some other "really high number" is the same as the reason why light moves at that speed in the first place. It's not simply the speed of light in the way Mach 1 is the speed of sound or 95 MPH is the speed of a fastball. Those are circumstantial things that are arbitrarily defined/measured by us, and routinely change with the circumstances such as air density without any effect on the rest of the workings of the universe.

But _c_ is the universe's own natural built-in speed, one of the basic fundamental parts of how the universe works, even more truly fundamental than speed and time themselves. It pops up in various ways in the formulas and calculations of physics; the movement of light and the relationship between mass and energy are just two of its roles.

Another is that massless particles carrying other forces also move at that speed, not just the electromagnetic force, so _c_ appears in their equations for range and power and such as well. Also, basing all concepts of length and time on fractions of _c_ is the only way to make mathematical sense out of the distortions of space and time with powerful gravity or acceleration in relativity. It's also found in the definitions of some of the other basic natural physical measuring units of the universe, the Planck units, even the ones that you might not think had anything to do with distance, time, movement, or speed, such as the Planck electric charge, temperature, and mass.

If space and time themselves are broken up into tiny discrete chunks like the picture on your computer screen is broken up into pixels, then _c_ is the speed of one "pixel" of space per one "pixel" of time, one tick of the universe's clock... and that distance, the Planck length, is also the minimum distance at which the laws of particle physics and quantum dynamics can possibly even theoretically have any meaning... and it's derived from _c_.
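The Planck-unit claim above can be made concrete: the Planck length and time are built from c, hbar and G, and their ratio is c by construction. A quick sketch with standard approximate constant values:

```python
# Planck units from c, hbar and G: l_P = sqrt(hbar*G/c^3), t_P = l_P/c.
import math

hbar = 1.054571817e-34   # reduced Planck constant, J*s
G = 6.674e-11            # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8         # speed of light, m/s

l_P = math.sqrt(hbar * G / c**3)   # Planck length, ~1.6e-35 m
t_P = l_P / c                      # Planck time, ~5.4e-44 s
ratio = l_P / t_P                  # one Planck length per Planck time: c, by construction
print(l_P, t_P, ratio)
```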

Also, the way the equation E=mc^2 was developed was not just by seeing that a high number was needed and coming up with one. It was derived from a few other formulas that were already established at the time, by combining & substituting for equivalent terms, cancelling out equivalent negatives and positives, setting certain variables to the convenient "natural" numbers 0 and 1, and such. I don't remember the original formulas it was distilled from or exactly how they were brought together, though...
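One standard modern route to the formula (not necessarily Einstein's original 1905 argument) starts from the relativistic energy-momentum relation and sets the momentum to zero:

```latex
E^2 = (pc)^2 + \left(mc^2\right)^2
\qquad\Longrightarrow\qquad
E = mc^2 \quad \text{when } p = 0 .
```

Expanding the square root for small p instead recovers the Newtonian kinetic energy on top of the rest energy, E ≈ mc^2 + p^2/2m, which is one way to see that the relativistic formula contains the older mechanics as a limiting case.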


----------



## Rawled Demha (Nov 9, 2007)

im sure you're right about much of what you said there delvo, but i do believe that in another 50 years or so E=mc^2 will become as redundant as F=ma, useful in only schools or elementary equations. its just that at present, we do not have a theory that would give us more accurate and reliable results. im not sure, but i feel there must be examples of where relativity falls short, much the same way Newtonian mechanics was lacking.


----------



## Delvo (Nov 9, 2007)

Rawled Demha said:


> i do believe that in another 50 years or so E=mc^2 will become as redundant as F=ma, useful in only schools or elementary equations.


I don't know what you mean by either of those. "Redundant" means doing the same thing that something else already does better or did first, but if one equation accurately describes the relationships between its variables, no other equation can possibly do so, which makes redundancy impossible. And I don't know what would make an equation only useful in the settings you describe and no longer useful in other settings, but that hasn't happened to the one for force, mass, and acceleration. It's still used every day by engineers, developers, and technicians all over the world, in the design, production, and use of all kinds of machines.



Rawled Demha said:


> its just that at present, we do not have a theory that would give us more accurate and reliable results.


If you mean the conversion ratio between mass and energy is not actually the square of the speed of light, you're going against not just the formula's theoretical and mathematical foundation and its connections to various other related phenomena and their equations, but also decades of testing in a variety of different methods that have found it accurate and reliable... so much so that the most common unit in which subatomic particles' masses are measured is also the unit in which their energy is measured, with no distinction at all, which there'd have to be if this formula were inaccurate or unreliable.
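The unit convention mentioned above is easy to demonstrate: particle masses quoted in energy units per c^2 convert to kilograms by dividing the energy by c^2, exactly as E=mc^2 demands. For example the electron, usually quoted as about 0.511 MeV/c^2:

```python
# Particle masses quoted in energy units rely on E = m*c^2:
# converting the electron's ~0.511 MeV/c^2 back to kilograms.
c = 2.99792458e8           # speed of light, m/s
eV = 1.602176634e-19       # joules per electronvolt

m_electron = 0.511e6 * eV / c**2   # kg
print(m_electron)                  # ~9.11e-31 kg, the familiar electron mass
```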



Rawled Demha said:


> im not sure, but i feel there must be examples of where relativity falls short, much the same way Newtonian mechanics was lacking.


There is no doubt of that; it's something the physics community is already aware of and working on. Relativity and quantum mechanics have some contradictory ideas built into them, such as relativity's curved, stretching, squishing space-time as opposed to quantum mechanics's complete lack of such structure to space--a difference which carries with it the fact that one has an explanation for gravity (as a distortion of space-time geometry) and the other has none or a completely different one (as a "force" like electromagnetism and the strong & weak nuclear forces). So far, no test has been possible in which the two theories would lead to different predictions, so tests have really only been able to deal with one or the other at a time, and they both keep getting shown to be right on their own. (Similarly, Newton was also right and still is; it's not a matter of right or wrong in this case, but just when to use which theory.) A large portion of the work of physicists today is dedicated to figuring out how relativity and quantum mechanics can be reconciled into one self-consistent combined theory (which, if it existed, would be called "quantum gravity") or come up with a test in which they should give different predicted outcomes...


----------



## Rawled Demha (Nov 9, 2007)

no, i agree that all of these theories are useful. however, i do think that we are quite a way from the GUT (grand unified theory) which would always give THE right answer, ALL of the time. as you said, with these theories we currently have, the problem is mainly knowing when to choose which theory. the GUT would be the only theory we need.

i view science as the idiot's guide to the universe - all we need to do it is logic. if i bring it back to maths, if you are comfortable with the concept of 1, then you can put 1 + 1 = 2. 

logically, all the rest of maths follows. you can add any numbers together. then you can subtract and multiply. then you can divide. then you can do algebra. and so on. 

when we as a race have mastered science, anybody could solve any problem, simply by using the GUT. the problem most of us have with science is, as you rightly put it (and forgive me for repeating myself), not knowing what to use when.


----------

