For a while, I have wanted to write this post about a semantic disagreement over questions like “can the laws of physics change?” or “could physical constants change over time?”. Answering really boils down to your ideas about time, laws and complexity, but let me first introduce the topic as a whole.
Introduction – Frame of reference
Time translation symmetry (the claim that physical laws don’t change with time) is fundamental to physics (and, if you accept the supervenience picture of the different sciences of matter, to chemistry, biology, and so on as well). It is usually taken as an observation or a measurement, so within an epistemic or positivist context, a fact of the world. In my opinion it is always a problematic part of a theory to take facts for granted; there is a reason why different realist theories exist. And I think nobody should, or probably would, claim that cosmological constants, field equations and the like can’t change with time: because of the practical impossibility of induction, you can never be sure of any law derived from measurements.
However, physicists often claim that cosmology, for example, is also a study of such metaphysical (as in, about physics) claims about laws, because measuring the observable universe is also measuring the past of the universe: thanks to the finite speed of light, it is measuring how the known or yet unknown laws unfolded within those spacetime boundaries of the cosmos. We can and do use the techniques of astronomical spectroscopy to measure many properties of distant stars and galaxies, such as their chemical composition, temperature, density, mass, distance, luminosity, and relative motion. It turns out that in all directions of space, no matter how far away, the light coming from distant stars and galaxies was produced within stars by the exact same process of stellar hydrogen fusion. This process works the same throughout time and space, and so the stars work the same way throughout time and space. Stellar hydrogen fusion depends on a considerable number of physical laws, so by mere observation of the light coming from distant stars and galaxies, as noted by theoretical astrophysicist David N. Spergel, we observe that physical laws haven’t changed with time: they were constant in the observable past (non-accessible) spacetime.
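One way to see why spectroscopy can test this at all: cosmological redshift stretches every spectral line by the same factor (1+z), so wavelength *ratios* between lines are a fingerprint of the underlying atomic physics, independent of distance. A minimal sketch (the redshift value and the helper function are mine; the Balmer line wavelengths are standard lab values):

```python
# Sketch: redshift multiplies all emitted wavelengths by (1 + z),
# so ratios between lines test whether atomic physics works the same far away.

# Laboratory wavelengths (nm) of two hydrogen Balmer lines.
H_ALPHA_LAB = 656.28
H_BETA_LAB = 486.13

def redshifted(wavelength_nm: float, z: float) -> float:
    """Observed wavelength of a line emitted by a source at redshift z."""
    return wavelength_nm * (1.0 + z)

z = 0.5  # a hypothetical distant galaxy
ratio_lab = H_ALPHA_LAB / H_BETA_LAB
ratio_obs = redshifted(H_ALPHA_LAB, z) / redshifted(H_BETA_LAB, z)

# The (1 + z) factor cancels, so an observed ratio matching the lab ratio
# means the same atomic physics produced both spectra.
print(abs(ratio_obs - ratio_lab) < 1e-12)  # True
```

If the constants governing atomic transitions had been different in the distant past, these ratios would come out measurably shifted; that they don’t is the empirical content of the claim above.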
It’s important to note here that what these findings do prove, and about which there seems to be little controversy, is that the laws of physics didn’t change in the observable past spacetime of our cosmos. Induction really comes into play when we extrapolate these findings into the “future”, or rather state a different thesis: that based on this, the laws of physics won’t or can’t change over time, and that it makes sense to judge scientific hypotheses by their predictive value. The point is that it would be problematic enough to claim otherwise, because the past is already non-accessible to us; from our perspective, no change is taking place in that spacetime region of the cosmos.
Still framing these questions with reference to our past: have physical constants changed with time? The fundamental laws of physics, as we presently understand them, depend on about 25 parameters, such as Planck’s constant h, the gravitational constant G, and the mass and charge of the electron. It is natural to ask whether these parameters are really constants, or whether they vary in space or time. Over the past few decades, there have been extensive searches for evidence of variation of fundamental “constants.” Among the methods used have been astrophysical observations of the spectra of distant stars, searches for variations of planetary radii and moments of inertia, investigations of orbital evolution, searches for anomalous luminosities of faint stars, studies of abundance ratios of radioactive nuclides, and (for current variations) direct laboratory measurements. So far, these investigations have found no evidence of variation. The current observational limits for most constants are on the order of one part in 10^10 to one part in 10^11 per year. So, to the best of our current ability to observe, the fundamental constants really are constant.
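To get a feel for what those per-year limits mean cumulatively, here is a small arithmetic sketch; the constant-drift assumption and the function name are mine, and the numbers are the ones quoted above:

```python
# Arithmetic sketch: what a per-year drift limit on a "constant" would
# allow cumulatively over the age of the universe, assuming (as a
# simplifying assumption) a steady linear drift the whole time.

AGE_OF_UNIVERSE_YR = 13.799e9  # current age estimate, in years

def max_cumulative_change(per_year_limit: float,
                          years: float = AGE_OF_UNIVERSE_YR) -> float:
    """Upper bound on the total fractional change under constant drift."""
    return per_year_limit * years

# The observational limits quoted in the text:
loose = max_cumulative_change(1e-10)  # ~1.38, i.e. order unity
tight = max_cumulative_change(1e-11)  # ~0.14, i.e. roughly 14 percent

print(f"1e-10/yr allows up to a fractional change of {loose:.2f}")
print(f"1e-11/yr allows up to a fractional change of {tight:.2f}")
```

Notice that even these impressively small per-year limits would, if saturated by a steady drift, permit order-unity change over cosmic time; the stronger statements come from direct comparisons across large lookback times, like the spectroscopic observations above.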
Light, and these laws derived from constants, may be taken as a “constant enough” experience, but given that thermal equilibrium is a possible outcome for the whole universe (or rather a limiting case of our theories, like the black holes that will be mentioned in a moment), we are only talking about scales. Light may be taken as invariant at your scale, but what about bigger scales? We could only test our “direct observations” within such a tiny frame of context that I’d be interested in how someone would go about explaining singularities within a cosmologically constant context.
The cosmic microwave background is the oldest light in the universe, dating to the epoch of recombination about 378,000 years after the Big Bang. The current measurement of the age of the universe is 13.799±0.021 billion (10^9) years. So we have good evidence that electromagnetism, chemistry, gravity and the nuclear forces all worked as they do today for a very large part of the entire lifetime of the universe, and indirect evidence that the standard model of subatomic particle physics dates all the way back to when the universe was only one second old. All measurements made to date, from across space and time, indicate that the laws of physics settled down to their current values very shortly after the Big Bang event.
My objection is really about using our scales: the age of the universe may seem like a really big number and not a tiny fraction of spacetime, but it is really nothing compared to, for example, the estimated time required for a Boltzmann brain to appear. The different epochs within the Big Bang framework actually seem to me proof that these constants can change over time, as new formations of matter emerge and influence how other particles behave. What is really interesting are the limiting-case scenarios here: for example, the Big Bang event itself and black holes, or singularities in general, may be such wild “fluctuations” that they break down or change these laws so rapidly that, if our theories are to include these phenomena, we need to make them more and more complex. Up to what complexity, and at what scale?
It is often claimed that the laws of physics settling down to their current values once the universe had expanded, and hence cooled, enough is not change over time but change under extreme conditions, conditions which only exist very near singularities. But what is the difference between change over time and change under extreme conditions, if those conditions emerge over time? Under general relativity it should be suspicious to handle space and time as separate coordinates of events; in such general theories space shouldn’t be separated from time, since everything that happens in space happens at a certain time as well. Everything nature does is natural, an action and an effect. We shouldn’t distinguish spontaneous nuclear emissions from ones occurring in a laboratory, or initial singularities from other events, to the extent that the distinction would let us introduce meta-laws and consider those constant.
Someone could cite quantum fluctuations as an exception to everything having a cause, for example, and claim that they apparently have no cause. The real issue here is that (meta-)physical laws are riddled with such exceptions, or limiting cases, which in fact get called singularities, dark matter and such. They show us the applicable range of phenomena, telling us that our laws are not yet proven to be universal, mostly pointing to the future as a hindrance to their predictive power. As the observable part of our universe grows through better measurements, we seem to get more and more general theories of matter, because fewer of these extreme cases remain to be seen.
What time translation symmetry claims is that, for example, the weak nuclear force would never work differently under the same circumstances. Only the conditions matter: whether the conditions obtain now or in 100 billion years, the laws are still constant. Water would always have boiled under the right circumstances and always will, even if we didn’t know about it. The fundamental forces of the universe working differently under near-singularity conditions doesn’t mean that the laws change at all; it just means that they are more complex than we initially thought.
Let’s take a little break here with a related interview.
Are there physical laws at all?
Why don’t we include time itself among the conditions of phenomena? If we did, the epochs of the Big Bang might be enough to prove, in a certain way, that the laws of physics do change. By definition we call relations laws if they have predictive value, which is just another way of saying that they are constant, or believed to be constant. How complex could these theories and laws get? At what complexity of laws would the so-called extreme conditions disappear? The problem only seems to be semantic because we define scientific laws themselves by these principles, so we could just as well ask whether there are such laws at all. Even meta-laws introduced to govern the potential change of laws may themselves be made laws by the very definition, so we can’t argue about this system from within the frame of these constants.
These are the main concerns of my inquiry into the issue, and they lead straightforwardly into the metaphysical domain of change (constancy) and time, or potentiality (prediction), itself. It’s important to see that where I may discover a potential infinite regress into the domain of meta-laws, it becomes not even a theoretical issue of physics, because in (semantic) practice there is not a single difference between different laws as far as theory is concerned. They are always the relations that remain the same; whether you take time into the context of conditions or not, they may either get discarded or become different laws (meta-laws) for different epochs, or, without time included, they may simply have extreme conditions (singularities) understood as exceptions.
The past exists in a single way, in either physical or mental memories, while the now is happening; to understand the nature of change and of laws themselves we need to turn to the now, not to the past. Not just by the definition of laws as constant relations but by the definition of the past as well: we consider every event that is constant as past. You can frame it in different ways and consider the past as constant, or everything constant as past; the fundamental difference between now and past is always the same, namely that past events are not accessible to any (observer) participants of the universe. The point of such definitions is to say the least possible about time and change itself, and to avoid making our work even more complex than it already is.
Are there constant relations at all? For constant relations you may need constant participants (monads, dhammas, matter, particles, etc.), or else you risk entering the domain of the previously mentioned extreme conditions, where the relations become inapplicable, since we are clearly not stating such universal metaphysical laws as the most universal statement that “everything exists”. The most universal and constant relation would be existence itself, as everything related to something can be said to exist. When we consider physical laws, or stricter metaphysical laws, they must have a certain domain of application, a set of participants they relate.
We can probably agree that the set of these participants must be constant enough for the law even to appear. Electrons, for example, must be constant enough to produce any effect we can measure and see as a constant relation, whether between other electrons or between different participants, even if just virtual particles. For constant relations we will need constant or identical participants, so a theory of such physical laws will be equivalent to a theory of change, which in turn is equivalent to a theory of time.
The difference between constant and identical participants is not great in this framework, because the relation itself gives us the reference for calling something identical, as long as the relation is applicable to the participant. In other words, we can call something an electron if it behaves (in motion and in every measurement) according to our theorized relationship between electrons and different phenomena. The criterion of constancy is the same: such an electron hasn’t changed if it still behaves according to our theory, and especially according to the instantiated attributes gained through measurements.
As can be seen from that last statement, at the scale of instantiation for example, quantum phenomena can be theoretically problematic, to the extent of John Wheeler postulating the idea of the one-electron universe, in which there is only a single instance of the particle, existing in different spacetime regions.
I don’t want to get into the epistemological debates about the quantum realm, as that is a huge topic in itself; let’s just regard constancy and identity as pretty much the same for our issue of laws. Just how constant could laws be without constant participants? Not only that, but their real predictive value depends on the constancy of the involved sides too, which may change quite a lot over spacetime variations. Scaling is very important here, because if we consider the option of no change ever happening again after a certain period of time (the well-known heat death scenario), then these laws really govern a tiny spacetime dust of chaos before an eternal constancy. We get to see the laws of a turbulent phenomenal storm before the real events of a “no change” cosmos, with a very different and uneventful scenario of physical laws. We could just as well say that we are still in merely another epoch of the Big Bang.
Applying this to the earlier discussion: for the law of boiling water to be considered constant, the compound known as water should have universal constancy. In the future there may be no such thing as water, which would be required for the law to count as a constant law. Of course, even if water suddenly boiled at a different temperature, shifting to meta-laws governing the boiling of that compound, or outright switching the law itself, would not be considered change on semantic grounds, so it could just as well be argued that changing participants don’t mean changing laws.
We have really come full circle, because it is against the very definition of laws for them to change. But again, if we understand that definitions are mind-made categories, then it begs the question whether such laws even constitute reality, and we need to reduce them to a concept of nature and/or to the participants of the laws themselves to answer such a question. The conservation of energy, for example, after a thorough analysis stripped down to the bare phenomenon, only says that something does remain constant, that there are constant participants constituting reality.
Is physics just applied structural realism?
When we leave the semantic ground behind, it really seems that compounds and matter may change and even disappear, but the foundational centre of physics, finding the constant laws of reality, could just as well remind us of structural realism: the idea that only relations, and therefore structure, are real (constitute reality). If I call physics applied structural realism, mathematics itself could become theoretical structural realism, but that really depends on one’s view about numbers themselves.
To get to the core of this issue about laws: they are really relations between different participants, as stated above, while a structure is really just a set of relations, whether in a system or not. (A system here can be understood as the structure of relations.) A stance on what is constant is really a stance on realism, because any kind of realism at its core deals with the question of what is constant, what is unchanged “at the deepest layer of phenomena”. In Abhidhamma it is no surprise that it could just as well be called dhamma realism, because according to that framework dhammas constitute the so-called ultimate reality.
It’s interesting to see how philosophical issues are often considered just a semantic problem by positivists, because the case of shifting to meta-laws could already be considered metaphysical (again, as in about physics) by the analytic philosopher, just like Noether’s theorem, which is a statement about the relation between laws and constants. The theorem informally states that to every differentiable symmetry generated by local actions there corresponds a conserved current. Yet there is a qualitative difference between the universality of such theorems, which still have physical symmetries themselves as their domain, and what is philosophically considered metaphysics and arguments about realism, where the domain is usually relations of any kind.
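The simplest textbook instance of the theorem is the case most relevant to this post: time translation symmetry itself yields conservation of energy. For a Lagrangian L(q, q̇) with no explicit time dependence, a standard derivation runs:

```latex
% Time-translation symmetry implies energy conservation (Noether, sketch).
% Assume \partial L / \partial t = 0, so that
\frac{dL}{dt}
  = \frac{\partial L}{\partial q}\,\dot q
  + \frac{\partial L}{\partial \dot q}\,\ddot q .
% Substituting the Euler-Lagrange equation
%   \frac{d}{dt}\frac{\partial L}{\partial \dot q} = \frac{\partial L}{\partial q}
% into the first term gives a total derivative:
\frac{dL}{dt}
  = \frac{d}{dt}\!\left(\frac{\partial L}{\partial \dot q}\,\dot q\right)
\quad\Longrightarrow\quad
\frac{d}{dt}\left(\frac{\partial L}{\partial \dot q}\,\dot q - L\right) = 0 ,
% and the conserved quantity in parentheses is the energy E.
```

So the assumption this whole post interrogates, that the laws don’t change with time, is precisely the symmetry from which energy conservation follows; doubting the former means doubting the latter.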
Summary – Could metaphysics be enough to decide?
I wrote this post to clarify my thoughts about the issue, not to settle it. It is certainly a long way to go just to state a relation between physics and structural realism, but it often appears that certain applications of our categories and structures have wider universality than they actually do. I didn’t even mention the issue that whenever we talk about nature, we ought to include participants acting according to their knowledge of these laws; humans, whom I mainly mean here, may be agents deciding based on it, leading to the issue of emergence I mentioned in an earlier post on this blog but didn’t write about extensively for lack of time and research.
Physics is concerned with measurable entities, and the constancy of the relations between these entities depends on the constancy of the entities themselves, unless we base their constancy on the relations, defining the relata by the relation, which would be structural realism. In other words, if we think that relations are still “there” and real even if there are no relata belonging to their domain, that they don’t emerge out of entities or processes behaving a certain way and have no base of their own, then we put the structure before the structured.
Either way, a subsequent discussion should be able to explain the constancy of either the structured or the structure itself, leading to the same debate. That debate would involve a proper theory of change, and therefore of time itself, and should inevitably touch the subject of transitions and identity, while trying to settle a fundamental difference between the structure and the structured: whether we can talk about relations of relations as well, seeing them as just another type of entity, with or without applicable measurements, and therefore also debating the nature of matter.