Perturbative string theory reduces those ~20 constants to one (the string coupling constant), plus a choice of vacuum, which is of course the crux of the problem: as far as we know, the effective field theories we observe don't map uniquely to a particular high-energy string theory vacuum. It should also be noted that while those constants are indeed important, the initial conditions are potentially more important, and a great deal of effort has been put into explaining why, from fairly generic initial conditions, we would expect the universe as we observe it to arise (inflation theory).
String theory was supposed to simplify everything by replacing all the vertex factors with the free space propagator for a string. But then it turned out that string theory wasn't that simple and actually made things even more complicated...
I think that's something any software engineer can relate to.
(please correct me if my summary of string theory is incorrect, I only have a vague understanding of it)
How many dimensions are there in the choice of vacuum? My understanding was that it was still "many", in which case this merely reframes the problem rather than solving it.
The answer to this question has changed over time, and my knowledge of the history is incomplete to say the least. During the first superstring revolution, when it was understood that perturbative string theory would give a description of perturbative quantum gravity, the goal was to demonstrate that only a few choices of compactification manifold were available from consistency arguments (anomaly cancellation and unbroken supersymmetry in four dimensions; probably the most well-known paper is "Vacuum configurations for superstrings" by Candelas et al.).
Ignoring heaps of details, the result was that the vacuum manifold (that is, a solution to the 10d vacuum Einstein equations) had to be some 4-dimensional symmetric space (usually taken to be Minkowski space) times a Calabi-Yau manifold, i.e. a Kähler manifold with vanishing Ricci curvature (these admit covariantly constant spinors, which you need for unbroken supersymmetry).
Initially it was believed that with additional physical consistency conditions, and by relating the geometry of the Calabi-Yau manifold to known physical parameters like the number of generations of particles, it would be possible to narrow the set of "stable" vacua down to a limited number.
While a fair number of dualities between different string theories were discovered, the picture changed with the discovery of D-branes and flux compactifications. Flux compactifications stabilize the moduli of a huge number of potential vacua: the general idea is that the space of Calabi-Yau manifolds can locally be parameterized as a finite-dimensional manifold, and those parameters would show up as massless fields that aren't observed, if there weren't any fluxes they couple to. As a result, the number of potentially stable vacua is now sometimes quoted as 10^500.
In itself this is not too concerning if you consider that there is an infinite number of solutions to the Einstein field equations and general relativity is still a very predictive theory. In practice this means that you carefully engineer the vacua you study, for example with F-theory techniques, in order to relate them to observed phenomena.
You are right that it just reframes the question: the low-energy effective field theory coupling constants turn up as vacuum expectation values of certain fields, but in an interesting way. The idea is then to use relations and identities discovered in string theory (for example AdS/CFT) to study the low-energy theories.
Yes, by no means let me be interpreted as claiming that it's not an interesting reformulation. I'm just saying that, for example, trading 20 arbitrary parameters for 20 differently inter-related arbitrary parameters is still just a reformulation of the question of where those 20 dimensions of information came from for our universe.
(And when I put it that way, in information theoretic terms, one notices that we must also take as parameters the description of the system that consumes those 20-some parameters in the first place. Even if one did come up with an answer for the parameters that question would remain below it. Of course in the end you eventually and inevitably are going to hit some form of "Just Because".)
Works for creationists, though. Their constant is "God" who they then define as a number of infinities on various axes. Then they tell you that you're not allowed to question it.
Isn't the consensus about string theory that it replaces the Standard Model because it is fine-tuned to the knowledge we already have? I mean, given any set of data, you can create infinitely many theories and equations for how to get the numbers within that particular data set. But only some of these theories will predict numbers not yet in the data set. This seems to be the case with string theory. It describes the Standard Model in an alternative way, and the equations often hold up, but I don't remember a case where it predicted anything correctly. Instead, every time new information is found, string theory itself is changed to match the new data.
String theory, much like gauge theory, is not fine-tuned to any set of observed phenomena, and for the most part they don't make numerical predictions. What they do predict is the general shape of cross sections for different particle species, and corrections to them in the case of gauge theories; in much the same way one can work out various string scattering cross sections. In the case of gauge theories this is how you determine which of them fits the measured cross section best. In the case of string theory this failed when it was still believed to be a theory of the strong interaction. It was eventually realized that the string scale lS is related to the Planck length lP by
lP = lS * gS^{1/4}
where gS is the vacuum expectation value of the dilaton. It is believed to be of order one, so lS is ~10^-35 meters. The currently accessible length scale at the LHC is ~10^-18 meters, so there are 17 orders of magnitude between them, and in particular no reason to expect the string scattering cross sections to relate to anything we can observe directly.
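To put numbers on that gap (a back-of-the-envelope sketch only; the Planck length and LHC scale below are the usual order-of-magnitude values, not anything derived from string theory):

    # Rough sketch of the scales quoted above (order-of-magnitude values only).
    import math

    l_planck = 1.6e-35               # Planck length in meters
    g_s = 1.0                        # dilaton VEV, assumed to be of order one
    l_string = l_planck / g_s**0.25  # from lP = lS * gS^(1/4)
    l_lhc = 1e-18                    # shortest length scale probed at the LHC, in meters

    print(f"string length  ~ {l_string:.1e} m")
    print(f"LHC resolution ~ {l_lhc:.1e} m")
    print(f"separation: ~{math.log10(l_lhc / l_string):.0f} orders of magnitude")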
What is found instead is that string scattering amplitudes assemble in a way that predicts that the massless string excitations form field theories already known: type IIB supergravity, type IIA supergravity, SO(32) and E8 x E8 gauge theory coupled to supergravity, and a few others. What is striking about those initial calculations is that, from field theory arguments, those field theories are the only supersymmetric ones in 10 dimensions incorporating gravity.
Perturbative String theory in its current form makes most of the model input data part of the "vacuum geometry", much in the same way Einstein did with general relativity. String theory in addition has the advantage that it gives a consistent perturbative description of quantum gravity.
String phenomenology, that is, the study of string theory with the purpose of making low-energy predictions, is still in its infancy. Its value at the moment is primarily in coming up with consistent extensions of the Standard Model. The alternative of conservative step-wise refinement (conjecture additional particles and their coupling mechanisms to the known particles, calculate cross sections, and hope that they aren't ruled out by experiment yet, for example by causing existing particles to decay that aren't supposed to, like the proton) is still the majority approach. Dark matter and other theoretical considerations tell us, however, that the Standard Model is incomplete.
The attractiveness of the string theory approach is that in some ways it is more restrictive, as you have to satisfy additional geometric constraints. In standard gauge theory the only consistency constraints are unitarity and locality, which predict that fundamental particles will have spin 0, 1/2, 1 and 2, and renormalizability, which restricts you essentially to Yang-Mills theory plus some extensions (supersymmetry among others). What isn't fixed is the gauge group, the precise way fermions couple to each other, etc. Some of those can't be fixed because things like the coupling constants aren't actually constant but depend on the energy scale (their values at different scales are related by the renormalization group).
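As a heavily simplified illustration of "couplings aren't actually constant", here is the textbook one-loop leading-log running of the QED coupling with only the electron loop kept (a toy calculation, not the full Standard Model running):

    # One-loop, leading-log running of the QED coupling, electron loop only.
    import math

    alpha_0 = 1 / 137.036   # fine structure constant at zero momentum transfer
    m_e = 0.000511          # electron mass in GeV
    m_z = 91.19             # Z boson mass in GeV

    def alpha(q):
        """Leading-log one-loop QED coupling at momentum scale q (GeV)."""
        return alpha_0 / (1 - (alpha_0 / (3 * math.pi)) * math.log(q**2 / m_e**2))

    print(f"1/alpha at Q = m_e : {1 / alpha(m_e):.1f}")  # ~137
    print(f"1/alpha at Q = m_Z : {1 / alpha(m_z):.1f}")  # ~134.5 with only the electron;
    # with all charged fermions included, the measured value is closer to 1/128.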
Most likely, if we find some new particles at the LHC it will be possible to give them a low-energy effective field theory description (maybe involving supersymmetry), provided we don't find things like extra dimensions. In other words, it is unlikely that we will be forced to consider string theory in order to understand physics at the LHC scale; its value is that it potentially offers better conceptual tools for coming up with supersymmetric extensions of the Standard Model.
Numerical constants alone don't fully specify the universe. They don't account for the laws that relate numerical constants to each other. What's missing is a sort of structural constant -- a minimal set of laws that describe all physical phenomena.
If the universe were the execution of a program, knowing these numerical constants would be like knowing all the... well... constants, in the source code, but missing all the code that binds them together.
You are viewing these constants as numbers in memory in a programming language. In physics, a constant actually stands for a law that describes an unchanging way in which things interact. Think of these constants not as simple numbers (1, 0, -1, etc.), but as constant ways things interact (1 always attracts -1, etc.).
You're right, constants are only "half" of the equation, so to speak. The other half is the equations themselves. In information theory, the combination of the two is usually referred to as the Minimum Description Length (MDL).
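If it helps, here's a deliberately crude toy sketch of the MDL idea; the bit-counting scheme below is made up for illustration, not a real MDL estimator:

    # Toy MDL illustration: total description length =
    # cost of stating the model + cost of the data encoded with that model.
    import math

    data = "01" * 500   # 1000 bits with an obvious pattern

    # Model A: "no structure" -- store the raw bits.
    cost_raw = len(data)

    # Model B: "repeat '01' N times" -- a short rule plus the repeat count.
    model_bits = 8 * len("repeat '01'")         # crude cost of stating the rule
    count_bits = math.ceil(math.log2(500)) + 1  # bits to encode the repeat count
    cost_model = model_bits + count_bits

    print(f"raw encoding : {cost_raw} bits")
    print(f"rule + count : {cost_model} bits")  # a much shorter total description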
It would be interesting to know what is the Kolmogorov complexity of the complete description of our universe. Or for that matter, what is the Kolmogorov complexity of our current understanding of the universe.
Is anybody here familiar with the book by Peter Rowlands? [1]
> Unique in its field, this book uses a methodology that is entirely new, creating the simplest and most abstract foundations for physics to date. The author proposes a fundamental description of process in a universal computational rewrite system, leading to an irreducible form of relativistic quantum mechanics from a single operator. This is not only simpler, and more fundamental, but also seemingly more powerful than any other quantum mechanics formalism available. The methodology finds immediate applications in particle physics, theoretical physics and theoretical computing. In addition, taking the rewrite structure more generally as a description of process, the book shows how it can be applied to large-scale structures beyond the realm of fundamental physics
I attempted to do something similar with the CKM matrix for my undergraduate thesis. It didn't really work, but it was fun.
The CKM matrix is annoyingly almost symmetric: the off-diagonal elements are almost the same magnitude. But they're not, so bah! Basically you spend a lot of time trying to come up with simple first-order relations between the various quantities. Ideally you should be able to eliminate fundamental constants by writing them in terms of one another.
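To show what I mean by "almost symmetric", here is a quick check of the mirror off-diagonal magnitudes; the numbers are rough PDG-style values from memory, so treat them as illustrative only:

    # Compare magnitudes of mirror off-diagonal CKM elements (rough values).
    import numpy as np

    ckm = np.array([
        [0.974,  0.225, 0.0038],  # |V_ud|, |V_us|, |V_ub|
        [0.221,  0.973, 0.041 ],  # |V_cd|, |V_cs|, |V_cb|
        [0.0086, 0.040, 0.999 ],  # |V_td|, |V_ts|, |V_tb|
    ])

    for i in range(3):
        for j in range(i + 1, 3):
            print(f"|V[{i}{j}]| / |V[{j}{i}]| = {ckm[i, j] / ckm[j, i]:.2f}")
    # The (0,1)/(1,0) and (1,2)/(2,1) pairs are close to 1, but the
    # (0,2)/(2,0) pair is off by roughly a factor of two -- hence the "bah!".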
Is this from the article? I'm not sure how phi relates to the speed of light, Newton's G, Planck's constant, or the numerous other force and energy constants described in the article.
Yes, it's sad. It's as if dimensions/units don't matter. There was a story on NPR about a quack who thought he could prove Einstein wrong. It turned out the guy couldn't (or wouldn't?) differentiate between momentum and energy. All his units were mixed up, and he didn't even see the flaw in that.
You're getting downvotes because you're making an assertion (that phi is related to many cosmological constants) but are consistently not backing it up or citing any specific sources.
It doesn't mean anything to compare a constant to phi unless that constant is itself dimensionless. What low level physical constants (and relations between them and phi) do you have in mind?
The idea is that in many particle physics calculations you don't use the charge of the electron alone. Every time it appears, it's multiplied by other constants like c or h, so you redefine it as a new constant that is the usual product you have to put in the calculations.
The problem with constants that have units is that they mix real physics with the arbitrary choice of measurement units, like the time the Earth takes to do a complete spin divided by 24, by 60 and by 60, and other completely arbitrary choices.
It's difficult to say if "e", the charge of the electron, is big or small. But in many calculations you can use alpha, which is clearly a small number (~1/137), and try to use perturbation theory to get a good approximation of the actual result. (You can imagine this as a lot of Taylor approximations.) (There are a lot of technical details hidden in these calculations that can make a mathematician cry but a physicist happy.)
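Concretely, the dimensionless combination in question is alpha = e^2 / (4*pi*eps0*hbar*c); a small sketch with the usual rounded constant values:

    # Fine structure constant from SI constants (rounded CODATA-style values).
    import math

    e = 1.602176634e-19      # elementary charge, C
    hbar = 1.054571817e-34   # reduced Planck constant, J*s
    c = 299792458.0          # speed of light, m/s
    eps0 = 8.8541878128e-12  # vacuum permittivity, F/m

    alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
    print(f"alpha   = {alpha:.6f}")      # ~0.007297
    print(f"1/alpha = {1 / alpha:.2f}")  # ~137.04, with the units cancelling out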
I suppose his answer might be more philosophical. All things in this universe are shards of the same one universe. Shards of this one universe thinking about what this universe is itself made of. As far as we know from the fact the constants cannot be derived from any other principle and must be measured directly, the universe is made of itself. The universe is one.
The one-electron wiki entry literally blew my mind. I read the first sentence and thought "absolutely not". Then as I read further into it I thought "give it a chance". My logic tells me this can't be the case... but man, this is a wild theory to me.
The theory is more an amusement than anything, but as the article points out, it seems it led to a major impact with the exposition of CPT symmetry. It makes sense if you know of Feynman's earlier work with Wheeler (http://en.wikipedia.org/wiki/Wheeler%E2%80%93Feynman_absorbe...) which is actually a consistent theory of classical electrodynamics (i.e. half the electromagnetic interactions are "coming from the future").