The title "Disturbing implications of a cosmological constant" seemed interesting enough to read the article by Dyson, Kleban & Susskind. I will agree that the whole idea of constants is to be handled with care. In a universe that seems to be in constant change, I assume few constants besides change itself. Pretty soon I, again, realize that my thinking and the common sense of science don't play well together. From my perspective, this article, just like all credible articles, is based on assumptions that are highly questionable. The following examples will not do justice to the article as a presentation of an idea. My point is not to criticize the authors' position, but to show some examples of how differently we can apply logic and reasoning.

*Present cosmological evidence points to an inflationary beginning and an accelerated de Sitter end. Most cosmologists accept these assumptions, but there are still major unresolved debates concerning them. For example, there is no consensus about initial conditions.*

I do not agree. There is an absolute consensus in that, whatever the initial conditions, they cannot allow for energy to be generated within the constraints of this condition. One way or the other, everything we have today must have been here/there all along, albeit in a totally different configuration: as vacuum, virtual, potential etc. The consensus is that the universe expands by inflation of what is, not by a "what is"-process generating what becomes next.

*Another problem involves so-called transplanckian modes. The quantum fluctuations which seed the density perturbations at the end of inflation appear to have originated from modes of exponentially short wave length. This of course conflicts with everything we have learned about quantum gravity from string theory. The same problem occurs when studying black holes. In the naive free field theory of Hawking radiation, late photons appear to come from exponentially small wavelength transplanckian modes.*

This is no problem at all if there wasn't such consensus about the initial conditions. If the universe is self-generated from a point of singularity, the transplanckian modes come naturally from the first series of exponential growth. That would have occurred at the speed of light, so there was probably a bang-ish event, but if you ask me, it wasn't big as we normally assume. It

*became* big at lightning speed, it

*grew* bigger at an exponential rate. The rate in question is of course the frequency of energy emission.

*In our opinion both the transplanckian and the late time problems have a common origin. They occur because we try to build a quantum mechanics of the entire global spacetime–including regions which have no operational meaning to a given observer, because they are out of causal contact with that observer. The remedy suggested by the black hole analogue is obvious; restrict all attention to a single causal patch [28, 8, 9]. As in the case of black holes, the quantum description of such a region should satisfy the usual principles of quantum mechanics [2]. In other words, the theory describes a closed isolated box bounded by the observer’s horizon, and makes reference to no other region. Furthermore, as in the case of black holes, the mathematical description of this box should satisfy the conventional principles of linear unitary quantum evolution.*

This is saying a QM of Everything had better disregard some things that have no relation to the observer. Well, then it is not a theory of Everything, but of Empiricism. To manipulate reality, we are advised to put it in a box, enclosed by the observer's point of view. Then we are told that this should satisfy the usual and conventional principles. Of course it will, and if that is the point of science, have a nice day. I prefer to think of these questions in unusual and unconventional ways. Experience tells me that the conventional ways usually don't lead to an answer in this case. In fact, they never do.

*The essential point can be illustrated with an analogy. Instead of the universe, let’s consider a sealed box full of gas molecules. Start with a particular low entropy initial condition with all the molecules in a very small volume in one corner of the box. The molecules are so dense that they form a fluid. When released the molecules flow out from the corner and eventually fill the box uniformly with gas. For some time the system is far from equilibrium. During this time, the second law insures that the entropy is increasing and interesting things can happen. For example, complex “dissipative structures” such as eddy flows, vortices, or even life can form. Eventually the system reaches equilibrium, and all structures disappear. The system dies an entropy death. This is the classical hydrodynamic description of the evolution of a “universe”. But this description is only correct for time intervals which are not too long.*

Now we're at this darn box again. This analogy is almost too dense to comment on. What in this scenario requires the density of molecules to be "in one corner of the box" and not some other, or why not in the absolute center? What is the actual state of the interior of the box, apart from housing a tiny singularity of gas molecules: is it a vacuum, does it have any kind of energy or charge, what material is it made of, does it have inherent velocity relative to an outside of the box, is "force" applied to keep the molecules inside, etc.? We are just assuming a particular situation without questioning its relevance or inherent logic. How can the molecules be "released" if nothing is there to release them (assuming we do not consider divine intervention)? How can a closed system in extremely low entropy have one part (gravity/potential/contraction) sitting in a corner doing nothing, while the other part would be kinetic and run like hell to that corner? Or it is not kinetic, but then one might wonder what makes the system entropy "low". The flaw is not the applied time interval, but that the whole scenario is totally irrelevant in relation to reality. It violates about everything we otherwise consider "true" or law-like. It breaks existing principles in order to produce new principles. Ignoring this problem, the authors focus attention on time intervals instead.
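To be fair, the entropy-increase part of the analogy is easy to demonstrate, whatever one thinks of its premises. Here is a toy sketch in Python; the random-walk dynamics, the grid coarse-graining and every parameter value are my own invention, nothing from the paper. Molecules start packed in one corner, diffuse with reflecting walls, and a coarse-grained Shannon entropy climbs toward its maximum:

```python
import math
import random

def coarse_entropy(positions, bins=4):
    """Shannon entropy of a coarse-grained occupancy histogram
    over a bins x bins grid covering the unit box."""
    counts = {}
    for x, y in positions:
        cell = (min(int(x * bins), bins - 1), min(int(y * bins), bins - 1))
        counts[cell] = counts.get(cell, 0) + 1
    n = len(positions)
    return -sum(c / n * math.log(c / n) for c in counts.values())

def simulate(n=500, steps=400, step_size=0.05, seed=1):
    rng = random.Random(seed)
    # Low-entropy start: every molecule packed into one corner.
    pts = [[rng.uniform(0, 0.1), rng.uniform(0, 0.1)] for _ in range(n)]
    entropies = [coarse_entropy(pts)]
    for _ in range(steps):
        for p in pts:
            for i in (0, 1):
                # Random-walk kick, reflecting at the walls of the box.
                p[i] += rng.uniform(-step_size, step_size)
                p[i] = abs(p[i])                       # reflect at 0
                p[i] = 2 - p[i] if p[i] > 1 else p[i]  # reflect at 1
        entropies.append(coarse_entropy(pts))
    return entropies

ent = simulate()
print(f"entropy: start {ent[0]:.3f}, end {ent[-1]:.3f}, max {math.log(16):.3f}")
```

The entropy starts at zero (one occupied cell) and ends near log(16), the maximum for a 4x4 grid. Note that this demonstrates only the second-law behaviour once the odd initial condition is granted; it says nothing about where that condition came from, which is exactly my complaint.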

*The implication of such a description, as we have suggested in Section (1), is that Poincare recurrences are inevitable. Starting in a high entropy, “dead” configuration, if we wait long enough, a fluctuation will eventually occur in which the inflaton will wander up to the top of its potential, thus starting a cycle of inflation, re-heating, conventional cosmology and heat death. The frequency of such events is very low.*

No, starting in a dead configuration, the system will remain dead, no matter how long we wait. This is where science always turns to religion and mysticism, and I will not follow that path. If truly dead, it will not resurrect in any physical sense. That is absolutely true, and no elaborate equations or statistical magic wands can change that. They can be true to scientific convention and usual thinking, but the universe couldn't care less, nor can I. What

*is* possible on the other hand is that what seems to be dead is in fact alive and kicking, but we regard it as dead because we cannot observe this by empirical means. That is the flaw mentioned earlier: to disregard what is not in contact with the observer. If you avoid that, the symmetry breaking of the initial singularity can be understood naturally, without need for mysticism and weird phenomena like infinite density, by means of "fluctuations" in a "vacuum space" suddenly exploding out into empty space. The frequency of such events is indeed low, as they are of life being born from death.
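As for that "frequency of such events", the exponential suppression behind it can at least be made concrete. A toy sketch (again entirely my own toy, nothing like the paper's de Sitter calculation): take n two-state "spins" re-randomized at every step, and measure the mean waiting time before the system stumbles back into the single ordered configuration. It grows like 2^n, i.e. like e^S in the entropy; scale n up to anything universe-sized and "very low frequency" means "never, for all practical purposes".

```python
import random

def mean_recurrence_steps(n_spins, trials=200, seed=0):
    """Mean number of fully-mixing steps before an n-spin system
    revisits the special all-up ("ordered") configuration."""
    rng = random.Random(seed)
    ordered = (1 << n_spins) - 1  # bit pattern with every spin up
    total = 0
    for _ in range(trials):
        steps = 1
        # Re-randomize all spins each step: a maximally mixing toy dynamics.
        while rng.getrandbits(n_spins) != ordered:
            steps += 1
        total += steps
    return total / trials

# Waiting time roughly doubles with every added spin: ~2**n = exp(entropy).
for n in (4, 8, 12):
    print(n, mean_recurrence_steps(n), 2 ** n)
```

This illustrates the statistics the authors rely on, not whether a "dead" equilibrium state has any physical business fluctuating at all, which is the point I dispute above.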

Enough for now. I will go back to imagining the Planck length as the least distance of nothing, as opposed to something. I believe that is the interface of interaction/fusion, where quantum systems functionally face each other.

But then again, I'm just a crackpot with a foil hat. Everyone knows there cannot be a measure of "nothing". That's just stupid.