Wednesday, April 25, 2018

Too thin em waves in graphene

I received from Sebastian a link to a popular article about the finding that light can be squeezed into a much smaller volume than the wavelength. A naive application of the Uncertainty Principle suggests that this is impossible.

The finding is interesting from the point of view of the classical limit of TGD. So-called massless extremals (MEs), or topological light rays, are extremely general solutions of the field equations (practically independently of the details of the action principle: it suffices that it is general coordinate invariant). The counterparts of MEs are not possible in Maxwellian electrodynamics, but TGD allows them because of the extreme non-linearity of the underlying geometric variational principle, for which the topological half of Maxwell's equations - the pair involving no currents - is identically true.

MEs are 4-surfaces describing the propagation of massless topological field quanta of induced classical fields, characterized by a light-like propagation direction and a polarization orthogonal to it. The classical 4-momentum is light-like. Propagation occurs with maximal signal velocity and without dispersion, so that pulse shape is preserved. If there are several pulses, they must propagate in the same direction. The analogy is the propagation of a laser beam in a waveguide. MEs are therefore ideal for targeted communication, and they play a key role in TGD inspired quantum biology. I interpret them as space-time correlates of Bose-Einstein condensates of photons.

MEs have a finite size scale in the directions orthogonal to the direction of propagation, and they can be arbitrarily thin. I do not see any reason why they could not be thinner than the wavelength. Graphene seems to provide a situation in which classical modelling by MEs makes sense.

The QFT limit is obtained from many-sheeted space-time by replacing the many-sheeted structure with a region of Minkowski space made slightly curved. Gauge potentials and gravitational fields are sums of the corresponding induced fields at the space-time sheets, so that the effects of these fields on a test particle sum up even though the fields themselves reside at different space-time sheets. The linear superposition of Maxwell's theory is replaced with a set-theoretic union of space-time sheets in M4×CP2: the effects of the fields of the sheets on the test particle sum up just as fields superpose in Maxwell's theory.

For instance, this allows a situation in which one has two MEs describing the propagation of signals in opposite directions as far as the effects on test particles are concerned. This gives rise to standing waves, which are not possible in TGD as single-sheeted extremals. Lorentz transforms give analogs of em signals propagating with arbitrary velocity smaller than light velocity. Even field patterns whose QFT limit corresponds to vanishing fields, because the effects on test particles cancel, are possible: both sheets nevertheless carry non-vanishing fields with non-vanishing energy-momentum density.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

New evidence for blackhole echoes

More than a year ago I wrote about the evidence for blackhole echoes, which suggests that the blackhole created in the blackhole collision detected by LIGO is not quite what it is expected to be. The echoes suggest a two-layered structure such that the gravitational radiation reflects back and forth between two horizon-like surfaces. In the TGD framework these surfaces would correspond to the ordinary blackhole horizon (more or less) and an inner horizon at which the induced metric transforms to a metric with Euclidian signature: this is possible and also unavoidable in the TGD Universe. Now Sabine Hossenfelder has written a blog article about new evidence for blackhole echoes found by the group of Niayesh Afshordi, professor of astrophysics at Perimeter Institute. The 2.5 sigma evidence has grown into 4.2 sigma evidence; 5 sigma is regarded as the criterion for discovery.

The TGD based model is described in the chapter Can one apply Occam's razor as a general purpose debunking argument to TGD? of "Physics in Many-sheeted Space-time".

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

CMB cold spot as a problem of the inflationary cosmology

The existence of a large cold spot in the CMB is a serious problem for inflationary cosmology. Its explanation as an apparent cold spot, due to the Sachs-Wolfe effect caused by the gravitational redshift of arriving CMB photons in so-called supervoids along the line of sight, has been subjected to severe criticism. The TGD based explanation as a region with genuinely lower temperature and average density relies on the view of primordial cosmology as a cosmic-string-dominated period, during which it is not possible to speak about space-time in the sense of general relativity, and on an analog of the inflationary period mediating the transition to the radiation dominated cosmology, in which space-time in the sense of general relativity exists. Fluctuations in the time at which this transition period ended would induce genuine fluctuations in CMB temperature and density. This picture would also explain the existence of supervoids.

See the article CMB cold spot as a problem of the inflationary cosmology or the chapter TGD inspired cosmology of "Physics in Many-sheeted Space-time".

Friday, April 20, 2018

Have you ever felt like a unified theorist living in the wrong century?

What is it like to be a unified theorist living in the wrong century? You have a wonderful idea, an (or even the) idea of the century, and you have devoted four decades of your life to developing it. Your life has been really difficult, but it has been a fascinating adventure. Even more: wonderful new ideas are usually wrong or not even wrong, so it is remarkable that this one was a success! You have developed a brand new world view, which you can apply routinely, and there is a continual flow of observations from the internet to which you can apply it. You live in a theoretician's paradise.

There is however a little problem. You realize that you and your colleagues live in different centuries. There is no hope of communicating your marvellous idea to them. They are like primitive natives who hate you as a taboo breaker. In past centuries a taboo breaker was turned into a zombie or simply torn to pieces in a bloody orgy. My luck is that the latter treatment is no longer used, but the experience of becoming an academic zombie is not pleasant either.

So, you want to convey your message, but it is obvious that the colleagues of your generation and even of later generations are unable to absorb it. They are not idiots or barbarians as individuals, but as members of a collective they behave as such. They do not hate you bitterly as individuals but as members of a community. But why? Why is your message so irritating? In the following I explain the historical background needed to understand why they feel so negative about TGD. I also discuss the problem of communication over the barrier between different centuries.

More than four decades of stagnation in theoretical physics

Theoretical particle physics and related fields such as theoretical cosmology have been in a state of deep stagnation for more than four decades. This period is an Odyssey, which started when GUTs were invented. Probably the inventors of this amazingly unimaginative generalization of the standard model never believed that it would become a dogma within a few years. In any case, the astonishing proposal was that there would be no new physics between the electroweak scale and the GUT scale, something like 1000 times the Planck length: 13 orders of magnitude! This sounds irrational and extremely implausible. GUTs led immediately to severe problems like proton instability, but for some reason these problems were not taken seriously. Perhaps only the notion of Zeitgeist - also a notion related to collective consciousness - can explain this.

After GUTs came supersymmetric theories built using the same format, then supergravity emerged, and eventually superstring theories leading to M-theory. At the last stages the contact with empirical reality (or low energy phenomenology, as it was called) was lost completely.

Now the situation is dead calm. Most colleagues probably regard M-theory as dead, and some of them even have the courage to say it aloud. It is admitted that some big principle is missing from superstring theories. It is however impossible to even think about returning to the roots and admitting that string world sheets are 2-dimensional rather than 4-dimensional as space-time is. A jump of 4 decades into the past is much more difficult than the jump from the electroweak scale to the Planck length scale. It would be extremely painful to admit that the victorious parade was the self-deception of the century.

What is so irritating about TGD

Natural sciences today rely on two basic dogmas, and the blind acceptance of these dogmas has led theoretical physics to a state of stagnation, which began with GUTs.

  1. The first dogma is reductionism as length scale reductionism, representing natural sciences as a march towards shorter and shorter length scales. With the invention of M-theory we would have taken the last glorious giant step from the electroweak scale to the Planck scale. The formulas of particle theorists would dictate all that can happen in the Universe, everything.

    This creates in particle physicists a feeling of being Overlords of the Universe and explains their legendary arrogance, noticed by Penrose. Arrogance does not help in communication, or at least it makes it uni-directional.

    The story did not however go as expected. Already the first step, GUTs, led onto a wrong track. Superstring models and M-theory were doomed to lead even further astray. The misery is characterized using words like multiverse, landscape, and swampland. The connections with empirical reality have been lost completely.

  2. The second dogma is that consciousness is only an epiphenomenon, an illusion as Daniel Dennett puts it. This simplifies the challenge of reductionism dramatically, since there is no need to ask questions like "What is life?", "What happens after death?", "What is consciousness?" and all the related questions.

    This dogma makes the world very simple. You just identify the correct action principle, invent some tricks to do the path integral, and that is it.

    On the other hand, belief in this dogma implies that the theoretician loses gigantic amounts of information that could help in developing theories. Particle theoreticians become totally dependent on what comes out of the LHC. Even worse, if some anomaly comes out and it is not an allowed kind of anomaly, it is forgotten. Quantum gravity theorists suffer from the same problem: the Planck scale dogma states that the needed data bits can be obtained only by making experiments using galaxy-sized accelerators. Biologists get stuck in the belief that everything is just chemistry and electromagnetism. The list continues. The extreme narrowing of the cone of attention to one's personal specialization explains much of the power of science but has also led to the recent situation.

TGD challenges these beliefs and suggests a radically new world view: a new view about space-time, a new view about time, and a view of life and consciousness as something fascinating, pointing the way to a generalization of quantum theory. Even the pet peeves of serious scientists, such as homeopathy, water memory, and cold fusion, have a place in this new world view. TGD is certainly not the way to win friends and influence people in particle physics circles.
  1. TGD challenges general relativity (GRT) by talking aloud about its basic problem - difficulties with the classical conservation laws (energy, momentum, and angular momentum), which relate to the symmetries of the empty Minkowski space of special relativity and are lost when space-time becomes curved by the presence of matter.

    Even more irritating, TGD predicts an elegant solution: identify the allowed space-times as 4-D surfaces in a certain 8-dimensional space M4× CP2. The additional bonus is a geometrization of standard model symmetries and classical fields.

    One can of course develop a large number of objections, and I have done this. The eventual outcome however is that the idea works, and one can understand GRT as a long length scale limit in which the extremely complex topology of what I call many-sheeted space-time is replaced with the trivial topology of GRT space-time. Much is however lost at this limit, in particular the understanding of living matter and also of gravitational physics in galactic length scales.

  2. Many-sheeted space-time forces one to replace length scale reductionism with fractality. The implications are far-reaching: strong predictions emerge in all scales, and the entire spectrum of length scales provides data allowing one to test TGD. The situation is totally different from that in reductionistic theories, and one begins to become aware of the failures of reductionism: no real bridge between atomic physics and chemistry, between chemistry and biochemistry, between QCD type description and hadron physics, or between hadron physics and nuclear physics. One also becomes aware of the numerous anomalies swept under the rug during the last century of reductionistic materialism.

    This information feed, plus a theory with explanatory power, of course makes productivity possible. The internet provides a continual flow of information about anomalies and strange findings in various branches of physics, chemistry, biology, neuroscience,.... For more than a year the average rate of articles has been roughly one per week. Colleagues are of course thunderstruck. This is shameless: they are producing with pains the obligatory CV item once per year or even less often! This guy must be totally mad.

  3. TGD also forces a generalization of quantum theory. The hierarchy of Planck constants heff/h=n is one aspect of this generalization and follows from the number theoretic generalization of physics motivated by the need to identify physical correlates of cognition. This is achieved by introducing p-adic number fields labelled by primes and by fusing them to what I call adelic physics. In adelic physics n corresponds to the dimension of the extension of rationals characterizing the adele. Evolution reduces to the hierarchy of extensions of rationals. As a matter of fact, I did not end up with this generalization from the theory of consciousness but from particle physics, by realizing that p-adic physics allows one to replace the phenomenological description of particle massivation using the Higgs mechanism with what I call p-adic thermodynamics.

    The second aspect of the generalization is zero energy ontology (ZEO). This modifies the standard ontology, in which a physical state is a time=constant snapshot of time evolution. In ZEO physical states are zero energy states, which can be seen as physical events: pairs of initial and final states connected by a deterministic time evolution, at the space-time level a preferred extremal. This generalization solves the basic problem of quantum measurement theory and leads to a distinction between experienced time and geometric time. They are closely correlated but not the same. This also gives rise to a theory of consciousness: the observer, an outsider in standard quantum theory, becomes a self, a fundamental notion of the new quantum physics, which can be characterized as a generalized Zeno effect or as an analog of a sequence of weak measurements. Even the death of self has a precise characterization in ZEO. Needless to say, this kind of claim takes colleagues to the brink of madness.

    What is remarkable is that this notion of quantum state has a close resemblance to the notion of function in biology, the notion of behavior in neuroscience, and the notion of program in computer science. Free will would select between deterministic programs rather than break the laws of physics by interfering with deterministic time evolution. This gives hopes that the fields now in a strong phase of development might find TGD.

    This picture opens the floodgates for applications to quantum biology, neuroscience, and consciousness. A totally new era in these fields of science could begin, but there is a little problem - colleagues. They simply refuse to listen. This is like a domestic dispute at its worst: the other party simply puts fingers in ears, makes faces, and only repeats "I do not listen!".

In this kind of situation the question is how to communicate. Is it possible to communicate at all?

Should I write "real" books?

Last night I pondered again the problem of how to communicate TGD. The reason was that I was once again asked to write a book about the material at my homepage - something like 17 long books about TGD proper, its applications to various branches of hard science like physics and even chemistry, about TGD inspired theory of consciousness and the emerging new world view, and about TGD inspired quantum biology and neuroscience. There are also numerous articles, which can be found in ResearchGate and have been published in journals edited by Huping Hu.

This kind of request is flattering, but I am 67 years old, and at this age one cannot expect to have too many active years anymore. I want to use these years to do just what I see as most relevant for TGD: to develop it further and articulate it as precisely as I ever can. Communication of TGD must be left to others or be achieved with the help of others.

The reason is that writing a book does not mean just collecting the material. There are colleagues doing their best to prevent the publication: the most imaginative attack last time was an accusation of self-plagiarism. The core idea of the trick was that I have written about TGD at my homepage, and the material at the TGD homepage is clearly about TGD, about which I am writing a book. Therefore I have perpetrated the crime of self-plagiarism.

Also, the endless text editing needed to meet the requirements of the publisher is far too time-consuming and frustrating, taking into account how primitive the available tools are. In the good old times publishers did this kind of thing themselves (in fact Springer still does, as I learned while publishing a text about adelic physics).

For these reasons I have said 'No', but I feel uneasy about my unwillingness to co-operate. Communication might also be quite premature, taking into account the situation in theoretical physics, where the people possibly understanding something about the mathematics and physics of TGD are to be found.

Are there other means of communication?

Are there any other means to communicate TGD than using a lot of precious time for editing and fighting with hostile colleagues? Particle theoreticians would have excellent prerequisites to learn TGD, but it has become clear that the particle physics community is not mature enough to receive the message, for the simple reason that they would be forced to admit that they have been on the wrong track for more than four decades. This is simply too embarrassing for them.

There are of course also other branches of physics. In condensed matter physics the notion of a tensor net, which finds in TGD a topological realization in terms of many-sheeted space-time, is fashionable, and TGD provides simple models for various strange experimental discoveries: high temperature superconductivity is one example.

In experimental biology and neuroscience an intensive period of progress is taking place, and since theoretical biology and neuroscience are virtually non-existent, I dare guess that people are ready to consider also ideas which do not fit the standard reductionistic biology-as-nothing-but-biochemistry framework. Bio- and neuro-technology are making rapid progress, and here the control of the academic belief system is even weaker. AI is a second field where there is readiness for a revolution, even one challenging the materialistic dogma of AI. The problem is that these people talk a very different language and are pragmatic people, not too interested in the philosophical delicacies crucial for TGD inspired quantum biology and consciousness. This communication problem is basically created by the reductionistic view applied in science education: people become specialists in very narrow fields.

It seems that the only option imaginable to me is to find a person understanding theoretical physics, willing to learn TGD, and ready to prepare books from the material at my homepage (there are something like 17 long books about TGD and related applications, plus articles). Using the tools provided by AI it should be easy to collect shorter books about specific topics from this material.

This would however require a publisher and permission to publish in prestigious forums. I am afraid that many colleagues would do their best to prevent the publishing. The reason is not that TGD is something new but that it works alarmingly well and could demonstrate that the particle physics community has been on the wrong track for these cursed four-plus decades. This kind of humiliation a particle physicist refuses to swallow. I am afraid that it will take decades before it becomes possible to admit the defeat. Also funding is needed to hire a person to perform the editing. This is impossible, since the official academic world sees me as an enemy.

Therefore it seems that I must continue as hitherto - doing my best to polish and develop TGD. I dare trust that when I am not here anymore, there will be intelligent people realizing the importance of my lifework, and they will act accordingly. Posthumous communication might even be easier, since there is not much point in misusing formal academic power and refusing to receive the message if this no longer insults its sender. This might sound melodramatic, but I want to emphasize that I do not feel myself a martyr: this option is probably the best one, since it gives the needed maximal freedom, there being no fear about loss of face. And the release of adrenaline induced by the behavior of colleagues is an excellent intellectual stimulant: a peeved brain thinks very clearly!

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Wednesday, April 18, 2018

New insights about quantum criticality for twistor lift inspired by analogy with ordinary criticality

Quantum criticality (QC) is one of the basic ideas of TGD. Zero energy ontology (ZEO) is a second key notion and leads to a theory of consciousness as a formulation of quantum measurement theory solving the basic paradox of standard quantum measurement theory, which one usually tries to avoid by introducing some "interpretation".

ZEO allows one to see quantum theory as a "square root" of thermodynamics. It occurred to me that it would be interesting to apply this vision to quantum criticality, to perhaps gain additional insights into its meaning. We have a picture of criticality in the framework of thermodynamics: what would be the analog in the ZEO based interpretation of quantum TGD? Could it help to understand more clearly the somewhat poorly understood notion of self, which as the quantum physical counterpart of the observer becomes in ZEO a key concept of fundamental physics?

The basic ingredients involved are discrete coupling constant evolution; zero energy ontology (ZEO), implying that quantum theory is analogous to a "square root" of thermodynamics; self as a generalized Zeno effect, the counterpart of an observer made part of the quantum physical system; M8-M4× CP2 duality; and quantum criticality. A further idea is that the vacuum functional is analogous to a thermodynamical partition function as an exponent of the energy E= TS-PV.

The correspondence rules are simple. The mixture of phases with different 3-volumes per particle in the critical region of a thermodynamical system is replaced with a superposition of space-time surfaces of different 4-volumes, assignable to causal diamonds (CDs) with different sizes. Energy E is replaced with the action S for preferred extremals defining the Kähler function in the "world of classical worlds" (WCW). S is the sum of the Kähler action and a 4-volume term, and these terms correspond to the entropy and volume terms in the generalization E= TS-PV → S. P resp. T corresponds to the inverse of the Kähler coupling strength αK resp. the cosmological constant Λ. Both have a discrete spectrum of values determined by the number theoretically determined discrete coupling constant evolution. Number theoretical constraints force the analog of a micro-canonical ensemble, so that S, as the analog of E, is constant for all 4-surfaces appearing in the quantum superposition. This implies quantization rules for the Kähler action and volume, which are very strong since αK is complex.
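The correspondence can be written out schematically (a sketch of the identifications stated above; S_K denotes the Kähler action contribution, V_4 the 4-volume term, and the normalizations are only indicative):

```latex
% Thermodynamical criticality vs. quantum criticality in ZEO (schematic)
\begin{align*}
  E &= TS - PV
    && \text{(thermodynamics)}\\
  S_{\mathrm{tot}} &= \frac{1}{\alpha_K}\, S_K + \Lambda\, V_4
    && \text{(TGD: total action of a preferred extremal)}
\end{align*}
% The text pairs (P, T) with (1/alpha_K, Lambda). The analog of the
% micro-canonical ensemble then demands that S_tot takes the same value
% for every 4-surface in the quantum superposition.
```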

This kind of quantum critical zero energy state is created in a unitary evolution occurring in a single step of the process defining self as a generalized Zeno effect. This unitary process, implying delocalization in time, is followed by a weak measurement reducing the state to a fixed CD, so that the clock time identified as the distance between its tips is well-defined. The condition that the action is the same for all space-time surfaces in the superposition poses strong quantization conditions between the value of the Kähler action (the Kähler coupling strength is complex) and the volume term proportional to the cosmological constant. The outcome is that after a sufficiently large number of steps no space-time surfaces satisfying the conditions can be found, and the first reduction to the opposite boundary of the CD must occur - self dies. This is the classical counterpart of the fact that eventually all state function reductions leaving the members of state pairs at the passive boundary of the CD invariant have been made, and the first reduction to the opposite boundary remains the only option.

The generation of magnetic flux tubes provides a way to satisfy the constancy conditions for the action, so that the existing phenomenology, as well as the TGD counterpart of cyclic cosmology as re-incarnations of the cosmic self, follows as a prediction.

This picture generalizes to the twistor lift of TGD, and cosmology provides an interesting application. One ends up with a precise model for the p-adic coupling constant evolution of the cosmological constant Λ, explaining the positive sign and the smallness of Λ in long length scales as a cancellation effect between the M4 and CP2 parts of the Kähler action for the sphere of the twistor bundle in dimensional reduction; with a prediction for the radius of the sphere of the M4 twistor bundle as the Compton length associated with the Planck mass (2π times the Planck length); and with a prediction for the p-adic coupling constant evolution of Λ and of the coupling strength of the M4 part of the Kähler action, giving also insights into CP breaking and matter-antimatter asymmetry. The two observed values of Λ could correspond to two different p-adic length scales differing by a factor of √2.
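The factor √2 follows directly from the p-adic length scale hypothesis used throughout TGD, where the length scale behaves as L(k) ∝ 2^(k/2) for p ≈ 2^k. A minimal numerical check (the specific values of k below are only illustrative, not taken from the text):

```python
# p-adic length scale hypothesis: L(k) proportional to 2^(k/2) for p ~ 2^k.
# Consecutive integers k thus give length scales differing by sqrt(2),
# matching the claimed factor between the two values of Lambda's scale.

def padic_length_ratio(k1: int, k2: int) -> float:
    """Ratio L(k2)/L(k1) of p-adic length scales for p ~ 2^k."""
    return 2.0 ** ((k2 - k1) / 2.0)

# Illustrative: any two consecutive k values differ by sqrt(2) ~ 1.4142.
print(padic_length_ratio(107, 108))  # -> 1.4142135623730951
```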

See the article The Recent View about Twistorialization in TGD Framework of "Towards M-matrix" or the shorter article New insights about quantum criticality for twistor lift inspired by analogy with ordinary criticality.

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.

Monday, April 16, 2018

Did you think that star formation is understood?

In Cosmos Magazine there is an interesting article about the work of a team of astronomers led by Fatemeh Tabatabaei, published in Nature Astronomy.

The problem is the following. In the usual scenario for star formation the stars would have formed almost instantaneously, and star formation would not continue significantly anymore. Stars with the age of our Sun however exist, and star formation is still taking place: more than half of all galaxies are forming stars. So-called starburst galaxies do this very actively. The standard story is that since stars explode as supernovae, the debris from supernovae condenses into stars of later generations. Something like this certainly occurs, but it does not seem to be the whole story.

Remark: It seems incredible that astrophysics would still have unsolved problems at this level. Over the years I have learned that the standard reductionistic paradigm is full of holes.

The notion of star formation quenching has been introduced: it would slow down the formation of stars. It is known that quenched galaxies mostly have a super-massive blackhole at their center and that quenching starts at the centers of galaxies. Quenching would preserve star-forming material for future generations of stars.

To study this process the team led by Tabatabaei turned their attention to NGC 1097, located at a distance of 45 million light years. It is still forming stars in its central regions but shows signs of quenching, and it has a super-massive blackhole at its center. What was found is that large magnetic fields, probably enhanced by the central blackhole, affect the gas clouds that would normally collapse into stars, thereby inhibiting their collapse. These forces can even break big clouds into smaller ones, she says, ultimately leading to the formation of smaller stars.

This is highly interesting from the TGD point of view. I have already considered a TGD based model for star formation (see this). In the simplest TGD based model galaxies are formed as knots of long cosmic strings. Stars in turn would be formed as sub-knots of these galactic knots. There is also an alternative vision, in which the long string contains galaxies as closed flux tubes, like pearls in a necklace. These closed flux tubes could emerge from the long string by reconnection and form elliptic galaxies. The signature would be the non-flatness of the velocity spectrum of distant stars. Also in the case of stars a similar reconnection process, splitting off the star as a sub-knot of the galactic string, can be imagined.

If stars are sub-knots of the galactic knots representing galaxies, the formation of a star would correspond to the formation of a knot. This would involve a reconnection process in which some portions of the knot go "through each other". This is the manner in which knots are reduced to the trivial knot in the knot cobordisms used to construct knot invariants in knot theory (see this). Now it would work in the opposite direction: to build knots.

This process is rather violent and would initiate star formation, with dark matter from the cosmic string forming the star. This process would continue forever and would allow one to avoid the instantaneous transformation of matter into stars occurring in the standard model. At a deeper level star formation would be induced by a process taking place at the level of dark matter for magnetic flux tubes: a similar vision applies in TGD inspired biology. One could perhaps see these knots as seeds of a phase-transition-like process leading to the formation of a star. This reconnection process could take place also in the formation of spiral galaxies. In the Milky Way there are indeed indications of a reconnection process, which could be related to the formation of the Milky Way as a knot.

The role of the strong magnetic fields, supposed to be amplified by the galactic blackhole, is believed to be essential in quenching. They would be associated with dark flux tubes, possibly as return fluxes at ordinary space-time sheets carrying visible matter (flux lines must be closed). These magnetic fields would somehow prevent the collapse of gas clouds into stars. They could also induce a splitting of a gas cloud into smaller clouds. The ratio of mass to magnetic flux for the clouds was studied, and the clouds were found to be magnetically critical, or stable against collapse to the core regions needed for star formation. The star formation efficiency of the clouds drops with increasing magnetic field strength.

Star formation would begin once the magnetic field strength falls below a critical value. If reconnection plays a role in the process, this would suggest that reconnection is probable for magnetic field strengths below the critical value. Since the thickness of the magnetic flux tube, associated with its M4 projection, increases when the magnetic field strength decreases, one can argue that the reconnection probability increases, so that star formation becomes more probable. The development of the galactic blackhole would amplify the magnetic fields. During the cosmic evolution the flux tubes would thicken, so that also the field strength would be reduced, and eventually star formation would begin if the needed gas clouds are present. In distant regions the thickness of flux tube loops can be argued to be larger, since the p-adic length scale in question is longer: the magnetic field strength is expected to scale like the inverse of the p-adic length scale squared (a larger value of heff/h=n would also imply this). This would explain star formation in distant regions, which is just what the observations tell us.
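The scaling argument can be made concrete. If B goes like the inverse of the p-adic length scale squared, with L(k) ∝ 2^(k/2), then B(k) ∝ 2^(-k): this is just conservation of the monopole flux B × area as the flux tube thickens. A quick sketch under these assumptions (the k values are illustrative, not from the text):

```python
from fractions import Fraction

# Assumption from the text: B ~ 1/L_p^2 with L_p(k) ~ 2^(k/2), so B ~ 2^-k.
# Exact rationals keep the large powers of two free of rounding error.

def B_of_k(k: int) -> Fraction:
    """Field strength at p-adic scale k, in units where B(0) = 1."""
    return Fraction(1, 2 ** k)

def area(k: int) -> Fraction:
    """Flux tube cross-sectional area ~ L_p^2 ~ 2^k in the same units."""
    return Fraction(2 ** k)

# Thickening from scale k to k+2 doubles the radius (L ~ 2^(k/2)),
# quadruples the area, and cuts B by a factor of 4, flux staying fixed.
for k in (151, 157):                  # illustrative p-adic scales
    assert B_of_k(k) * area(k) == 1   # monopole flux conserved
print(B_of_k(151) / B_of_k(153))      # -> 4
```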

A natural model for the galactic blackhole is as a highly wound portion of cosmic string. The blackhole Schwarzschild radius would be R=2GM, and the mass, due mainly to the dark energy of the string (there would also be a dark matter contribution), would be M ≈ TL, where T is the string tension, roughly T ≈ 2^(-11) in units G=c=1. This would give the estimate L ≈ 2^(10) R.
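The estimate L ≈ 2^(10) R follows directly by combining M ≈ TL with R = 2GM; a minimal numerical check in units G = c = 1, taking T ≈ 2^(-11) as quoted above:

```python
# Units G = c = 1: the Schwarzschild radius is R = 2M and the string
# tension T is dimensionless. T = 2^-11 is the value quoted in the text.
T = 2.0 ** -11      # cosmic string tension
M = 1.0             # blackhole mass in arbitrary units
L = M / T           # length of string needed: M = T * L  =>  L = M / T
R = 2.0 * M         # Schwarzschild radius R = 2GM with G = 1

print(L / R)        # 1/(2T) = 2^10 = 1024.0
```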

See the article Five new strange effects associated with galaxies or the chapter TGD and astrophysics of "Physics in Many-sheeted Space-time".

For a summary of earlier postings see Latest progress in TGD.

Articles and other material related to TGD.


Sunday, April 15, 2018

How molecules in cells "find" one another and organize into structures?

The title of the popular article How molecules in cells 'find' one another and organize into structures expresses an old problem of biology. Now the group led by Amy S. Gladfelter has made experimental progress on this problem. The work has been published in Science (see this).

It is reported that RNA molecules recognize each other so as to condense into the same droplet due to the specific 3D shapes that the molecules assume. Molecules with complementary base pairing can find each other, and only similar RNAs condense into the same droplet. This brings to mind DNA replication, transcription and translation. Furthermore, the same proteins that form liquid droplets in healthy cells solidify in diseases like neurodegenerative disorders.

Some kind of phase transition is involved in the process, but what brings the molecules together remains a mystery. The TGD based solution of this mystery is one of the first applications of the notion of many-sheeted space-time in biology, and relies on the notion of magnetic flux tubes connecting molecules to form networks.

Consider first the TGD based model of condensed and living matter. As a matter of fact, the core of this model applies in all scales. What is new is that there are not only particles but also bonds connecting them. In TGD the bonds are flux tubes, which can carry dark particles with a nonstandard value heff/h=n of Planck constant. In the currently fashionable ER-EPR approach they would be wormholes connecting distant space-time regions. In that case the problem is instability: wormholes pinch and split. In TGD the monopole magnetic flux takes care of the stability topologically.

The flux tube networks occur in all scales but especially important are biological length scales.

  1. In chemistry the flux tubes are associated with valence bonds and hydrogen bonds (see this). In biology the genetic code would be realized as dark nuclei formed by sequences of dark protons at magnetic flux tubes. Also RNA, amino-acids, and even tRNA could have dark counterparts of this kind (see this). Dark variants of biomolecules would serve as templates for their ordinary variants, also at the level of dynamics. Biochemistry would be a shadow dynamics dictated to a high degree by the dark matter at flux tubes.

  2. Dark valence bonds can be quite long, and the outcome is an entangled tensor net (see this). These neuronal nets serve as correlates for cognitive mental images in the brain (see this) and for emotional mental images in the body (see this). Dark photons propagating along flux tubes (more precisely, topological light rays parallel to them) would provide the fundamental communication mechanism (see this). Transmitters and nerve pulses would only change the connectedness properties of these nets.

The topological dynamics of flux tubes has two basic mechanisms (I have discussed this dynamics from the point of view of AI here).
  1. Reconnection of flux tubes is the first basic mechanism in the dynamics of flux tube networks and would, among other things, give rise to neural nets. The connection between neurons would correspond basically to a flux tube pair, which can split by reconnection. Also two flux tube pairs can reconnect, forming Y-shaped structures. Flux tube pairs could be quite generally associated with long dark hydrogen bonds scaled up by heff/h=n from their ordinary lengths. Flux tube pairs would carry, besides dark protons, also supra phases formed by the lone electron pairs associated quite generally with hydrogen bonding atoms. Also dark ions could appear at the flux tubes.

    Biomolecules would have flux loops continually scanning the environment and reconnecting when they meet another flux loop. This however requires that the magnetic field strengths are the same at the two loops so that a resonance is achieved at the level of dark photon communications. This makes possible recognition by the cyclotron frequency spectrum serving as a signature of the magnetic body of the molecule.

    Water memory (see this) would rely on this recognition mechanism based on cyclotron frequencies, and also the immune system would use it at the basic level (here one cannot avoid saying something about homeopathy although I know that this spoils the day of the skeptic: the same mechanism would be involved also in it). For instance, the dark DNA strand accompanying ordinary DNA and dark RNA molecules would find each other by this mechanism (see this). The same applies to other reactions such as replication and translation.

  2. Shortening of flux tubes by a heff/h reducing phase transition is the second basic mechanism, explaining how biomolecules can find each other in a dense molecular soup. It is essential that the magnetic fields at the flux tubes are nearly the same for the reconnection to occur. A more refined model for the shortening involves two steps: reconnection of flux tubes leading to the formation of a flux tube pair between the molecules, followed by shortening via the heff/h reducing phase transition.
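The cyclotron frequency serving as the recognition signature in the mechanisms above is f_c = qB/(2πm); it depends only on the field strength and the charge-to-mass ratio, which is why equal field strengths at two flux loops give a resonance. A minimal numeric sketch for a proton in the endogenous field B_end = 0.2 Gauss often used in TGD texts (the field value is an assumption taken from that context):

```python
import math

E_CHARGE = 1.602176634e-19    # elementary charge, C
M_PROTON = 1.67262192369e-27  # proton mass, kg

def cyclotron_frequency(B, q=E_CHARGE, m=M_PROTON):
    # f_c = q B / (2 pi m)
    return q * B / (2 * math.pi * m)

B_END = 2e-5  # endogenous field B_end = 0.2 Gauss = 2*10^-5 T (assumption)
f = cyclotron_frequency(B_END)
print(f"proton cyclotron frequency in B_end: {f:.0f} Hz")  # ~ 305 Hz
```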

Also ordinary condensed matter phase transitions involve a change of the topology of flux tube networks, and the model for them allows one to put the findings described in the article in TGD perspective.
  1. I just wrote an article (see this) about a solution of two old problems of thermodynamics: the behavior of the liquid-gas system in the critical region, not consistent with the predictions of statistical mechanics (known already in the times of Maxwell!), and the behavior of water above the freezing point and in freezing. The proposed solution involves dark flux tubes carrying dark protons and possibly electronic Cooper pairs made from the so called lone electron pairs characterizing atoms that form hydrogen bonds.

  2. The phase transition from gas to liquid occurs when the number of flux tubes per molecule is high enough. At criticality both phases are in mechanical equilibrium - at the same pressure. Most interestingly, in solidification the large heff flux tubes transform to ordinary ones and liberate energy: this explains the anomalously high latent heats of water and ammonia. The loss of large heff flux tubes however reduces the "IQ" of the system.

The phase transitions changing the connectedness of the flux tube networks are fundamental in TGD inspired quantum biology.
  1. The sol-gel transition would correspond to this kind of biological phase transition. Protein folding (see this) - a kind of freezing of the protein making it biologically inactive - and unfolding would be a second basic example of this transition. The freezing would involve the formation of flux tube bonds between points of the linear protein, assignable to hydrogen bonds. External perturbations induce melting of the proteins, and they become biologically active as the value of heff/h=n characterizing their maximal possible entanglement negentropy content (molecular IQ) increases. The external perturbation feeds in energy acting as metabolic energy. I have called this period molecular summer.

  2. Solidification of proteins is reported to be associated with diseases such as neurodegenerative disorders. In the TGD picture this would reduce the molecular IQ, since the ability of the system to generate negentropy would be reduced when heff for the flux tubes decreases to its ordinary value. What brings the molecules together is not understood, and TGD provides the explanation as a heff reducing phase transition for flux tube pairs.

See the article How molecules in cells "find" one another and organize into structures? or the chapter Criticality and dark matter of "Hyperfinite factors, p-adic length scale hypothesis and dark matter hierarchy".


Saturday, April 14, 2018

Maxwell's lever rule and expansion of water in freezing: two poorly understood phenomena

The view about condensed matter as a network with nodes identifiable as molecules and bonds as flux tubes is one of the basic predictions of TGD and obviously means a radical modification of the existing picture. In the sequel two old anomalies of standard physics are explained in this conceptual framework. The first anomaly was known already at the time of Maxwell. In the critical region of the gas-liquid phase transition the van der Waals equation of state fails. Empirically the pressure in the critical region depends only on temperature and is independent of the molecular volume, whereas the van der Waals equation, exhibiting cusp catastrophe type behavior, predicts such a dependence. This problem is quite general and plagues all analytical models based on statistical mechanics.

Maxwell's area rule and lever rule constitute the proposed modification of van der Waals in the critical region. There are two phases, corresponding to liquid and gas, at the same pressure, and the proportions of the phases vary so that the volume varies.

The lever rule used for metal alloys explains the mixture but requires that there are two "elements" involved. What the second "element" is in the case of the liquid-gas system is poorly understood. TGD suggests the identification of the second "element" as the magnetic flux tubes connecting the molecules. Their number per molecule varies, and above a critical number a phase transition to the liquid phase would take place.
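For reference, the lever rule itself is elementary: at fixed temperature and pressure in the two-phase region the molar volume interpolates linearly between the liquid and gas values, and the phase fractions are read off from the "lever arms". A minimal sketch with illustrative volumes:

```python
def lever_rule(v, v_liq, v_gas):
    # v = x_liq * v_liq + x_gas * v_gas with x_liq + x_gas = 1.
    x_gas = (v - v_liq) / (v_gas - v_liq)
    return 1.0 - x_gas, x_gas

# Illustrative molar volumes (arbitrary units): the observed volume 4.0
# lies between the liquid (1.0) and gas (10.0) values.
x_liq, x_gas = lever_rule(4.0, v_liq=1.0, v_gas=10.0)
print(x_liq, x_gas)  # 2/3 liquid, 1/3 gas
```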

The second old problem relates to the numerous anomalies of water (see the web pages of Martin Chaplin). I have discussed these anomalies from the TGD viewpoint earlier (see this). The most well-known anomalies relate to the behavior near the freezing point. Below 4 degrees Celsius water expands rather than contracts as the temperature is lowered. An expansion takes place also in the freezing itself.

A general TGD based explanation for the anomalies of water would be the presence of dark phases with a non-standard value of Planck constant heff/h=n (see this). Combining this idea with the above proposal, the flux tubes associated with hydrogen bonds could also have a non-standard value of Planck constant, in which case the flux tube length scales like n. The reduction of n would shorten long flexible flux tubes to short and rigid ones. This reduces the motility of the molecules and also forces them nearer to each other. This would create empty volume and lead to an increase of the volume per molecule as the temperature is lowered.

Quite generally, the energy of particles with a non-standard value of Planck constant is higher than that of ordinary ones (see this). In freezing all dark flux tubes would transform to ordinary ones and the surplus energy would be liberated, so that the latent heat should be anomalously high for all molecules forming hydrogen bonds. Indeed, for both water and NH3, which have hydrogen bonds, the latent heat is anomalously high. Hydrogen bonding is possible if the molecules have atoms with lone electron pairs (electrons not assignable to valence bonds). The lone electron pairs could form Cooper pairs at the flux tube pairs assignable to hydrogen bonds and carrying the dark protons. Therefore also high Tc superconductivity could be possible.

See the article Maxwell's lever rule and expansion of water in freezing: two poorly understood phenomena or the chapter Quantum Criticality and Dark Matter of "Hyper-finite factors, p-adic length scale hypothesis, and dark matter hierarchy".


Saturday, April 07, 2018

Complex 8-momenta are necessary for the realization of massless many-particle states implying unitarity without loops

I have proposed a realization of unitarity in the twistor approach without loops and with a discrete coupling constant evolution dictated by number theory (see this). The proposal relies crucially on the identification of quantum numbers in the M8 picture as light-like quaternionic 8-momenta and on the assumption that also many-particle states are massless. The 8-momenta are complex already at the classical level, with the corresponding imaginary unit i commuting with the octonionic imaginary units Ik of M8.

The essential assumption was that the 8-momenta of also many-particle states are light-like. It is easy to see that this cannot make sense if the single particle states have light-like 8-momenta, unless these are parallel. For a moment I thought that complexification of the single particle 8-momenta might help, but it did not.

Next came the realization that the BCFW construction actually gives analogs of zero energy states having complex light-like momenta. The single particle momenta are however not light-like anymore. In TGD these states can be assigned with the interior regions of causal diamonds and have an interpretation as resonances/bound states with complex momenta. The following tries to articulate this more precisely.

  1. In the BCFW approach the expression of the residue integral as a sum over poles in the variable z, associated with the amplitude obtained by the deformation pi→ pi+zri of momenta (∑ ri=0, ri• rj=0), leads to a decomposition of the tree scattering amplitude into a sum of products of amplitudes in resonance channels with complex momenta at the poles. The products involve a 1/P^2 factor giving the pole and the analog of a cut in the unitarity condition. The proof of tree level unitarity uses the complexified momenta as a mere formal trick, and the complex momenta are an auxiliary notion. The complex massless poles are associated with groups I of particles, whereas the momenta of the particles inside I are complex and non-light-like.

  2. Could the BCFW deformation give a description of massless bound states of massless particles, so that the complexification of the momenta would describe the effect of bound state formation on the single particle states by making them non-light-like? This makes sense if one assumes that all 8-momenta - also the external ones - are complex. The classical charges are indeed complex already classically, since the Kähler coupling strength is complex (see this). A possible interpretation for the imaginary part is in terms of the decay width characterizing the life-time of the particle and defining the length of the four-vector.

  3. The basic question in the construction of scattering amplitudes is what happens inside CD for the external particles with light-like momenta. The BCFW deformation leading to factorization suggests an answer to the question. The factorized channel pair corresponds to two CDs inside which analogs of M and N-M particle bound states of external massless particles would be formed by the deformation pi→ pi+zri making particle momenta non-light-like. The allowed values of z would correspond to the physical poles. The factorization of BCFW scattering amplitude would correspond to a decomposition to products of bound state amplitudes for pairs of CDs. The analogs of bound states for zero energy states would be in question. BCFW factorization could be continued down to the lowest level below which no factorization is possible.

  4. One can of course worry about the non-uniqueness of the BCFW deformation. For instance, the light-like momenta ri must be parallel (ri=λi r), but the direction of r is free. Also the choice of the λi is free to a high extent. The BCFW expression for the amplitude as a residue integral over z is however unique. What could this non-uniqueness mean?

    Suppose one accepts the number theoretic vision that scattering amplitudes are representations for sequences of algebraic manipulations. These representations are bound to be highly non-unique since very many sequences can connect the same initial and final expressions. The space-time surface associated with a given representation of the scattering amplitude is not unique, since each computation corresponds to a different space-time surface. There however exists a representation with maximal simplicity.

    Could these two kinds of non-uniqueness relate?
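For reference, the BCFW factorization invoked in item 1 can be written in its standard textbook form (not specific to the TGD interpretation):

```latex
\hat{p}_i(z) = p_i + z\, r_i , \qquad \sum_i r_i = 0 , \qquad r_i \cdot r_j = 0 ,
\qquad
A_n = \sum_{I}\sum_{h} \hat{A}_L^{\,h}(z_I)\, \frac{1}{P_I^2}\, \hat{A}_R^{\,-h}(z_I) ,
```

where the sum runs over the factorization channels I (the groups of particles mentioned above) and the internal helicities h, and z_I is the value of z at which the deformed channel momentum becomes light-like, \hat{P}_I^2(z_I)=0.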

It is indeed easy to see that many-particle states formed from light-like single particle momenta cannot have light-like total momenta unless the single-particle momenta are parallel, so that in the non-parallel case one must give up the light-likeness condition even in the complex sense.
  1. The condition of light-likeness in the complex sense allows the vanishing of the real and imaginary mass squared for the individual particles

    Im(pi) = λi Re(pi) ,

    (Re(pi))^2 = (Im(pi))^2 = 0 .

    The real and imaginary parts are parallel and light-like in the 8-D sense.

  2. The remaining two conditions come from the vanishing of the real and imaginary parts of the total mass squared:

    ∑i≠ j [ Re(pi)• Re(pj) - Im(pi)• Im(pj) ] = 0 ,

    ∑i≠ j Re(pi)• Im(pj) = 0 .

    By using the proportionality of Im(pi) to Re(pi) one can express the conditions in terms of the real momenta:

    ∑i≠ j (1-λiλj) Re(pi)• Re(pj) = 0 ,

    ∑i≠ j λj Re(pi)• Re(pj) = 0 .

    For the positive/negative energy part of a zero energy state the sign of the time component of the momentum is fixed, and therefore the λi have a fixed sign. Since the inner products Re(pi)• Re(pj) of light-like vectors with a fixed sign of the time component are all positive or all negative, the second condition can vanish only if one has Re(pi)• Re(pj)=0. If the sign of λi can vary, one can satisfy the condition linear in λi but not the first condition, as is easy to see already in the 2-particle case.

  3. States with light-like parallel 8-momenta are however allowed, and one can ask whether this kind of state might be realized inside magnetic flux tubes identified as carriers of dark matter in the TGD sense. The parallel light-like momenta in the 8-D sense would give rise to a state analogous to superconductivity. Could this be true also for quarks inside hadrons, assumed to move in parallel in the QCD based model? This also brings to mind the earlier intuitive proposal that the momenta of the fermions and antifermions associated with partonic 2-surfaces must be parallel, so that the propagators for states containing altogether n fermions and antifermions would behave like 1/(p^2)^(n/2) and would not correspond to ordinary particles.
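The obstruction derived in items 1 and 2 can be checked numerically in the 2-particle case: with Im(pi) = λi Re(pi) and both Re(pi) light-like, the total momentum squared is proportional to Re(p1)• Re(p2) and so vanishes only for parallel momenta. A minimal sketch in 4-D with metric signature (+,-,-,-) and illustrative values of λi (the 8-D case is analogous):

```python
def minkowski(p, q):
    # Inner product with signature (+,-,-,-); bilinear (no conjugation),
    # matching the analytic continuation to complex momenta.
    return p[0] * q[0] - sum(a * b for a, b in zip(p[1:], q[1:]))

# Two non-parallel real light-like momenta with positive energy.
p1 = [1.0, 1.0, 0.0, 0.0]
p2 = [1.0, 0.0, 1.0, 0.0]
lam1, lam2 = 0.3, 0.7  # illustrative lambda_i of the same sign

# Complexified momenta p_i = (1 + i lambda_i) Re(p_i) stay light-like, since
# p_i^2 = (1 + i lambda_i)^2 Re(p_i)^2 = 0 for each particle separately.
P = [(1 + 1j * lam1) * a + (1 + 1j * lam2) * b for a, b in zip(p1, p2)]
P2 = minkowski(P, P)  # complex mass squared of the 2-particle state

print(minkowski(p1, p1), minkowski(p2, p2))  # 0.0 0.0
print(P2)  # 2 (1 + i lam1)(1 + i lam2) p1.p2, nonzero since p1.p2 = 1
```

The total P2 vanishes only when Re(p1)• Re(p2) = 0, i.e. when the real momenta are parallel, in accordance with the argument above.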

For background see the article The Recent View about Twistorialization in TGD Framework.
