Tuesday, April 26, 2005

The Holographic Mapping of the Standard Model onto the Black Hole Horizon

A new paper by Gerard 't Hooft came out yesterday:

Interactions between outgoing Hawking particles and ingoing matter are determined by gravitational forces and Standard Model interactions. In particular the gravitational interactions are responsible for the unitarity of the scattering against the horizon, as dictated by the holographic principle, but the Standard Model interactions also contribute, and understanding their effects is an important first step towards a complete understanding of the horizon’s dynamics. The relation between in- and outgoing states is described in terms of an operator algebra. In this paper, the first of a series, we describe the algebra induced on the horizon by U(1) vector fields and scalar fields, including the case of an Englert-Brout-Higgs mechanism, and a more careful consideration of the transverse vector field components.


But before I entertain this idea, I wanted to gain some perspective. I was immediately struck by something here that could change the way we have been doing things. In recognizing black hole evaporation and Standard Model particle production together, are we saying that these things already existed on the horizon?

Would M theory then have found its experimental counterpart? The Bose nova and jet idea from collapsing bubbles has been part of the vision I speculated about, in what Heisenberg saw in the geometrodynamics of a nuclear explosion. See, not only were we detonating a nuclear reaction (gravitational collapse), but we were doing something beyond perception by going to the heart of these particle collisions.

What makes it difficult for me is that the black hole dynamics, in relation to the bubble technologies I like to use as analogies, relate to and contain the elements of the Standard Model without ever entering the black hole. How is this possible while still seeing the three-brane collapse of the black hole here?

Dimensional Reduction in Quantum Gravity by Gerard 't Hooft


The requirement that physical phenomena associated with gravitational collapse should be duly reconciled with the postulates of quantum mechanics implies that at a Planckian scale our world is not 3+1 dimensional. Rather, the observable degrees of freedom can best be described as if they were Boolean variables defined on a two-dimensional lattice, evolving with time. This observation, deduced from not much more than unitarity, entropy and counting arguments, implies severe restrictions on possible models of quantum gravity. Using cellular automata as an example it is argued that this dimensional reduction implies more constraints than the freedom we have in constructing models. This is the main reason why so far no completely consistent mathematical models of quantum black holes have been found.

Essay dedicated to Abdus Salam.
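The counting argument in this abstract can be made concrete with a little arithmetic. Below is a minimal sketch in Python (my own illustration, not anything from the essay), assuming the usual Bekenstein-Hawking relation that a horizon stores about one bit per 4 ln 2 Planck areas, so the number of Boolean degrees of freedom scales with the two-dimensional horizon area rather than with any enclosed volume.

# Toy counting sketch (an illustration of the area scaling, not 't Hooft's model):
# assume one Boolean degree of freedom per 4*ln(2) Planck areas, i.e. the
# Bekenstein-Hawking entropy A / (4 l_p^2) expressed in bits.
import math

G    = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
c    = 2.998e8            # speed of light, m/s
hbar = 1.055e-34          # reduced Planck constant, J s
l_p  = math.sqrt(hbar * G / c**3)   # Planck length, about 1.6e-35 m

def horizon_bits(mass_kg):
    """Bits storable on the horizon of a Schwarzschild black hole of this mass."""
    r_s  = 2 * G * mass_kg / c**2        # Schwarzschild radius
    area = 4 * math.pi * r_s**2          # horizon area
    return area / (4 * l_p**2 * math.log(2))

# For a solar-mass black hole the count is set entirely by the 2D area --
# the "dimensional reduction" the essay describes -- roughly 1e77 bits.
print(f"{horizon_bits(1.989e30):.1e} bits")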


Gerard "t Hooft:No 'Quantum Computer' will ever be able to out perform a 'scaled up classical computer.'

Holding onto the sanity of why such a computerization program will run into difficulties has not undermined the position to include and create opportunities for seeing what is happening at such reductionistic levels. To have wondered: will we gain a dynamical visualization of what is happening within the context of the universe as it came into being?

With more computer power, scientists can also include more elements of the Earth's climate system, such as the oceans, the atmosphere, their chemistry and the carbon cycle.

This should make forecasts of future temperature rises more reliable. Keiko Takahashi, who works at the Earth Simulator Centre, says they have already carried out several experiments that look 50 years ahead.



There are difficulties with doing this, and like LIGO or SETI it is a work in progress: how shall this information allow us to see the interactions in a consistent model? So dealing with these difficulties has been part of Gerard 't Hooft's analysis, seeing that others too work hard to deal with the issues of the information paradox.

Part of this difficulty in computerized model application would have been the transfer rates of information from such quantum levels. Lubos gives some insight here. It has been very nice that such visualization techniques could be applied to this data transfer, given what we understand of particle reductionism. But within the context of the larger universe, how detailed have our observations of the world around us become?


These images contrast the degree of interaction and collective motion, or "flow," among quarks in the predicted gaseous quark-gluon plasma state (Figure A, see mpeg animation) vs. the liquid state that has been observed in gold-gold collisions at RHIC (Figure B, see mpeg animation). The green "force lines" and collective motion (visible on the animated version only) show the much higher degree of interaction and flow among the quarks in what is now being described as a nearly "perfect" liquid. (Courtesy of Brookhaven National Laboratory)


The goal of the Large Hadron Collider (LHC) computing grid is to link roughly 6,000 scientists so they can perform large-scale experiments and simulations to help the world better understand subatomic particles. The grid will ultimately link more than 200 research institutions.

"This service challenge is a key step on the way to managing the torrents of data anticipated from the LHC," Jamie Shiers, manager of the service challenges at CERN, said in a statement. "When the LHC starts operating in 2007, it will be the most data-intensive physics instrument on the planet, producing more than 1,500 megabytes of data every second for over a decade."


Gerard 't Hooft recognized this problem. When we see such scattering ideas, with Standard Model particles produced from black holes, how shall we see this event in terms of what is sent back for examination? Would it mean, in the context of Gerard's paper, that there is no information loss? No missing-energy events?

Thus a consistent model frame, built on the underlying framework of black hole production, would disavow any idea of an imbalance between energy in and energy out, held in the context of graviton production as part of Standard Model production? The horizon area would become a balanced view?

Using the ideas of Clementine and the graduation to Grace, it seemed that I was heading toward a good comprehensive view of the bubble technicalities as they contained the missing energy. But moving to "this view of Gerard's" might endanger how we approximate the whole view of this missing energy, by easily removing that missing-energy scenario? Would this be consistent with the overall encompassing view, that the graviton has emerged from the extension of this Standard Model, to say: it's okay, we can remove this and find comfort in the existing framework without other contentious issues like missing energy to deal with?

Do we have Proof of this Missing Energy? If the answer is yes, then the issue has not been resolved?
