Showing posts sorted by relevance for query: lorentz.

Tuesday, February 16, 2010

Article From New York Times and More




Brookhaven National Laboratory

A computer rendition of the 4-trillion-degree Celsius quark-gluon plasma created in a demonstration of what scientists suspect shaped cosmic history.

In Brookhaven Collider, Scientists Briefly Break a Law of Nature

The Brookhaven scientists and their colleagues discussed their latest results from RHIC in talks and a news conference at a meeting of the American Physical Society Monday in Washington, and in a pair of papers submitted to Physical Review Letters. “This is a view of what the world was like at 2 microseconds,” said Jack Sandweiss of Yale, a member of the Brookhaven team, calling it, “a seething cauldron.”

Among other things, the group announced it had succeeded in measuring the temperature of the quark-gluon plasma as 4 trillion degrees Celsius, “by far the hottest matter ever made,” Dr. Vigdor said. That is 250,000 times hotter than the center of the Sun and well above the temperature at which theorists calculate that protons and neutrons should melt, but the quark-gluon plasma does not act the way theorists had predicted.
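That "250,000 times hotter" figure is easy to sanity-check against a textbook solar-core temperature of about 1.57×10⁷ kelvin; a quick sketch in Python (the solar value is my assumption, not from the article):

    T_qgp = 4.0e12        # quark-gluon plasma temperature, kelvin (~4 trillion deg C)
    T_sun_core = 1.57e7   # assumed solar core temperature, kelvin (textbook value)
    print(T_qgp / T_sun_core)   # ~2.5e5, consistent with the quoted 250,000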

Instead of behaving like a perfect gas, in which every quark goes its own way independent of the others, the plasma seemed to act like a liquid. “It was a very big surprise,” Dr. Vigdor said, when it was discovered in 2005. Since then, however, theorists have revisited their calculations and found that the quark soup can be either a liquid or a gas, depending on the temperature, he explained. “This is not your father’s quark-gluon plasma,” said Barbara V. Jacak, of the State University at Stony Brook, speaking for the team that made the new measurements.

It is now thought that the plasma would have to be a million times more energetic to become a perfect gas. That is beyond the reach of any conceivable laboratory experiment, but the experiments colliding lead nuclei in the Large Hadron Collider outside Geneva next winter should reach energies high enough to see some evolution from a liquid to a gas.
See more at above link.

***

Violating Parity with Quarks and Gluons
by Sean Carroll of Cosmic Variance
This new result from RHIC doesn’t change that state of affairs, but shows how quarks and gluons can violate parity spontaneously if they are in the right environment — namely, a hot plasma with a magnetic field.

So, okay, no new laws of physics. Just a much better understanding of how the existing ones work! Which is most of what science does, after all.

***

Quark–gluon plasma

From Wikipedia, the free encyclopedia

A QGP is formed at the collision point of two relativistically accelerated gold ions in the center of the STAR detector at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.


A quark-gluon plasma (QGP) or quark soup[1] is a phase of quantum chromodynamics (QCD) which exists at extremely high temperature and/or density. This phase consists of (almost) free quarks and gluons, which are the basic building blocks of matter. Experiments at CERN's Super Proton Synchrotron (SPS) first tried to create the QGP in the 1980s and 1990s: the results led CERN to announce indirect evidence for a "new state of matter"[2] in 2000. Current experiments at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC) are continuing this effort.[3] Three new experiments running on CERN's Large Hadron Collider (LHC), ALICE,[4] ATLAS and CMS, will continue studying properties of QGP.

Contents

  • 1 General introduction


    • 1.1 Why this is referred to as "plasma"
    • 1.2 How the QGP is studied theoretically
    • 1.3 How it is created in the lab
    • 1.4 How the QGP fits into the general scheme of physics
  • 2 Expected properties


    • 2.1 Thermodynamics
    • 2.2 Flow
    • 2.3 Excitation spectrum
  • 3 Experimental situation
  • 4 Formation of quark matter

General introduction

The quark-gluon plasma contains quarks and gluons, just as normal (baryonic) matter does. The difference between these two phases of QCD is that in normal matter each quark either pairs up with an anti-quark to form a meson or joins with two other quarks to form a baryon (such as the proton and the neutron). In the QGP, by contrast, these mesons and baryons lose their identities and dissolve into a fluid of quarks and gluons.[5] In normal matter quarks are confined; in the QGP quarks are deconfined.
Although the high temperatures and densities predicted to produce a quark-gluon plasma have been realized in the laboratory, the resulting matter does not behave as a quasi-ideal state of free quarks and gluons, but rather as an almost perfect dense fluid.[6] In fact, that the quark-gluon plasma would not yet be "free" at temperatures reached by present accelerators was predicted as early as 1984[7] as a consequence of the remnant effects of confinement.

Why this is referred to as "plasma"

A plasma is matter in which charges are screened due to the presence of other mobile charges; for example: Coulomb's Law is modified to yield a distance-dependent charge. In a QGP, the color charge of the quarks and gluons is screened. The QGP has other analogies with a normal plasma. There are also dissimilarities because the color charge is non-abelian, whereas the electric charge is abelian. Outside a finite volume of QGP the color electric field is not screened, so that volume of QGP must still be color-neutral. It will therefore, like a nucleus, have integer electric charge.
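For the electromagnetic analogue, the screening described here replaces the bare Coulomb potential with a Yukawa-type form V(r) = (q / 4πε₀r)·exp(−r/λ_D), where λ_D is the Debye screening length. A small illustrative sketch in Python (the charge and screening length below are arbitrary values, chosen only to exhibit the effect):

    import math

    def coulomb(q, r):
        eps0 = 8.854e-12                      # vacuum permittivity, F/m
        return q / (4 * math.pi * eps0 * r)   # bare Coulomb potential, volts

    def screened(q, r, debye_len):
        # Debye-screened (Yukawa) potential: the effective charge seen at
        # distance r dies off beyond the screening length
        return coulomb(q, r) * math.exp(-r / debye_len)

    q, lam = 1.6e-19, 1e-9   # test charge (C) and screening length (m); illustrative only
    for r in (0.5e-9, 1e-9, 5e-9):
        print(r, coulomb(q, r), screened(q, r, lam))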

How the QGP is studied theoretically

One consequence of this difference is that the color charge is too large for the perturbative computations that are the mainstay of QED. As a result, the main theoretical tool for exploring the QGP is lattice gauge theory. The transition temperature (approximately 175 MeV) was first predicted by lattice gauge theory, which has since been used to predict many other properties of this kind of matter. The AdS/CFT correspondence is a newer conjecture that offers further insight into the QGP.

How it is created in the lab

The QGP can be created by heating matter up to a temperature of 2×10¹² kelvin, which amounts to 175 MeV per particle. This can be accomplished by colliding two large nuclei at high energy (note that 175 MeV is not the energy of the colliding beam). Lead and gold nuclei have been used for such collisions at CERN SPS and BNL RHIC, respectively. The nuclei are accelerated to ultrarelativistic speeds and slammed into each other while Lorentz contracted. They largely pass through each other, but a resulting hot volume called a fireball is created after the collision. Once created, this fireball is expected to expand under its own pressure, and cool while expanding. By carefully studying this flow, experimentalists hope to put the theory to test.
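The two numbers in this paragraph are consistent with each other: dividing 175 MeV by the Boltzmann constant recovers the quoted temperature. A quick check:

    k_B = 8.617e-11   # Boltzmann constant, MeV per kelvin
    print(175 / k_B)  # ~2.0e12 K, matching the 2x10^12 kelvin figure above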

How the QGP fits into the general scheme of physics

QCD is one part of the modern theory of particle physics called the Standard Model. Other parts of this theory deal with electroweak interactions and neutrinos. The theory of electrodynamics has been tested and found correct to a few parts in a trillion. The theory of weak interactions has been tested and found correct to a few parts in a thousand. Perturbative aspects of QCD have been tested to a few percent. In contrast, non-perturbative aspects of QCD have barely been tested. The study of the QGP is part of this effort to consolidate the grand theory of particle physics.
The study of the QGP is also a testing ground for finite temperature field theory, a branch of theoretical physics which seeks to understand particle physics under conditions of high temperature. Such studies are important to understand the early evolution of our universe: the first hundred microseconds or so. While this may seem esoteric, it is crucial to the physics goals of a new generation of observations of the universe (WMAP and its successors). It is also of relevance to Grand Unified Theories, or 'GUTs', which seek to unify three of the four fundamental forces of nature.

Expected properties

Thermodynamics

The cross-over temperature from the normal hadronic to the QGP phase is about 175 MeV, corresponding to an energy density of a little less than 1 GeV/fm³. For relativistic matter, pressure and temperature are not independent variables, so the equation of state is a relation between the energy density and the pressure. This has been found through lattice computations, and compared to both perturbation theory and string theory. This is still a matter of active research. Response functions such as the specific heat and various quark number susceptibilities are currently being computed.

Flow

The equation of state is an important input into the flow equations. The speed of sound is currently under investigation in lattice computations. The mean free path of quarks and gluons has been computed using perturbation theory as well as string theory. Lattice computations have been slower here, although the first computations of transport coefficients have recently been concluded. These indicate that the mean free time of quarks and gluons in the QGP may be comparable to the average interparticle spacing: hence the QGP is a liquid as far as its flow properties go. This is very much an active field of research, and these conclusions may evolve rapidly. The incorporation of dissipative phenomena into hydrodynamics is another recent development that is still in an active stage.

Excitation spectrum

Does the QGP really contain (almost) free quarks and gluons? The study of thermodynamic and flow properties indicates that this is an over-simplification. Many ideas are currently being developed and will be put to the test in the near future. It has been hypothesized recently that some mesons built from heavy quarks (such as the charm quark) do not dissolve until the temperature reaches about 350 MeV. This has led to speculation that many other kinds of bound states may exist in the plasma. Some static properties of the plasma (similar to the Debye screening length) constrain the excitation spectrum.

Experimental situation

Those aspects of the QGP which are easiest to compute are not the ones which are easiest to probe in experiments. While the balance of evidence points towards the QGP being the origin of the detailed properties of the fireball produced at RHIC, this mismatch is the main barrier which prevents experimentalists from declaring a sighting of the QGP. For a summary see the 2005 RHIC Assessment.

Formation of quark matter

In April 2005, formation of quark matter was tentatively confirmed by results obtained at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC). The consensus of the four RHIC research groups was that they had created a quark-gluon liquid of very low viscosity. However, contrary to what was at that time still the widespread assumption, it is not yet known from theoretical predictions whether the QCD "plasma", especially close to the transition temperature, should behave like a gas or a liquid[8]. Authors favoring the weakly interacting interpretation derive their assumptions from lattice QCD calculations, where the entropy density of the quark-gluon plasma approaches the weakly interacting limit. However, since both the energy density and the correlations show significant deviation from the weakly interacting limit, it has been pointed out by many authors that there is in fact no reason to assume a QCD "plasma" close to the transition point should be weakly interacting, like an electromagnetic plasma (see, e.g., [9]).


Monday, September 17, 2007

The Gravity Landscape and Lagrange Points

"We all are of the citizens of the Sky" Camille Flammarion


In 1858, through the play of his connections, Camille Flammarion, then sixteen years old, was able to enter the Paris Observatory as a student astronomer under the orders of Urbain Le Verrier, in the bureau of calculations.


There is a deep seated need to look beyond ourselves. We tend to look up in space, while there is this greater vision that lies even beyond what we are so used to in our everyday lives.


(Larry Niven's Ringworld, seen from space. Artwork by Harry Frank.)
Ringworld is a Hugo and Nebula award-winning 1970 science fiction novel by Larry Niven, set in his Known Space universe. The work is widely considered one of the classics of science fiction literature. It is followed by three sequels, and it ties in to numerous other books in the Known Space universe.


Our view of space, and of living beyond the confines of Earth, is lived over in the minds of those who have struggled within science to make these travels possible.

Imagine that first look at the blue planet. How glorious this view, while here we mere mortals look on at what others now take for granted as they use the machines they created to visit new planets.

L4 and L5

The L4 and L5 points lie at 60 degrees ahead of and behind Earth in its orbit as seen from the Sun. Unlike the other Lagrange points, L4 and L5 are resistant to gravitational perturbations. Because of this stability, objects tend to accumulate in these points, such as dust and some asteroid-type objects.

A spacecraft at L1, L2, or L3 is ‘meta-stable’, like a ball sitting on top of a hill. A little push or bump and it starts moving away. A spacecraft at one of these points has to use frequent rocket firings or other means to remain in the same place. Orbits around these points are called 'halo orbits'.

But at L4 or L5, a spacecraft is truly stable, like a ball in a bowl: when gently pushed away, it orbits the Lagrange point without drifting farther and farther, and without the need of frequent rocket firings. The Sun's pull causes any object in the L4 and L5 locations to ‘orbit’ the Lagrange point in an 89-day cycle. These positions have been studied as possible sites for artificial space stations in the distant future.
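For the collinear points, the distance of L1 (and L2) from the smaller body can be estimated with the Hill-radius approximation r ≈ R·(m/3M)^(1/3). A sketch, assuming Sun-Earth values (the numbers are standard constants, not taken from the article above):

    R = 1.496e11        # Sun-Earth distance, m
    m_ratio = 3.003e-6  # Earth mass / Sun mass
    r_L1 = R * (m_ratio / 3) ** (1.0 / 3.0)   # distance of L1 (and L2) from Earth
    print(r_L1 / 1e9)   # ~1.5 (million km), the usual quoted figure

    # L4 and L5 sit 60 degrees ahead of / behind Earth at the full orbital
    # radius, so each forms an equilateral triangle with the Sun and Earth.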


I draw your attention to the phrase "ball sitting on top of a hill" in the previous article. One should get the idea right away that what was revealed in the possibilities of the landscape could have correspondences in how we look at the universe in its gravitational considerations.



Who would have known that such "orbital tendencies," which would have seemed so chaotic, could have let one see Lissajous orbits by design. You might think of Lorenz's butterfly flapping its wings, yet such probabilities are held "to a spot" that is a result of placement within the very nature of the landscape of the gravitational cosmos.

So would you have wondered: if we considered "E8" as a dimensional attribute of such probabilities, "by design in such a place," what would this place look like? Would you have selected the probability of this resulting ball falling into the respective valley as a moduli form?

Sunday, June 27, 2010

Virasoro algebra

Black hole thermodynamics

From Wikipedia, the free encyclopedia

In physics, black hole thermodynamics is the area of study that seeks to reconcile the laws of thermodynamics with the existence of black hole event horizons. Much as the study of the statistical mechanics of black body radiation led to the advent of the theory of quantum mechanics, the effort to understand the statistical mechanics of black holes has had a deep impact upon the understanding of quantum gravity, leading to the formulation of the holographic principle.

It is important that one is able to see the progression from abstraction to an interpretation of a foundational approach.

***



Andy Strominger:
This was a field theory that lived on a circle, which means it has one spatial dimension and one time dimension. We derived the fact that the quantum states of the black hole could be represented as the quantum states of this one-plus-one dimensional quantum field theory, and then we counted the states of this theory and found they exactly agreed with the Bekenstein-Hawking entropy. See: Quantum Microstates: Gas Molecules in the Presence of a Gravitational Field

See:Microscopic Origin of the Bekenstein-Hawking Entropy

Of course I am interested in the mathematical framework as it might be compared to some phenomenological approach that gives substance to any theoretical thought.

For example, Tommaso Dorigo is a representative of the type of people who may affect the general distribution of "subjects" that may grow at CERN or the Fermilab in the next decade or two. And he just published a quote by Sherlock Holmes - no kidding - whose main point is that it is a "capital mistake" to work on any theory before the data are observed. See: Quantum gravity: minority report

I think you were a little harsh on Tommaso Dorigo, Lubos, because he is really helping us to understand the scientific process at CERN. But you are right, in my mind, about theory coming before the phenomenological approach can be seen. The mind needs to play creatively with abstract notions before their correlations can be seen in reality.

***

Virasoro algebra

From Wikipedia, the free encyclopedia

In mathematics, the Virasoro algebra (named after the physicist Miguel Angel Virasoro) is a complex Lie algebra, given as a central extension of the complex polynomial vector fields on the circle, and is widely used in string theory.

Definition

The Virasoro algebra is spanned by elements $L_i$ for $i \in \mathbf{Z}$ and $c$, with $L_n + L_{-n}$ and $c$ being real elements. Here the central element $c$ is the central charge. The algebra satisfies
$[c, L_n] = 0$
and
$[L_m, L_n] = (m - n) L_{m+n} + \frac{c}{12}(m^3 - m)\,\delta_{m+n,0}.$
The factor of 1/12 is merely a matter of convention.
The Virasoro algebra is a central extension of the (complex) Witt algebra of complex polynomial vector fields on the circle. The Lie algebra of real polynomial vector fields on the circle is a dense subalgebra of the Lie algebra of diffeomorphisms of the circle.
The Virasoro algebra is obeyed by the stress tensor in string theory: since the stress tensor comprises the generators of the conformal group of the worldsheet, it obeys the commutation relations of (two copies of) the Virasoro algebra. This is because the conformal group decomposes into separate diffeomorphisms of the forward and backward lightcones. Diffeomorphism invariance of the worldsheet implies additionally that the stress tensor vanishes. This is known as the Virasoro constraint, which in the quantum theory cannot be applied to all the states in the theory, but rather only to the physical states (confer Gupta-Bleuler quantization).
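The commutation relation (without the central term) can be verified directly in the classical vector-field realization $L_n = -z^{n+1}\,d/dz$, where the central charge drops out; a small sympy sketch, offered as an illustration rather than anything from the original article:

    import sympy as sp

    z = sp.symbols('z')
    f = sp.Function('f')(z)

    def L(n, expr):
        # Witt generator L_n = -z^(n+1) d/dz acting on a test function
        return -z**(n + 1) * sp.diff(expr, z)

    # check [L_m, L_n] = (m - n) L_{m+n} on f for a few sample indices
    for m, n in [(2, -1), (3, 1), (0, 2)]:
        lhs = L(m, L(n, f)) - L(n, L(m, f))
        rhs = (m - n) * L(m + n, f)
        print(m, n, sp.simplify(lhs - rhs))   # prints 0 in every case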

Representation theory

A lowest weight representation of the Virasoro algebra is a representation generated by a vector $v$ that is killed by $L_i$ for $i \ge 1$, and is an eigenvector of $L_0$ and $c$. The letters $h$ and $c$ are usually used for the eigenvalues of $L_0$ and $c$ on $v$. (The same letter $c$ is used for both the element $c$ of the Virasoro algebra and its eigenvalue.) For every pair of complex numbers $h$ and $c$ there is a unique irreducible lowest weight representation with these eigenvalues.
A lowest weight representation is called unitary if it has a positive definite inner product such that the adjoint of $L_n$ is $L_{-n}$. The irreducible lowest weight representation with eigenvalues $h$ and $c$ is unitary if and only if either $c \ge 1$ and $h \ge 0$, or $c$ is one of the values
$c = 1 - \frac{6}{m(m+1)} = 0,\quad \frac{1}{2},\quad \frac{7}{10},\quad \frac{4}{5},\quad \frac{6}{7},\quad \frac{25}{28},\ \ldots$
for m = 2, 3, 4, .... and h is one of the values
$h = h_{r,s}(c) = \frac{((m+1)r - ms)^2 - 1}{4m(m+1)}$
for r = 1, 2, 3, ..., m−1 and s = 1, 2, 3, ..., r. Daniel Friedan, Zongan Qiu, and Stephen Shenker (1984) showed that these conditions are necessary, and Peter Goddard, Adrian Kent and David Olive (1986) used the coset construction or GKO construction (identifying unitary representations of the Virasoro algebra within tensor products of unitary representations of affine Kac-Moody algebras) to show that they are sufficient. The unitary irreducible lowest weight representations with c < 1 are called the discrete series representations of the Virasoro algebra. These are special cases of the representations with m = q/(p−q), 0 < r < q, 0 < s < p, for p and q coprime integers and r and s integers, called the minimal models and first studied in Belavin et al. (1984).
The first few discrete series representations are given by:
  • m = 2: c = 0, h = 0. The trivial representation.
  • m = 3: c = 1/2, h = 0, 1/16, 1/2. These 3 representations are related to the Ising model.
  • m = 4: c = 7/10. h = 0, 3/80, 1/10, 7/16, 3/5, 3/2. These 6 representations are related to the tricritical Ising model.
  • m = 5: c = 4/5. There are 10 representations, which are related to the 3-state Potts model.
  • m = 6: c = 6/7. There are 15 representations, which are related to the tricritical 3-state Potts model.
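These values follow directly from the two formulas above; a short check in Python with exact rational arithmetic, reproducing the m = 2, 3, 4 entries:

    from fractions import Fraction

    def c(m):
        return 1 - Fraction(6, m * (m + 1))

    def h(m, r, s):
        return Fraction(((m + 1) * r - m * s) ** 2 - 1, 4 * m * (m + 1))

    for m in (2, 3, 4):
        hs = sorted({h(m, r, s) for r in range(1, m) for s in range(1, r + 1)})
        print(m, c(m), hs)
    # m=3 gives c = 1/2 with h in {0, 1/16, 1/2}: the Ising entries above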
The lowest weight representations that are not irreducible can be read off from the Kac determinant formula, which states that the determinant of the invariant inner product on the degree h+N piece of the lowest weight module with eigenvalues c and h is given by
$A_N \prod_{1 \le r, s \le N} \left(h - h_{r,s}(c)\right)^{p(N - rs)}$
which was stated by V. Kac (1978) (see also Kac and Raina 1987) and whose first published proof was given by Feigin and Fuks (1984). (The function p(N) is the partition function, and $A_N$ is some constant.) The reducible lowest weight representations are the representations with h and c given in terms of m, r, and s by the formulas above, except that m is not restricted to be an integer ≥ 2 and may be any number other than 0 and 1, and r and s may be any positive integers. This result was used by Feigin and Fuks to find the characters of all irreducible lowest weight representations.

Generalizations

There are two supersymmetric N=1 extensions of the Virasoro algebra, called the Neveu-Schwarz algebra and the Ramond algebra. Their theory is similar to that of the Virasoro algebra.
The Virasoro algebra is a central extension of the Lie algebra of meromorphic vector fields on a genus 0 Riemann surface that are holomorphic except at two fixed points. I.V. Krichever and S.P. Novikov (1987) found a central extension of the Lie algebra of meromorphic vector fields on a higher genus compact Riemann surface that are holomorphic except at two fixed points, and M. Schlichenmaier (1993) extended this to the case of more than two points.

History

The Witt algebra (the Virasoro algebra without the central extension) was discovered by E. Cartan (1909). Its analogues over finite fields were studied by E. Witt in about the 1930s. The central extension of the Witt algebra that gives the Virasoro algebra was first found (in characteristic p > 0) by R. E. Block (1966, page 381) and independently rediscovered (in characteristic 0) by I. M. Gelfand and D. B. Fuks (1968). Virasoro (1970) wrote down some operators generating the Virasoro algebra while studying dual resonance models, though he did not find the central extension. The central extension giving the Virasoro algebra was rediscovered in physics shortly after by J. H. Weis, according to Brower and Thorn (1971, footnote on page 167).

***

Friday, January 22, 2010

Historical Figures Lead Us to the Topic of Entanglement

The Solvay Congress of 1927

We regard quantum mechanics as a complete theory for which the fundamental physical and mathematical hypotheses are no longer susceptible of modification.

--Heisenberg and Max Born, paper delivered to Solvay Congress of 1927

You know, I have watched the long drawn-out conversation on Backreaction about what was once already debated, now advanced to its current status in the world, represented as a logic-oriented process with regard to entanglement.

What is its current status, in terms of its experimental expression, so that we know what we are doing with something that was debated long ago?



Solvay Physics Conference 1927 (video)

The best-known people who participated in the conference were Erwin Schrödinger, Niels Bohr, Werner Heisenberg, Auguste Piccard, Paul Dirac, Max Born, Wolfgang Pauli, Louis de Broglie, Marie Curie, Hendrik Lorentz, Albert Einstein and others. The film opens with quick shots of Erwin Schrödinger and Niels Bohr. Auguste Piccard of the University of Brussels follows, and then the camera re-focuses on Schrödinger and Bohr. Schrödinger, who developed wave mechanics, never agreed with Bohr on quantum mechanics. Solvay gave Heisenberg an opportunity to discuss his new uncertainty principle theory. Max Born's statistical interpretation of the wave function ended determinism in the atomic world. These men - Bohr, Heisenberg, Kramers, Dirac and Born - together represent the founding fathers of quantum mechanics. Louis de Broglie wrote his dissertation on the wave nature of matter, which Schrödinger used as the basis for wave mechanics. Albert Einstein's famous response to Born's statistical interpretation of the wave function was "God does not play dice." Twenty-nine physicists, the main quantum theorists of the day, came together to discuss the topic "Electrons and Photons". Seventeen of the 29 attendees were or became Nobel Prize winners. Following is a "home movie" shot by Irving Langmuir (the 1932 Nobel Prize winner in chemistry). It captures 2 minutes of an intermission in the proceedings. Twenty-one of the 29 attendees are on the film.

***

The Einstein-Podolsky-Rosen Argument in Quantum Theory

First published Mon May 10, 2004; substantive revision Wed Aug 5, 2009

In the May 15, 1935 issue of Physical Review Albert Einstein co-authored a paper with his two postdoctoral research associates at the Institute for Advanced Study, Boris Podolsky and Nathan Rosen. The article was entitled “Can Quantum Mechanical Description of Physical Reality Be Considered Complete?” (Einstein et al. 1935). Generally referred to as “EPR”, this paper quickly became a centerpiece in the debate over the interpretation of the quantum theory, a debate that continues today. The paper features a striking case where two quantum systems interact in such a way as to link both their spatial coordinates in a certain direction and also their linear momenta (in the same direction). As a result of this “entanglement”, determining either position or momentum for one system would fix (respectively) the position or the momentum of the other. EPR use this case to argue that one cannot maintain both an intuitive condition of local action and the completeness of the quantum description by means of the wave function. This entry describes the argument of that 1935 paper, considers several different versions and reactions, and explores the ongoing significance of the issues they raise.

Might I confuse you, then, to see that there is nothing mystical about what our emotive states implore? We might also consider the purpose of Venn logic, or a correlation to fuzzy logic, to prepare the way for how we can become emotively entangled in our psychology: ways "biologically mixed with our multilevel perspective" about how photons interact. Such a color of debate could have amounted to a distinction that arises from within, which can manifest itself on a real world stage, psychologically forced out of the confines of human emotion, to be presented as a real world force, "bridled or unbridled," with regard to the human condition.

See :


  • Entanglement Interpretation of Black Hole Entropy 


  • See Also:Backreaction: Testing the foundations of quantum mechanics

    Thursday, May 29, 2008

    The Plane of Simultaneity

    This blog entry was constructed to reply to the conversation that is going on in the issue of the "Block Universe."


    See:

  • Penrose and Quanglement

  • Entanglement and the New Physics




  • In the past, teleportation has only been possible with particles of light Image: Rainer Blatt



It's useless sometimes to just lie there while these thoughts accumulate in one's mind, as one weaves together the picture that is forming, and whence it comes, this unification process; and after a time, one then thinks about the abilities of mind to gather and consolidate.


By taking advantage of quantum phenomena such as entanglement, teleportation and superposition, a quantum computer could, in principle, outperform a classical computer in certain computational tasks. Entanglement allows particles to have a much closer relationship than is possible in classical physics. For example, two photons can be entangled such that if one is horizontally polarized, the other is always vertically polarized, and vice versa, no matter how far apart they are. In quantum teleportation, complete information about the quantum state of a particle is instantaneously transferred by the sender, who is usually called Alice, to a receiver called Bob. Quantum superposition, meanwhile, allows a particle to be in two or more quantum states at the same time.
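The polarization example in the quotation corresponds to a two-photon Bell state; a minimal numerical illustration with numpy (H/V taken as the computational basis, purely for demonstration):

    import numpy as np

    # Bell state (|HV> - |VH>)/sqrt(2) in the basis order HH, HV, VH, VV
    psi = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

    probs = np.abs(psi) ** 2
    for label, p in zip(["HH", "HV", "VH", "VV"], probs):
        print(label, p)
    # HV and VH each occur with probability 1/2; HH and VV never occur,
    # so finding one photon horizontal forces the other to be vertical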


So let me begin by saying that, given this process by which we can connect this world line across the expanse of space, the understanding, more or less, is that this is to be the means by which these new forms of communication in science lead us as we expound the future, and what it shall become in our present moments.



See: Central theme is the Sun. (You can click on the picture, or hover over the image, for additional reading.)

So you look at the Sun, and ask in what new ways we can perceive and accumulate the data of what connects this "distance and time." One will be all the smarter when one realizes that the results of experimental verifications are at present being given; and as such, what shall these examples serve but to remind one that new experiences continue to bring new innovations to the forefront?

Lightcone Projection - see the mathematical basis here for the introduction of what will become the basis of determinations, the "decomposable definition" of these new forms of communication.

The basis for these thoughts is the developing view based on the light cone. It was not by my reasoning alone that such an idea was used to support a conjecture, so for these very reasons I thought it best to explain what such simultaneity can do as we hold these views about "distance and time" and follow this world line across the expanse of the universe.

The grey ellipse is a moving relativistic sphere, its oblate shape due to Lorentz contraction. The colored ellipse is the visual image of the sphere. The background curves are an xy-coordinate grid rigidly linked to the sphere; it is shown only at one moment in time. See here for reference and animations.
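The oblateness in this caption is governed by the Lorentz factor γ = 1/√(1 − β²): the diameter along the direction of motion contracts to 1/γ of its rest value. A quick illustration, assuming a speed of 0.9c (my own example, not from the linked animation):

    import math

    def gamma(beta):
        # Lorentz factor for speed v = beta * c
        return 1.0 / math.sqrt(1.0 - beta ** 2)

    beta = 0.9
    print(gamma(beta))        # ~2.29
    print(1.0 / gamma(beta))  # diameter along the motion shrinks to ~0.44 of rest value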

Okay, so we have this event that happens in time. How are we to measure what the Sun is supposed to be, if we did not have some information about the depth of perception that is needed in order to create this image for consumption?

Such comparative views are needed that are current and "in experimental stages," to help us discern what it means for "Galactic Communication," which we will employ as we measure the distance of this world line. :)

Such distances "can be elevated in my view," and such instantaneous recognitions are to be the associative values I place on how we can now see the "bulk perspective" and the graviton condensation we can now assign to the cosmos.

    As we know from Einstein’s theory of special relativity, nothing can travel faster than c, the velocity of light in a vacuum. The speed of the light that we see generally travels with a slower velocity c/n where n is the refractive index of the medium through which we view the light (in air at sea level, n is approximately 1.00029 whereas in water n is 1.33). Highly energetic, charged particles (which are only constrained to travel slower than c) tend to radiate photons when they pass through a medium and, consequently, can suddenly find themselves in the embarrassing position of actually travelling faster than the light they produce!

The result of this can be illustrated by considering a moving particle which emits pulses of light that expand like ripples on a pond, as shown in the Figure (right). By the time the particle is at the position indicated by the purple spot, the spherical shell of light emitted when the particle was in the blue position will have expanded to the radius indicated by the open blue circle. Likewise, the light emitted when the particle was in the green position will have expanded to the radius indicated by the open green circle, and so on. Notice that these ripples overlap with each other to form an enhanced cone of light indicated by the dotted lines. This is analogous to the idea that leads to a sonic boom when planes such as Concorde travel faster than the speed of sound in air.
See: What is Cerenkov Radiation?
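The enhanced cone described above has an opening angle fixed by cos θ = 1/(nβ), the optical analogue of the Mach-cone relation; a quick sketch for water (n = 1.33), offered as a worked example rather than part of the quoted article:

    import math

    def cherenkov_angle_deg(n, beta):
        # Cherenkov emission angle from cos(theta) = 1/(n*beta);
        # radiation occurs only when beta > 1/n
        x = 1.0 / (n * beta)
        return math.degrees(math.acos(x)) if x <= 1.0 else None

    n_water = 1.33
    print(1.0 / n_water)                        # threshold speed, ~0.752c in water
    print(cherenkov_angle_deg(n_water, 0.999))  # ~41 degrees for a near-light-speed particle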

It is thus that such events in time produce information for us, which helps us to look at the universe in new ways; and as such, information can be used to build new devices that penetrate beyond the confines we find photons experience in their limitations. (Please, Phil, take note of the part in bold.)

    Monday, May 21, 2012

    Digital Physics

    In physics and cosmology, digital physics is a collection of theoretical perspectives based on the premise that the universe is, at heart, describable by information, and is therefore computable. Therefore, the universe can be conceived as either the output of a computer program or as a vast, digital computation device (or, at least, mathematically isomorphic to such a device).

Digital physics is grounded in one or more of the following hypotheses, listed in order of increasing strength. The universe, or reality:

  • is essentially informational (although not every informational ontology needs to be digital)
  • is essentially computable (the pancomputationalist position)
  • can be described digitally
  • is in essence digital
  • is itself a computer (pancomputationalism)
  • is the output of a simulated reality exercise

    History

     

    Every computer must be compatible with the principles of information theory, statistical thermodynamics, and quantum mechanics. A fundamental link among these fields was proposed by Edwin Jaynes in two seminal 1957 papers.[1] Moreover, Jaynes elaborated an interpretation of probability theory as generalized Aristotelian logic, a view very convenient for linking fundamental physics with digital computers, because these are designed to implement the operations of classical logic and, equivalently, of Boolean algebra.[2]
    The hypothesis that the universe is a digital computer was pioneered by Konrad Zuse in his book Rechnender Raum (translated into English as Calculating Space). The term digital physics was first employed by Edward Fredkin, who later came to prefer the term digital philosophy.[3] Others who have modeled the universe as a giant computer include Stephen Wolfram,[4] Juergen Schmidhuber,[5] and Nobel laureate Gerard 't Hooft.[6] These authors hold that the apparently probabilistic nature of quantum physics is not necessarily incompatible with the notion of computability. Quantum versions of digital physics have recently been proposed by Seth Lloyd,[7] David Deutsch, and Paola Zizzi.[8]

    Related ideas include Carl Friedrich von Weizsäcker's binary theory of ur-alternatives, pancomputationalism, computational universe theory, John Archibald Wheeler's "It from bit", and Max Tegmark's ultimate ensemble.

    Overview

     

    Digital physics suggests that there exists, at least in principle, a program for a universal computer which computes the evolution of the universe. The computer could be, for example, a huge cellular automaton (Zuse 1967[9]), or a universal Turing machine, as suggested by Schmidhuber (1997), who pointed out that there exists a very short program that can compute all possible computable universes in an asymptotically optimal way.

Some try to identify single physical particles with simple bits. For example, if one particle, such as an electron, is switching from one quantum state to another, it may be the same as if a bit is changed from one value (0, say) to the other (1). A single bit suffices to describe a single quantum switch of a given particle. As the universe appears to be composed of elementary particles whose behavior can be completely described by the quantum switches they undergo, that implies that the universe as a whole can be described by bits. Every state is information, and every change of state is a change in information (requiring the manipulation of one or more bits). Setting aside dark matter and dark energy, which are poorly understood at present, the known universe consists of about 10⁸⁰ protons and the same number of electrons. Hence, the universe could be simulated by a computer capable of storing and manipulating about 10⁹⁰ bits. If such a simulation is indeed the case, then hypercomputation would be impossible.

    Loop quantum gravity could lend support to digital physics, in that it assumes space-time is quantized. Paola Zizzi has formulated a realization of this concept in what has come to be called "computational loop quantum gravity", or CLQG.[10][11] Other theories that combine aspects of digital physics with loop quantum gravity are those of Marzuoli and Rasetti[12][13] and Girelli and Livine.[14]

    Weizsäcker's ur-alternatives

     

    Physicist Carl Friedrich von Weizsäcker's theory of ur-alternatives (archetypal objects), first publicized in his book The Unity of Nature (1980),[15] further developed through the 1990s,[16][17] is a kind of digital physics as it axiomatically constructs quantum physics from the distinction between empirically observable, binary alternatives. Weizsäcker used his theory to derive the 3-dimensionality of space and to estimate the entropy of a proton falling into a black hole.

    Pancomputationalism or the computational universe theory

     

    Pancomputationalism (also known as pan-computationalism, naturalist computationalism) is a view that the universe is a huge computational machine, or rather a network of computational processes which, following fundamental physical laws, computes (dynamically develops) its own next state from the current one.[18]
    A computational universe is proposed by Jürgen Schmidhuber in a paper based on Konrad Zuse's assumption (1967) that the history of the universe is computable. He pointed out that the simplest explanation of the universe would be a very simple Turing machine programmed to systematically execute all possible programs computing all possible histories for all types of computable physical laws. He also pointed out that there is an optimally efficient way of computing all computable universes based on Leonid Levin's universal search algorithm (1973). In 2000 he expanded this work by combining Ray Solomonoff's theory of inductive inference with the assumption that quickly computable universes are more likely than others. This work on digital physics also led to limit-computable generalizations of algorithmic information or Kolmogorov complexity and the concept of Super Omegas, which are limit-computable numbers that are even more random (in a certain sense) than Gregory Chaitin's number of wisdom Omega.

    Wheeler's "it from bit"

     

    Following Jaynes and Weizsäcker, the physicist John Archibald Wheeler wrote the following:

    [...] it is not unreasonable to imagine that information sits at the core of physics, just as it sits at the core of a computer. (John Archibald Wheeler 1998: 340)

    It from bit. Otherwise put, every 'it'—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. 'It from bit' symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes–no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe. (John Archibald Wheeler 1990: 5)

    David Chalmers of the Australian National University summarised Wheeler's views as follows:

    Wheeler (1990) has suggested that information is fundamental to the physics of the universe. According to this 'it from bit' doctrine, the laws of physics can be cast in terms of information, postulating different states that give rise to different effects without actually saying what those states are. It is only their position in an information space that counts. If so, then information is a natural candidate to also play a role in a fundamental theory of consciousness. We are led to a conception of the world on which information is truly fundamental, and on which it has two basic aspects, corresponding to the physical and the phenomenal features of the world.[19]

    Chris Langan also builds upon Wheeler's views in his epistemological metatheory:

    The Future of Reality Theory According to John Wheeler: In 1979, the celebrated physicist John Wheeler, having coined the phrase “black hole”, put it to good philosophical use in the title of an exploratory paper, Beyond the Black Hole, in which he describes the universe as a self-excited circuit. The paper includes an illustration in which one side of an uppercase U, ostensibly standing for Universe, is endowed with a large and rather intelligent-looking eye intently regarding the other side, which it ostensibly acquires through observation as sensory information. By dint of placement, the eye stands for the sensory or cognitive aspect of reality, perhaps even a human spectator within the universe, while the eye’s perceptual target represents the informational aspect of reality. By virtue of these complementary aspects, it seems that the universe can in some sense, but not necessarily that of common usage, be described as “conscious” and “introspective”…perhaps even “infocognitive”.[20]

    The first formal presentation of the idea that information might be the fundamental quantity at the core of physics seems to be due to Frederick W. Kantor (a physicist from Columbia University). Kantor's book Information Mechanics (Wiley-Interscience, 1977) developed this idea in detail, but without mathematical rigor.

    The toughest nut to crack in Wheeler's research program of a digital dissolution of physical being in a unified physics, Wheeler himself says, is time. In a 1986 eulogy to the mathematician, Hermann Weyl, he proclaimed: "Time, among all concepts in the world of physics, puts up the greatest resistance to being dethroned from ideal continuum to the world of the discrete, of information, of bits. ... Of all obstacles to a thoroughly penetrating account of existence, none looms up more dismayingly than 'time.' Explain time? Not without explaining existence. Explain existence? Not without explaining time. To uncover the deep and hidden connection between time and existence ... is a task for the future."[21] The Australian phenomenologist, Michael Eldred, comments:

    The antinomy of the continuum, time, in connection with the question of being ... is said by Wheeler to be a cause for dismay which challenges future quantum physics, fired as it is by a will to power over moving reality, to "achieve four victories" (ibid.)... And so we return to the challenge to "[u]nderstand the quantum as based on an utterly simple and—when we see it—completely obvious idea" (ibid.) from which the continuum of time could be derived. Only thus could the will to mathematically calculable power over the dynamics, i.e. the movement in time, of beings as a whole be satisfied.[22][23]

    Digital vs. informational physics

     

    Not every informational approach to physics (or ontology) is necessarily digital. According to Luciano Floridi,[24] "informational structural realism" is a variant of structural realism that supports an ontological commitment to a world consisting of the totality of informational objects dynamically interacting with each other. Such informational objects are to be understood as constraining affordances.

    Digital ontology and pancomputationalism are also independent positions. In particular, John Wheeler advocated the former but was silent about the latter; see the quote in the preceding section.
    On the other hand, pancomputationalists like Lloyd (2006), who models the universe as a quantum computer, can still maintain an analogue or hybrid ontology; and informational ontologists like Sayre and Floridi embrace neither a digital ontology nor a pancomputationalist position.[25]

    Computational foundations

     

    Turing machines

     

    Theoretical computer science is founded on the Turing machine, an imaginary computing machine first described by Alan Turing in 1936. While mechanically simple, the Church-Turing thesis implies that a Turing machine can solve any "reasonable" problem. (In theoretical computer science, a problem is considered "solvable" if it can be solved in principle, namely in finite time, which is not necessarily a finite time that is of any value to humans.) A Turing machine therefore sets the practical "upper bound" on computational power, apart from the possibilities afforded by hypothetical hypercomputers.

    Wolfram's principle of computational equivalence powerfully motivates the digital approach. This principle, if correct, means that everything can be computed by one essentially simple machine, the realization of a cellular automaton. This is one way of fulfilling a traditional goal of physics: finding simple laws and mechanisms for all of nature.
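To make the idea concrete, here is a minimal elementary cellular automaton in Python, using Rule 110 (which is known to be Turing-complete); a toy sketch of the kind of machine the paragraph describes, not anything Wolfram himself supplies:

    def step(cells, rule=110):
        # one synchronous update of an elementary cellular automaton,
        # with wraparound at the edges
        n = len(cells)
        return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
                for i in range(n)]

    row = [0] * 31 + [1] + [0] * 31   # a single live cell in the middle
    for _ in range(16):
        print("".join(".#"[c] for c in row))
        row = step(row)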

    Digital physics is falsifiable in that a less powerful class of computers cannot simulate a more powerful class. Therefore, if our universe is a gigantic simulation, that simulation is being run on a computer at least as powerful as a Turing machine. If humans succeed in building a hypercomputer, then a Turing machine cannot have the power required to simulate the universe.

    The Church–Turing (Deutsch) thesis

     

    The classic Church–Turing thesis claims that any computer as powerful as a Turing machine can, in principle, calculate anything that a human can calculate, given enough time. A stronger version, not attributable to Church or Turing,[26] claims that a universal Turing machine can compute anything any other Turing machine can compute - that it is a generalizable Turing machine. But the limits of practical computation are set by physics, not by theoretical computer science:

    "Turing did not show that his machines can solve any problem that can be solved 'by instructions, explicitly stated rules, or procedures', nor did he prove that the universal Turing machine 'can compute any function that any computer, with any architecture, can compute'. He proved that his universal machine can compute any function that any Turing machine can compute; and he put forward, and advanced philosophical arguments in support of, the thesis here called Turing's thesis. But a thesis concerning the extent of effective methods—which is to say, concerning the extent of procedures of a certain sort that a human being unaided by machinery is capable of carrying out—carries no implication concerning the extent of the procedures that machines are capable of carrying out, even machines acting in accordance with 'explicitly stated rules.' For among a machine's repertoire of atomic operations there may be those that no human being unaided by machinery can perform." [27]

    On the other hand, if two further conjectures are made, along the lines that:

    • hypercomputation always involves actual infinities;
    • there are no actual infinities in physics,

    the resulting compound principle does bring practical computation within Turing's limits.
    As David Deutsch puts it:

    "I can now state the physical version of the Church-Turing principle: 'Every finitely realizable physical system can be perfectly simulated by a universal model computing machine operating by finite means.' This formulation is both better defined and more physical than Turing's own way of expressing it."[28] (Emphasis added)

    This compound conjecture is sometimes called the "strong Church-Turing thesis" or the Church–Turing–Deutsch principle.

    Criticism

     

    The critics of digital physics—including physicists[citation needed] who work in quantum mechanics—object to it on several grounds.

    Physical symmetries are continuous

     

    One objection is that extant models of digital physics are incompatible[citation needed] with the existence of several continuous characters of physical symmetries, e.g., rotational symmetry, translational symmetry, Lorentz symmetry, and electroweak symmetry, all central to current physical theory.

    Proponents of digital physics claim that such continuous symmetries are only convenient (and very good) approximations of a discrete reality. For example, the reasoning leading to systems of natural units and the conclusion that the Planck length is a minimum meaningful unit of distance suggests that at some level space itself is quantized.[29]

    Locality

     

    Some argue[citation needed] that extant models of digital physics violate various postulates of quantum physics. For example, if these models are not grounded in Hilbert spaces and probabilities, they belong to the class of theories with local hidden variables that some deem ruled out experimentally using Bell's theorem. This criticism has two possible answers. First, any notion of locality in the digital model does not necessarily have to correspond to locality formulated in the usual way in the emergent spacetime. A concrete example of this case was recently given by Lee Smolin.[30] Another possibility is a well-known loophole in Bell's theorem known as superdeterminism (sometimes referred to as predeterminism).[31] In a completely deterministic model, the experimenter's decision to measure certain components of the spins is predetermined. Thus, the assumption that the experimenter could have decided to measure different components of the spins than he actually did is, strictly speaking, not true.

    Physical theory requires the continuum

     

    It has been argued[weasel words] that digital physics, grounded in the theory of finite state machines and hence discrete mathematics, cannot do justice to a physical theory whose mathematics requires the real numbers, which is the case for all physical theories having any credibility.

    But computers can manipulate and solve formulas describing real numbers using symbolic computation, thus avoiding the need to approximate real numbers by using an infinite number of digits.

    Before symbolic computation, a number—in particular a real number, one with an infinite number of digits—was said to be computable if a Turing machine will continue to spit out digits endlessly. In other words, there is no "last digit". But this sits uncomfortably with any proposal that the universe is the output of a virtual-reality exercise carried out in real time (or any plausible kind of time). Known physical laws (including quantum mechanics and its continuous spectra) are very much infused with real numbers and the mathematics of the continuum.

    "So ordinary computational descriptions do not have a cardinality of states and state space trajectories that is sufficient for them to map onto ordinary mathematical descriptions of natural systems. Thus, from the point of view of strict mathematical description, the thesis that everything is a computing system in this second sense cannot be supported".[32]

For his part, David Deutsch generally takes a "multiverse" view to the question of continuous vs. discrete. In short, he thinks that "within each universe all observable quantities are discrete, but the multiverse as a whole is a continuum. When the equations of quantum theory describe a continuous but not-directly-observable transition between two values of a discrete quantity, what they are telling us is that the transition does not take place entirely within one universe. So perhaps the price of continuous motion is not an infinity of consecutive actions, but an infinity of concurrent actions taking place across the multiverse." (January 2001, "The Discrete and the Continuous"; an abridged version appeared in The Times Higher Education Supplement.)

    References

     

    1. ^ Jaynes, E. T., 1957, "Information Theory and Statistical Mechanics," Phys. Rev 106: 620.
      Jaynes, E. T., 1957, "Information Theory and Statistical Mechanics II," Phys. Rev. 108: 171.
    2. ^ Jaynes, E. T., 1990, "Probability Theory as Logic," in Fougere, P.F., ed., Maximum-Entropy and Bayesian Methods. Boston: Kluwer.
    3. ^ See Fredkin's Digital Philosophy web site.
    4. ^ A New Kind of Science website. Reviews of ANKS.
    5. ^ Schmidhuber, J., "Computer Universes and an Algorithmic Theory of Everything."
    6. ^ G. 't Hooft, 1999, "Quantum Gravity as a Dissipative Deterministic System," Class. Quant. Grav. 16: 3263-79.
    7. ^ Lloyd, S., "The Computational Universe: Quantum gravity from quantum computation."
    8. ^ Zizzi, Paola, "Spacetime at the Planck Scale: The Quantum Computer View."
    9. ^ Zuse, Konrad, 1967, Elektronische Datenverarbeitung vol 8., pages 336-344
    10. ^ Zizzi, Paola, "A Minimal Model for Quantum Gravity."
    11. ^ Zizzi, Paola, "Computability at the Planck Scale."
    12. ^ Marzuoli, A. and Rasetti, M., 2002, "Spin Network Quantum Simulator," Phys. Lett. A306, 79-87.
    13. ^ Marzuoli, A. and Rasetti, M., 2005, "Computing Spin Networks," Annals of Physics 318: 345-407.
    14. ^ Girelli, F.; Livine, E. R., 2005, "[1]" Class. Quant. Grav. 22: 3295-3314.
    15. ^ von Weizsäcker, Carl Friedrich (1980). The Unity of Nature. New York: Farrar, Straus, and Giroux.
    16. ^ von Weizsäcker, Carl Friedrich (1985) (in German). Aufbau der Physik [The Structure of Physics]. Munich. ISBN 3-446-14142-1.
    17. ^ von Weizsäcker, Carl Friedrich (1992) (in German). Zeit und Wissen.
18. ^ Papers on pancomputationalism
    19. ^ Chalmers, David. J., 1995, "Facing up to the Hard Problem of Consciousness," Journal of Consciousness Studies 2(3): 200-19. This paper cites John A. Wheeler, 1990, "Information, physics, quantum: The search for links" in W. Zurek (ed.) Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison-Wesley. Also see Chalmers, D., 1996. The Conscious Mind. Oxford Univ. Press.
    20. ^ Langan, Christopher M., 2002, "The Cognitive-Theoretic Model of the Universe: A New Kind of Reality Theory, pg. 7" Progress in Complexity, Information and Design
    21. ^ Wheeler, John Archibald, 1986, "Hermann Weyl and the Unity of Knowledge"
    22. ^ Eldred, Michael, 2009, 'Postscript 2: On quantum physics' assault on time'
    23. ^ Eldred, Michael, 2009, The Digital Cast of Being: Metaphysics, Mathematics, Cartesianism, Cybernetics, Capitalism, Communication ontos, Frankfurt 2009 137 pp. ISBN 978-3-86838-045-3
    24. ^ Floridi, L., 2004, "Informational Realism," in Weckert, J., and Al-Saggaf, Y, eds., Computing and Philosophy Conference, vol. 37."
    25. ^ See Floridi talk on Informational Nature of Reality, abstract at the E-CAP conference 2006.
    26. ^ B. Jack Copeland, Computation in Luciano Floridi (ed.), The Blackwell guide to the philosophy of computing and information, Wiley-Blackwell, 2004, ISBN 0-631-22919-1, pp. 10-15
    27. ^ Stanford Encyclopedia of Philosophy: "The Church-Turing thesis" -- by B. Jack Copeland.
    28. ^ David Deutsch, "Quantum Theory, the Church-Turing Principle and the Universal Quantum Computer."
    29. ^ John A. Wheeler, 1990, "Information, physics, quantum: The search for links" in W. Zurek (ed.) Complexity, Entropy, and the Physics of Information. Redwood City, CA: Addison-Wesley.
    30. ^ L. Smolin, "Matrix models as non-local hidden variables theories."
    31. ^ J. S. Bell, 1981, "Bertlmann's socks and the nature of reality," Journal de Physique 42 C2: 41-61.
    32. ^ Piccinini, Gualtiero, 2007, "Computational Modelling vs. Computational Explanation: Is Everything a Turing Machine, and Does It Matter to the Philosophy of Mind?" Australasian Journal of Philosophy 85(1): 93-115.

     
