Monday, March 22, 2010

The Law of Octaves


Dmitri Ivanovich Mendeleev (also romanized Mendeleyev or Mendeleef; Russian: Дми́трий Ива́нович Менделе́ев) (8 February [O.S. 27 January] 1834 – 2 February [O.S. 20 January] 1907) was a Russian chemist and inventor. He is credited as being the creator of the first version of the periodic table of elements. Using the table, he predicted the properties of elements yet to be discovered.

This post was inspired by the "Poll: Do you believe in extraterrestrial life?"

I was thinking about Hoyle and CNO and of course about Lee Smolin. As I get older it is harder to retain all that preceded each discussion, so linking helps to refresh.

Carbon is the 15th most abundant element in the Earth's crust, and the fourth most abundant element in the universe by mass after hydrogen, helium, and oxygen. It is present in all known lifeforms, and in the human body carbon is the second most abundant element by mass (about 18.5%) after oxygen.[14] This abundance, together with the unique diversity of organic compounds and their unusual polymer-forming ability at the temperatures commonly encountered on Earth, makes this element the chemical basis of all known life. See: Carbon

It was necessary to recall the links from one to the other, to show how one's perception "about Carbon was drawn" into the discussion about what life in the cosmos is based on.



I am partial to the allotropic expression, on a physical scale, of something that was understood as the Law of Octaves.

Carbon forms the backbone of biology for all life on Earth. Complex molecules are made up of carbon bonded with other elements, especially oxygen, hydrogen and nitrogen. It is these elements that living organisms need, among others, and carbon is able to bond with all of these because of its four valence electrons. Since no life has been observed that is not carbon-based, it is sometimes assumed in astrobiology that life elsewhere in the universe will also be carbon-based. This assumption is referred to by critics as carbon chauvinism, as it may be possible for life to form that is not based on carbon, even though it has never been observed. See: Carbon-based life


For sure each round of discussion on the topic leads to... and there are ideas about which one can ask. I am in no way advocating anything here in terms of the AP (Anthropic Principle) other than to remember discussions about this very poll question before.

The triple alpha process is highly dependent on carbon-12 having a resonance with the same energy as helium-4 and beryllium-8, and before 1952 no such energy level was known. It was astrophysicist Fred Hoyle who used the fact that carbon-12 is so abundant in the universe (and that our existence depends upon it - the Anthropic Principle) as evidence for the existence of the carbon-12 resonance. Fred suggested the idea to nuclear physicist Willy Fowler, who conceded that it was possible that this energy level had been missed in previous work on carbon-12. After a brief undertaking by his research group, they discovered a resonance near 7.65 MeV. See: Triple-alpha process
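As a rough check of the numbers in that passage, the sketch below (my own illustration, using standard tabulated atomic masses rather than anything from the post) compares the rest energy of three free alpha particles with the carbon-12 ground state, which is what makes a resonance near 7.65 MeV so convenient for the process.

# Rough energy bookkeeping for the triple-alpha process (a sketch, not the
# blog author's calculation). The atomic masses are standard tabulated values;
# the 7.65 MeV figure is the resonance energy quoted above.
U_TO_MEV = 931.494          # conversion factor, MeV per atomic mass unit
m_he4 = 4.002602            # atomic mass of helium-4 in u
m_c12 = 12.000000           # atomic mass of carbon-12 in u (by definition)

# Energy of three free alpha particles relative to the carbon-12 ground state.
threshold = (3 * m_he4 - m_c12) * U_TO_MEV
print(f"3-alpha threshold above C-12 ground state: {threshold:.2f} MeV")   # ~7.27 MeV

# The resonance sits just above that threshold, which is why a state
# "near 7.65 MeV" lets the fusion proceed at a useful rate.
hoyle_state = 7.65
print(f"resonance lies {hoyle_state - threshold:.2f} MeV above threshold")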

*** 
 

Mass spectrometers are analytical instruments that determine atomic and molecular masses with great accuracy. Low-pressure vapors of elements or molecules are hit by a beam of rapidly moving electrons. The collision knocks an electron off the sample atom or molecule, leaving it positively charged.
These newly-formed ions are accelerated out of the ionization chamber by an electric field. The speeds to which the ions can be accelerated by the electric field are determined by their masses. Lighter ions can go faster than heavier ones.


Ion's path bent by external magnetic field (Courtesy: McREL)
The beam of positively-charged ions generates a slight magnetic field that interacts with an externally-applied magnetic field. The net result is that the trajectory of a charged particle is curved to an extent that depends on its speed (determined by its mass). When the beam of a mixture of isotopes of different masses falls on a photographic plate, the different isotopes converge at different points, corresponding to the different radii of their semicircular paths.
The mathematical equation that describes this phenomenon is m/e = H²r²/(2V), where m is the mass of the ion, e is the charge of the ion, H is the magnetic field strength, r is the radius of the semicircle, and V is the accelerating potential.
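To make the relation concrete, here is a small sketch (my own, with illustrative numbers, nothing from the source) that solves the same equation for the radius of the semicircular path, r = sqrt(2Vm/e)/H, and shows how two carbon isotopes land at slightly different radii on the plate.

import math

# Minimal sketch of the mass-spectrometer relation m/e = H^2 r^2 / (2V),
# rearranged for the path radius. Field and voltage values are illustrative.
def path_radius(mass_kg, charge_c, field_t, accel_volts):
    """Radius r = sqrt(2 V m / e) / H of the ion's semicircular path."""
    return math.sqrt(2 * accel_volts * mass_kg / charge_c) / field_t

e = 1.602e-19           # elementary charge, C
amu = 1.661e-27         # atomic mass unit, kg

# Two carbon isotopes, singly charged, same accelerating potential and field.
for a in (12, 13):
    r = path_radius(a * amu, e, field_t=0.5, accel_volts=2000.0)
    print(f"carbon-{a}: r = {r * 100:.2f} cm")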
***
See Also:

Friday, March 19, 2010

Neutrinoless Double Beta Decay

"You don't see what you're seeing until you see it," Dr. Thurston said, "but when you do see it, it lets you see many other things." See: Elusive Proof, Elusive Prover: A New Mathematical Mystery


The Enriched Xenon Observatory is an experiment in particle physics aiming to detect "neutrino-less double beta decay" using large amounts of xenon isotopically enriched in the isotope 136. A 200-kg detector using liquid Xe is currently being installed at the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico. Many research and development efforts are underway for a ton-scale experiment, with the goal of probing new physics and the mass of the neutrino. See: The Enriched Xenon Observatory

***
Feynman diagram of neutrinoless double-beta decay, with two neutrons decaying to two protons. The only emitted products in this process are two electrons, which can only occur if the neutrino and antineutrino are the same particle (i.e. Majorana neutrinos) so the same neutrino can be emitted and absorbed within the nucleus. In conventional double-beta decay, two antineutrinos - one arising from each W vertex - are emitted from the nucleus, in addition to the two electrons. The detection of neutrinoless double-beta decay is thus a sensitive test of whether neutrinos are Majorana particles.
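Written out in standard notation (my restatement of the caption above, not a quotation from it), the two decay modes compare as follows; the last line uses the xenon-136 isotope that EXO is enriched in.

\begin{align*}
  \text{conventional double-beta decay:} \quad & (A,Z) \;\to\; (A,Z{+}2) + 2e^- + 2\bar{\nu}_e \\
  \text{neutrinoless double-beta decay:} \quad & (A,Z) \;\to\; (A,Z{+}2) + 2e^- \\
  \text{e.g. for EXO:} \quad & {}^{136}\mathrm{Xe} \;\to\; {}^{136}\mathrm{Ba} + 2e^- \;(+\,2\bar{\nu}_e)
\end{align*}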


Neutrinoless double-beta decay experiments

Numerous experiments have been carried out to search for neutrinoless double-beta decay. Some recent and proposed future experiments include:

See:Direct Dark Matter Detection


 See Also: South Dakota's LUX will join the dark matter wars

Tuesday, March 16, 2010

Frogs, Foam and Fuel: UC Researchers Convert Solar Energy to Sugars



Illustration by Megan Gundrum, fifth-year DAAP student

For decades, farmers have been trying to find ways to get more energy out of the sun.

In natural photosynthesis, plants take in solar energy and carbon dioxide and then convert it to oxygen and sugars. The oxygen is released to the air and the sugars are dispersed throughout the plant — like that sweet corn we look for in the summer. Unfortunately, the allocation of light energy into products we use is not as efficient as we would like. Now engineering researchers at the University of Cincinnati are doing something about that.
See:Frogs, Foam and Fuel: UC Researchers Convert Solar Energy to Sugars

***

I get very excited when I see ideas like this.


See: When It Comes to Photosynthesis, Plants Perform Quantum Computation
Plants soak up some of the 10¹⁷ joules of solar energy that bathe Earth each second, harvesting as much as 95 percent of it from the light they absorb. The transformation of sunlight into carbohydrates takes place in one million billionths of a second, preventing much of that energy from dissipating as heat. But exactly how plants manage this nearly instantaneous trick has remained elusive. Now biophysicists at the University of California, Berkeley, have shown that plants use the basic principle of quantum computing—the exploration of a multiplicity of different answers at the same time—to achieve near-perfect efficiency.

Biophysicist Gregory Engel and his colleagues cooled a green sulfur bacterium—Chlorobium tepidum, one of the oldest photosynthesizers on the planet—to 77 kelvins [–321 degrees Fahrenheit] and then pulsed it with extremely short bursts of laser light. By manipulating these pulses, the researchers could track the flow of energy through the bacterium's photosynthetic system. "We always thought of it as hopping through the system, the same way that you or I might run through a maze of bushes," Engel explains. "But, instead of coming to an intersection and going left or right, it can actually go in both directions at once and explore many different paths most efficiently."

In other words, plants are employing the basic principles of quantum mechanics to transfer energy from chromophore (photosynthetic molecule) to chromophore until it reaches the so-called reaction center where photosynthesis, as it is classically defined, takes place. The particles of energy are behaving like waves. "We see very strong evidence for a wavelike motion of energy through these photosynthetic complexes," Engel says. The results appear in the current issue of Nature.

QUANTUM CHLOROPHYLL: Sunlight triggers wave-like motion in green chlorophyll, embedded in a protein structure, depicted in gray here, that guides its function. GREGORY ENGEL

Employing this process allows the near-perfect efficiency of plants in harvesting energy from sunlight and is likely to be used by all of them, Engel says. It might also be copied usefully by researchers attempting to create artificial photosynthesis, such as that in photovoltaic cells for generating electricity. "This can be a much more efficient energy transfer than a classical hopping one," Engel says. "Exactly how to implement that is a very difficult question."

It also remains unclear exactly how a plant's structure permits this quantum effect to take place. "[The protein structure] of the plant has to be tuned to allow transfer among chromophores but not to allow transfers into [heat]," Engel says. "How that tuning works and how it is controlled, we don't know." Inside every spring leaf is a system capable of performing a speedy and efficient quantum computation, and therein lies the key to much of the energy on Earth.

***

Wednesday, March 03, 2010

Neutron interferometer

Lubos Motl:
You have completely misunderstood the neutron gravitational interference experiment. They showed that the force acting on the neutron is simply not negligible. Quite on the contrary, these interference experiments could measure and did measure the gravitational acceleration - and even the tidal forces - on the phase shift of the neutron's wave function. It's the very point of these experiments.

So whatever theory predicts that such forces are "negligible" is instantly falsified.
***
From Wikipedia
In physics, a neutron interferometer is an interferometer capable of diffracting neutrons, allowing the wave-like nature of neutrons, and other related phenomena, to be explored.
Interferometry inherently depends on the wave nature of the object. As pointed out by de Broglie in his PhD-thesis, particles, including neutrons, can behave like waves (the so called wave-particle duality, now explained in the general framework of quantum mechanics). The wave functions of the individual interferometer paths are created and recombined coherently which needs the application of dynamical theory of diffraction. Neutron interferometers are the counterpart of X-ray interferometers and are used to study quantities or benefits related to thermal neutron radiation.

Neutron interferometers are used to determine minute quantum-mechanical effects on the neutron wave, such as studies of the
  • Aharonov-Bohm effect
  • gravity acting on an elementary particle, the neutron
  • rotation of the earth acting on a quantum system
Like X-ray interferometers, neutron interferometers are typically carved from a single large crystal of silicon, often 10 to 30 or more centimeters in diameter and 20 to 60 or more centimeters in length. Modern semiconductor technology allows large single-crystal silicon boules to be easily grown. Since the boule is a single crystal, the atoms in the boule are precisely aligned, to within small fractions of a nanometer or an angstrom, over the entire boule. The interferometer is created by carving away all but three slices of silicon, held in perfect alignment by a base. Neutrons impinge on the first slice, where, by diffraction from the crystalline lattice, they separate into two beams. At the second slice, they are diffracted again, with two beams continuing on to the third slice. At the third slice, the beams recombine, interfering constructively or destructively, completing the interferometer. Without the precise, angstrom-level alignment of the three slices, the interference results would not be meaningful.

Only recently, a neutron interferometer for cold and ultracold neutrons was designed and successfully run. In this case, three artificial gratings, produced holographically (i.e., by means of a light-optic two-wave interference setup illuminating a photo-neutronrefractive polymer), are employed as the neutron optical components.


***

On the Origin of Gravity and the Laws of Newton

Starting from first principles and general assumptions Newton's law of gravitation is shown to arise naturally and unavoidably in a theory in which space is emergent through a holographic scenario. Gravity is explained as an entropic force caused by changes in the information associated with the positions of material bodies. A relativistic generalization of the presented arguments directly leads to the Einstein equations. When space is emergent even Newton's law of inertia needs to be explained. The equivalence principle leads us to conclude that it is actually this law of inertia whose origin is entropic.
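For readers who want the argument in symbols, here is a compressed sketch of the entropic-force logic as it is usually presented (my paraphrase of the abstract's claim, not text from the paper): an entropy change associated with a holographic screen, combined with the Unruh temperature, yields F = ma, and equipartition of the screen's degrees of freedom recovers Newton's law.

\begin{align*}
  \Delta S &= 2\pi k_B \,\frac{mc}{\hbar}\,\Delta x, \qquad
  k_B T = \frac{\hbar a}{2\pi c} \\
  F\,\Delta x &= T\,\Delta S \;\Rightarrow\; F = ma \\
  N &= \frac{A c^3}{G\hbar}, \quad E = \tfrac{1}{2} N k_B T = M c^2, \quad A = 4\pi R^2
  \;\Rightarrow\; F = \frac{G M m}{R^2}
\end{align*}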

***


 


The Neutron Interferometry and Optics Facility (NIOF) located in the NIST Center for Neutron Research Guide Hall is one of the world's premier user facilities for neutron interferometry and related neutron optical measurements. A neutron interferometer (NI) splits, then recombines neutron waves. This gives the NI its unique ability to experimentally access the phase of neutron waves. Phase measurements are used to study the magnetic, nuclear, and structural properties of materials, as well as fundamental questions in quantum physics. Related, innovative neutron optical techniques for use in condensed matter and materials science research are being developed.
 ***
 


Neutron Interferometer. A three-blade neutron interferometer, machined from a single crystal silicon ingot, is shown in two views. A monoenergetic neutron beam is split by the first blade and recombined in the third blade. If a sample is introduced in one of the paths, a phase difference in the wave function is produced, and interference between the recombined beams causes count rate shifts of opposite sign in the two detectors.
The Neutron Interferometer Facility in the Cold Neutron Guide Hall became operational in April 1994. It became available as a National User Facility in September 1996. Phase contrast of up to 88 percent and phase stability of better than five milliradians per day were observed. These performance indications are primarily the result of the advanced vibration isolation and environmental control systems. The interferometer operates inside a double walled enclosure, with the inner room built on a 40,000 kg slab which floats on pneumatic pads above an isolated foundation.
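A minimal sketch (my own, not NIST code) of the behaviour described in the caption above: with phase contrast C, the two detectors behind the third blade see count rates varying as 1 ± C·cos(Δφ), so a sample-induced phase shift moves them in opposite directions while their sum stays constant. The 0.88 contrast value is the 88 percent figure quoted above; the mean rate is illustrative.

import numpy as np

# Ideal two-path interferometer: O- and H-beam count rates for a given
# sample-induced phase shift, with finite phase contrast.
def detector_rates(phase_shift, contrast=0.88, mean_rate=1000.0):
    """Return (O-beam, H-beam) count rates for a given phase shift in radians."""
    o_beam = mean_rate * (1 + contrast * np.cos(phase_shift))
    h_beam = mean_rate * (1 - contrast * np.cos(phase_shift))
    return o_beam, h_beam

for phi in np.linspace(0.0, np.pi, 5):
    o, h = detector_rates(phi)
    # The shifts are of opposite sign and the total count rate is conserved.
    print(f"phase = {phi:4.2f} rad   O: {o:7.1f}   H: {h:7.1f}   sum: {o + h:7.1f}")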



See Also: Gravity is Entropy is Gravity is...

Sunday, February 28, 2010

Trivium:Three Roads


 
Logic is the art of thinking; grammar, the art of inventing symbols and combining them to express thought; and rhetoric, the art of communicating thought from one mind to another, the adaptation of language to circumstance. Sister Miriam Joseph

 Painting by Cesare Maccari (1840-1919), Cicero Denounces Catiline.

In medieval universities, the trivium comprised the three subjects taught first: grammar, logic, and rhetoric. The word is a Latin term meaning “the three ways” or “the three roads” forming the foundation of a medieval liberal arts education. This study was preparatory for the quadrivium. The trivium is implicit in the De nuptiis of Martianus Capella, although the term was not used until the Carolingian era when it was coined in imitation of the earlier quadrivium.[1] It was later systematized in part by Petrus Ramus as an essential part of Ramism.


Formal grammar

A formal grammar (sometimes simply called a grammar) is a set of rules of a specific kind, for forming strings in a formal language. The rules describe how to form strings from the language's alphabet that are valid according to the language's syntax. A grammar does not describe the meaning of the strings or what can be done with them in whatever context, only their form.

Formal language theory, the discipline which studies formal grammars and languages, is a branch of applied mathematics. Its applications are found in theoretical computer science, theoretical linguistics, formal semantics, mathematical logic, and other areas.

A formal grammar is a set of rules for rewriting strings, along with a "start symbol" from which rewriting must start. Therefore, a grammar is usually thought of as a language generator. However, it can also sometimes be used as the basis for a "recognizer"—a function in computing that determines whether a given string belongs to the language or is grammatically incorrect. To describe such recognizers, formal language theory uses separate formalisms, known as automata theory. One of the interesting results of automata theory is that it is not possible to design a recognizer for certain formal languages.

Parsing is the process of recognizing an utterance (a string in natural languages) by breaking it down to a set of symbols and analyzing each one against the grammar of the language. Most languages have the meanings of their utterances structured according to their syntax—a practice known as compositional semantics. As a result, the first step to describing the meaning of an utterance in language is to break it down part by part and look at its analyzed form (known as its parse tree in computer science, and as its deep structure in generative grammar).
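As a toy illustration of the generator/recognizer distinction above (my own example, not from the quoted article), the one-rule grammar S → aSb | ε generates the language {aⁿbⁿ}; the sketch below uses it first as a generator and then as a recognizer.

# Toy formal grammar: S -> 'a' S 'b' | ''   (the language { a^n b^n : n >= 0 })

def generate(max_n):
    """Enumerate the strings derivable from S, up to n applications of the rule."""
    return ["a" * n + "b" * n for n in range(max_n + 1)]

def recognize(s):
    """Decide membership in the language without consulting the generator."""
    n = len(s) // 2
    return len(s) % 2 == 0 and s == "a" * n + "b" * n

print(generate(3))                          # ['', 'ab', 'aabb', 'aaabbb']
print(recognize("aabb"), recognize("aab"))  # True False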
Logic

As a discipline, logic dates back to Aristotle, who established its fundamental place in philosophy. The study of logic is part of the classical trivium.

Averroes defined logic as "the tool for distinguishing between the true and the false"[4]; Richard Whately, "the Science, as well as the Art, of reasoning"; and Frege, "the science of the most general laws of truth". The article Definitions of logic provides citations for these and other definitions.

Logic is often divided into two parts, inductive reasoning and deductive reasoning. The first is drawing general conclusions from specific examples, the second drawing logical conclusions from definitions and axioms. A similar dichotomy, used by Aristotle, is analysis and synthesis. Here the first takes an object of study and examines its component parts, the second considers how parts can be combined to form a whole.
Logic is also studied in argumentation theory.[5]


Tuesday, February 23, 2010

Calorimetric Equivalence Principle Test

With Stefan shutting down the blog temporarily, I thought to gather my thoughts here.

Gravitomagnetism

This approximate reformulation of gravitation as described by general relativity makes a "fictitious force" appear in a frame of reference different from a moving, gravitating body. By analogy with electromagnetism, this fictitious force is called the gravitomagnetic force, since it arises in the same way that a moving electric charge creates a magnetic field, the analogous "fictitious force" in special relativity. The main consequence of the gravitomagnetic force, or acceleration, is that a free-falling object near a massive rotating object will itself rotate. This prediction, often loosely referred to as a gravitomagnetic effect, is among the last basic predictions of general relativity yet to be directly tested.
Indirect validations of gravitomagnetic effects have been derived from analyses of relativistic jets. Roger Penrose had proposed a frame dragging mechanism for extracting energy and momentum from rotating black holes.[2] Reva Kay Williams, University of Florida, developed a rigorous proof that validated Penrose's mechanism.[3] Her model showed how the Lense-Thirring effect could account for the observed high energies and luminosities of quasars and active galactic nuclei; the collimated jets about their polar axis; and the asymmetrical jets (relative to the orbital plane).[4] All of those observed properties could be explained in terms of gravitomagnetic effects.[5] Williams’ application of Penrose's mechanism can be applied to black holes of any size.[6] Relativistic jets can serve as the largest and brightest form of validations for gravitomagnetism.
A group at Stanford University is currently analyzing data from the first direct test of GEM, the Gravity Probe B satellite experiment, to see if they are consistent with gravitomagnetism.



While I am not as far along in terms of the organization of your thought process (inexperienced in terms of the education), I am holding the ideas of Mendeleev in mind as I look at this topic you've gathered. And Newton as well, but not in the way one might have deferred to as the basis of gravity research.

It is more on the idea of what we can create in reality given all the elements at our disposal. This is also the same idea in mathematics, that all the information is there and only has to be discovered. Such a hierarchy in thinking is also the idea of geometrical presence stretched to higher dimensions, as one would point to matter assumptions as to a higher order present in the development of the material of earth as a planet.

***

Uncle Al,

Overview:A parity calorimetry test offers a 33,000-fold improvement in EP anomaly sensitivity in only two days of measurements.

we are not so different....that this quest may not be apparent for many, yet it is a simple question about what is contracted to help understand "principles of formation." These had been theoretically developed in terms of the genus figures (Stanley Mandelstam), and we understand that this progression has been mathematically slow.

So we scientifically build this experimental progression.

But indeed, it's a method in terms of moving from "the false vacuum to the true?" What is the momentum called toward materialization?

Such an emergent feature, while discussing some building block model, gives some indication of a "higher order principle" that is not clearly understood, while from a condensed matter theorist's point of view this is an emergent feature?

Best,

Bordeaux, France is 44.83 N

http://www.mazepath.com/uncleal/lajos.htm#b7
***

According to general relativity, the gravitational field produced by a rotating object (or any rotating mass-energy) can, in a particular limiting case, be described by equations that have the same form as the magnetic field in classical electromagnetism. Starting from the basic equation of general relativity, the Einstein field equation, and assuming a weak gravitational field or reasonably flat spacetime, the gravitational analogs to Maxwell's equations for electromagnetism, called the "GEM equations", can be derived. GEM equations compared to Maxwell's equations in SI are:[7] [8][9][10]

GEM equations (left) and Maxwell's equations (right):

\begin{align*}
\nabla \cdot \mathbf{E}_\text{g} &= -4 \pi G \rho  &  \nabla \cdot \mathbf{E} &= \frac{\rho_\text{em}}{\epsilon_0} \\
\nabla \cdot \mathbf{B}_\text{g} &= 0  &  \nabla \cdot \mathbf{B} &= 0 \\
\nabla \times \mathbf{E}_\text{g} &= -\frac{\partial \mathbf{B}_\text{g}}{\partial t}  &  \nabla \times \mathbf{E} &= -\frac{\partial \mathbf{B}}{\partial t} \\
\nabla \times \mathbf{B}_\text{g} &= -\frac{4 \pi G}{c^2}\,\mathbf{J} + \frac{1}{c^2}\frac{\partial \mathbf{E}_\text{g}}{\partial t}  &  \nabla \times \mathbf{B} &= \frac{1}{\epsilon_0 c^2}\,\mathbf{J}_\text{em} + \frac{1}{c^2}\frac{\partial \mathbf{E}}{\partial t}
\end{align*}

where E_g is the gravitoelectric (ordinary Newtonian) field, B_g is the gravitomagnetic field, E and B are the electric and magnetic fields, ρ is the mass density, J is the mass current density, ρ_em and J_em are the charge and current densities, G is the gravitational constant, ε₀ is the vacuum permittivity, and c is the speed of light.
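As a sanity check on the first GEM equation (my own sketch, with an illustrative Earth-like mass, not part of the quoted material), the flux of the ordinary Newtonian field E_g = −GM r̂/r² through a sphere enclosing the mass comes out to −4πGM, which is the integral form of ∇·E_g = −4πGρ.

import math

# Numerically integrate the flux of E_g = -G*M/r^2 (radial) over a sphere
# of radius R enclosing the mass, and compare with -4*pi*G*M.
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24       # kg, illustrative (roughly an Earth mass)
R = 7.0e6          # m, radius of the Gaussian sphere

n = 200
flux = 0.0
for i in range(n):                                   # integrate over polar angle
    theta = (i + 0.5) * math.pi / n
    ring_area = 2 * math.pi * R**2 * math.sin(theta) * (math.pi / n)
    flux += (-G * M / R**2) * ring_area              # field is radial, so E_g . dA is simple

print(f"numerical flux : {flux:.4e}")
print(f"-4*pi*G*M      : {-4 * math.pi * G * M:.4e}")   # the two agree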

Monday, February 22, 2010

Physicists Discover How to Entangle at High Temperatures

While I do not just like to echo in the world of information, it is important to me to see how we can use entanglement to give us information about quantum gravity. Is it possible?


Entanglement is the weird quantum process in which two objects share the same existence. So a measurement on one object immediately influences the other, no matter how far apart they may be.
Entanglement is a strange and fragile thing. Sneeze and it vanishes. The problem is that entanglement is destroyed by any interaction with the environment and these interactions are hard to prevent. So physicists have only ever been able to study and exploit entanglement in systems that do not interact easily with the environment, such as photons, or at temperatures close to absolute zero where the environment becomes more benign.

In fact, physicists believe that there is a fundamental limit to the thermal energies at which entanglement can be usefully exploited. And this limit is tiny, comparable to the very lowest temperatures.
Today, Fernando Galve at the University of the Balearic Islands in Spain and a few buddies show how this limit can be dramatically increased. The key behind their idea is the notion of a squeezed state.
In quantum mechanics, Heisenberg's uncertainty principle places important limits on how well certain pairs of complementary properties can be observed. For example, the more accurately you measure position, the less well you can determine momentum. The same is true of energy and time and also of the phase and amplitude of a quantum state.

Physicists have learnt how to play around with these complementary observables to optimise the way they make measurements. They've discovered that they can trade their knowledge of one complementary observable for an improvement in the other. See more here:Physicists Discover How to Entangle at High Temperatures
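The trade-off being described is the standard squeezed-state relation (my illustration, not a formula from the article): with squeezing parameter r, one quadrature is narrowed and its partner broadened, while the uncertainty product stays at the Heisenberg bound.

\begin{align*}
  \Delta X_1\,\Delta X_2 &\ge \tfrac{1}{4}
  && \text{(dimensionless quadratures; vacuum has } \Delta X_1 = \Delta X_2 = \tfrac{1}{2}\text{)} \\
  \Delta X_1 = \tfrac{1}{2}e^{-r}, \quad \Delta X_2 &= \tfrac{1}{2}e^{+r}
  && \text{(ideal squeezed vacuum, product unchanged)}
\end{align*}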

Saturday, February 20, 2010

Economy, as Science


A shift in paradigm can lead, via the theory-dependence of observation, to a difference in one's experiences of things and thus to a change in one's phenomenal world. On Thomas Kuhn
 
Control the information, you control the people?:) Again, as heartfelt and idealistic as you can become in your efforts, it's not enough to cry out in political verbiage, because you'll always end up with another person saying that it is only a political perspective. That it is the progressive conservative you don't like, and their leader? It's not enough.

So what do you do?

Do you succumb to the frustration that what is moving as a sub-culture working from the inside/out, is the idea that you can build a better consensus from what is moving the fabric of society to know that we can change the outcome as to what Canada shall become as well?

They (a conspirator's thought), as a force that is undermining the public perception while society did not grasp the full understanding of what has been done to them. Society having been cast to fighting at the "local level to advance a larger agenda?"

Does it not seem that once you occupy the mind in such close-quarter conflagrations, that mind has been circumvented from the larger picture?

Pain, and emotional turmoil does this.

Historically, once the fire has been started, like some phoenix a new cultural idealism manifests as to what the individual actually wants, when they are in full recognition that "as a force" they have moved forward in a democratic compunction, as a government in waiting, to advance the principles by which it can stand as the public mind.


However, the incommensurability thesis is not Kuhn's only positive philosophical thesis. Kuhn himself tells us that “The paradigm as shared example is the central element of what I now take to be the most novel and least understood aspect of [The Structure of Scientific Revolutions]” (1970a, 187). Nonetheless, Kuhn failed to develop the paradigm concept in his later work beyond an early application of its semantic aspects to the explanation of incommensurability. The explanation of scientific development in terms of paradigms was not only novel but radical too, insofar as it gives a naturalistic explanation of belief-change. Naturalism was not in the early 1960s the familiar part of philosophical landscape that it has subsequently become. Kuhn's explanation contrasted with explanations in terms of rules of method (or confirmation, falsification etc.) that most philosophers of science took to be constitutive of rationality. Furthermore, the relevant disciplines (psychology, cognitive science, artificial intelligence) were either insufficiently progressed to support Kuhn's contentions concerning paradigms, or were antithetical to them (in the case of classical AI). Now that naturalism has become an accepted component of philosophy, there has recently been interest in reassessing Kuhn's work in the light of developments in the relevant sciences, many of which provide corroboration for Kuhn's claim that science is driven by relations of perceived similarity and analogy to existing problems and their solutions (Nickles 2003b, Nersessian 2003). It may yet be that a characteristically Kuhnian thesis will play a prominent part in our understanding of science.
I would advance that the word "science" in the quote above be changed to "economy."

What paradigmatic solution has been advanced that such a thing can turn over the present equatorial function assigned to the public mind, that we will be in better control of our destinies as Canadians?

Precursors to such changes are revolutions in the thought patterns established as functionary pundits of money-oriented societies. They have become "fixed to a particular agenda." Rote systems assumed and brought up in, extolled as to the highest moral obligation: to live well, and on the way, fix ourselves to debt-written obligations that shall soon overcome the sensibility of what it shall take to live?

Forced upon them is the understanding that we had become a slave to our reason, and a slave to a master disguised as what is healthy and knows no boundaries? A capitalistic dream.

Update:

Money Supply and Energy: Is The Economy Inherently Unstable?

 

Tuesday, February 16, 2010

Article From New York Times and More




Brookhaven National Laboratory

HOT: A computer rendition of 4-trillion-degree Celsius quark-gluon plasma created in a demonstration of what scientists suspect shaped cosmic history.

In Brookhaven Collider, Scientists Briefly Break a Law of Nature

The Brookhaven scientists and their colleagues discussed their latest results from RHIC in talks and a news conference at a meeting of the American Physical Society Monday in Washington, and in a pair of papers submitted to Physical Review Letters. “This is a view of what the world was like at 2 microseconds,” said Jack Sandweiss of Yale, a member of the Brookhaven team, calling it, “a seething cauldron.”

Among other things, the group announced it had succeeded in measuring the temperature of the quark-gluon plasma as 4 trillion degrees Celsius, “by far the hottest matter ever made,” Dr. Vigdor said. That is 250,000 times hotter than the center of the Sun and well above the temperature at which theorists calculate that protons and neutrons should melt, but the quark-gluon plasma does not act the way theorists had predicted.
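A quick back-of-the-envelope check of that comparison (the solar-core temperature below is a commonly quoted value I am assuming, not a number from the article):

T_QGP_K = 4.0e12          # quoted RHIC plasma temperature, ~4 trillion degrees Celsius ~ 4e12 K
T_SUN_CORE_K = 1.6e7      # commonly quoted solar core temperature (assumed value)

print(f"ratio: {T_QGP_K / T_SUN_CORE_K:,.0f}")   # roughly 250,000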

Instead of behaving like a perfect gas, in which every quark goes its own way independent of the others, the plasma seemed to act like a liquid. “It was a very big surprise,” Dr. Vigdor said, when it was discovered in 2005. Since then, however, theorists have revisited their calculations and found that the quark soup can be either a liquid or a gas, depending on the temperature, he explained. “This is not your father’s quark-gluon plasma,” said Barbara V. Jacak, of the State University at Stony Brook, speaking for the team that made the new measurements.

It is now thought that the plasma would have to be a million times more energetic to become a perfect gas. That is beyond the reach of any conceivable laboratory experiment, but the experiments colliding lead nuclei in the Large Hadron Collider outside Geneva next winter should reach energies high enough to see some evolution from a liquid to a gas.
See more at above link.

***

Violating Parity with Quarks and Gluons
by Sean Carroll of Cosmic Variance
This new result from RHIC doesn’t change that state of affairs, but shows how quarks and gluons can violate parity spontaneously if they are in the right environment — namely, a hot plasma with a magnetic field.

So, okay, no new laws of physics. Just a much better understanding of how the existing ones work! Which is most of what science does, after all.

***

Quark–gluon plasma

From Wikipedia, the free encyclopedia

A QGP is formed at the collision point of two relativistically accelerated gold ions in the center of the STAR detector at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.


A quark-gluon plasma (QGP) or quark soup[1] is a phase of quantum chromodynamics (QCD) which exists at extremely high temperature and/or density. This phase consists of (almost) free quarks and gluons, which are the basic building blocks of matter. Experiments at CERN's Super Proton Synchrotron (SPS) first tried to create the QGP in the 1980s and 1990s: the results led CERN to announce indirect evidence for a "new state of matter"[2] in 2000. Current experiments at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC) are continuing this effort.[3] Three new experiments running on CERN's Large Hadron Collider (LHC), ALICE,[4] ATLAS and CMS, will continue studying properties of QGP.


General introduction

The quark-gluon plasma contains quarks and gluons, just as normal (baryonic) matter does. The difference between these two phases of QCD is that in normal matter each quark either pairs up with an anti-quark to form a meson or joins with two other quarks to form a baryon (such as the proton and the neutron). In the QGP, by contrast, these mesons and baryons lose their identities and dissolve into a fluid of quarks and gluons.[5] In normal matter quarks are confined; in the QGP quarks are deconfined.
Although the experimental high temperatures and densities predicted as producing a quark-gluon plasma have been realized in the laboratory, the resulting matter does not behave as a quasi-ideal state of free quarks and gluons, but, rather, as an almost perfect dense fluid.[6] Actually, the fact that the quark-gluon plasma will not yet be "free" at temperatures realized at present accelerators had already been predicted in 1984 [7] as a consequence of the remnant effects of confinement.

Why this is referred to as "plasma"

A plasma is matter in which charges are screened due to the presence of other mobile charges; for example: Coulomb's Law is modified to yield a distance-dependent charge. In a QGP, the color charge of the quarks and gluons is screened. The QGP has other analogies with a normal plasma. There are also dissimilarities because the color charge is non-abelian, whereas the electric charge is abelian. Outside a finite volume of QGP the color electric field is not screened, so that volume of QGP must still be color-neutral. It will therefore, like a nucleus, have integer electric charge.
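For comparison, in an ordinary (abelian) plasma the "distance-dependent charge" the excerpt mentions is usually written as a Debye-screened potential; the expression below is the standard form, shown here as context rather than as a formula from the post.

\[
  V(r) \;=\; \frac{q}{4\pi\epsilon_0\, r}\, e^{-r/\lambda_D}
  \qquad\Longrightarrow\qquad
  q_{\text{eff}}(r) \;=\; q\, e^{-r/\lambda_D},
\]

so the charge seen at distance r falls off with the Debye screening length λ_D; in a QGP it is the color charge that is screened in an analogous way.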

How the QGP is studied theoretically

One consequence of this difference is that the color charge is too large for perturbative computations, which are the mainstay of QED. As a result, the main theoretical tool to explore the theory of the QGP is lattice gauge theory. The transition temperature (approximately 175 MeV) was first predicted by lattice gauge theory. Since then lattice gauge theory has been used to predict many other properties of this kind of matter. The AdS/CFT correspondence is an interesting new conjecture allowing insights into the QGP.

How it is created in the lab

The QGP can be created by heating matter up to a temperature of 2×10¹² kelvin, which amounts to 175 MeV per particle. This can be accomplished by colliding two large nuclei at high energy (note that 175 MeV is not the energy of the colliding beam). Lead and gold nuclei have been used for such collisions at CERN SPS and BNL RHIC, respectively. The nuclei are accelerated to ultrarelativistic speeds and slammed into each other while Lorentz contracted. They largely pass through each other, but a resulting hot volume called a fireball is created after the collision. Once created, this fireball is expected to expand under its own pressure, and cool while expanding. By carefully studying this flow, experimentalists hope to put the theory to test.
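A quick unit check (my own sketch) of the statement that 2×10¹² kelvin amounts to about 175 MeV per particle, using the Boltzmann constant expressed in MeV per kelvin:

K_B_MEV_PER_K = 8.617e-11     # Boltzmann constant, MeV per kelvin

def kelvin_to_mev(temperature_k):
    """Thermal energy k_B * T in MeV."""
    return K_B_MEV_PER_K * temperature_k

def mev_to_kelvin(energy_mev):
    """Temperature corresponding to a thermal energy in MeV."""
    return energy_mev / K_B_MEV_PER_K

print(f"{kelvin_to_mev(2e12):.0f} MeV")      # ~172 MeV, consistent with the quoted 175 MeV
print(f"{mev_to_kelvin(175):.2e} K")         # ~2.0e12 K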

How the QGP fits into the general scheme of physics

QCD is one part of the modern theory of particle physics called the Standard Model. Other parts of this theory deal with electroweak interactions and neutrinos. The theory of electrodynamics has been tested and found correct to a few parts in a trillion. The theory of weak interactions has been tested and found correct to a few parts in a thousand. Perturbative aspects of QCD have been tested to a few percent. In contrast, non-perturbative aspects of QCD have barely been tested. The study of the QGP is part of this effort to consolidate the grand theory of particle physics.
The study of the QGP is also a testing ground for finite temperature field theory, a branch of theoretical physics which seeks to understand particle physics under conditions of high temperature. Such studies are important to understand the early evolution of our universe: the first hundred microseconds or so. While this may seem esoteric, this is crucial to the physics goals of a new generation of observations of the universe (WMAP and its successors). It is also of relevance to Grand Unification Theories or 'GUTs' which seek to unify the four fundamental forces of nature.

Expected properties

Thermodynamics

The cross-over temperature from the normal hadronic to the QGP phase is about 175 MeV, corresponding to an energy density of a little less than 1 GeV/fm³. For relativistic matter, pressure and temperature are not independent variables, so the equation of state is a relation between the energy density and the pressure. This has been found through lattice computations, and compared to both perturbation theory and string theory. This is still a matter of active research. Response functions such as the specific heat and various quark number susceptibilities are currently being computed.

Flow

The equation of state is an important input into the flow equations. The speed of sound is currently under investigation in lattice computations. The mean free path of quarks and gluons has been computed using perturbation theory as well as string theory. Lattice computations have been slower here, although the first computations of transport coefficients have recently been concluded. These indicate that the mean free time of quarks and gluons in the QGP may be comparable to the average interparticle spacing: hence the QGP is a liquid as far as its flow properties go. This is very much an active field of research, and these conclusions may evolve rapidly. The incorporation of dissipative phenomena into hydrodynamics is another recent development that is still in an active stage.

Excitation spectrum

Does the QGP really contain (almost) free quarks and gluons? The study of thermodynamic and flow properties would indicate that this is an over-simplification. Many ideas are currently being evolved and will be put to test in the near future. It has been hypothesized recently that some mesons built from heavy quarks (such as the charm quark) do not dissolve until the temperature reaches about 350 MeV. This has led to speculation that many other kinds of bound states may exist in the plasma. Some static properties of the plasma (similar to the Debye screening length) constrain the excitation spectrum.

Experimental situation

Those aspects of the QGP which are easiest to compute are not the ones which are the easiest to probe in experiments. While the balance of evidence points towards the QGP being the origin of the detailed properties of the fireball produced in the RHIC, this is the main barrier which prevents experimentalists from declaring a sighting of the QGP. For a summary see 2005 RHIC Assessment.
The important classes of experimental observations are

Formation of quark matter

In April 2005, formation of quark matter was tentatively confirmed by results obtained at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC). The consensus of the four RHIC research groups was that they had created a quark-gluon liquid of very low viscosity. However, contrary to what was at that time still the widespread assumption, it is yet unknown from theoretical predictions whether the QCD "plasma", especially close to the transition temperature, should behave like a gas or liquid[8]. Authors favoring the weakly interacting interpretation derive their assumptions from the lattice QCD calculation, where the entropy density of quark-gluon plasma approaches the weakly interacting limit. However, since both the energy density and correlations show significant deviation from the weakly interacting limit, it has been pointed out by many authors that there is in fact no reason to assume a QCD "plasma" close to the transition point should be weakly interacting, like electromagnetic plasma (see, e.g., [9]).
