Monday, March 22, 2010

A first look at the Earth interior from the Gran Sasso underground laboratory

The Gran Sasso National Laboratory (LNGS) is one of four INFN national laboratories.
It is the largest underground laboratory in the world for experiments in particle physics, particle astrophysics and nuclear astrophysics. It is used as a worldwide facility by scientists, presently 750 in number, from 22 different countries, working at about 15 experiments in their different phases.

It is located between the towns of L'Aquila and Teramo, about 120 km from Rome.
The underground facilities are located on one side of the ten-kilometre-long freeway tunnel crossing the Gran Sasso Mountain. They consist of three large experimental halls, each about 100 m long, 20 m wide and 18 m high, plus service tunnels, for a total volume of about 180,000 cubic metres.
***
Slide by Takaaki Kajita
In June 1998 the Super-Kamiokande collaboration revealed its eagerly anticipated results on neutrino interactions to 400 physicists at the Neutrino ’98 conference in Takayama, Japan. A hearty round of applause marked the end of a memorable presentation by Takaaki Kajita of the University of Tokyo that included this slide. He presented strong evidence that neutrinos behave differently than predicted by the Standard Model of particles: The three known types of neutrinos apparently transform into each other, a phenomenon known as oscillation.

Super-K’s detector, located 1000 meters underground, had collected data on neutrinos produced by a steady stream of cosmic rays hitting the Earth’s atmosphere. The data allowed scientists to distinguish between two types of atmospheric neutrinos: those that produce an electron when interacting with matter (e-like), and those that produce a muon (μ-like). The graph in this slide shows the direction the neutrinos came from (represented by cos theta, on the x-axis); the number of neutrinos observed (points marked with crosses); and the number expected according to the Standard Model (shaded boxes).

In the case of the μ-like neutrinos, the number coming straight down from the sky into the detector agreed well with theoretical prediction. But the number coming up through the ground was much lower than anticipated. These neutrinos, which originated in the atmosphere on the opposite side of the globe, travelled 13,000 kilometers through the Earth before reaching the detector. The long journey gave a significant fraction of them enough time to “disappear”—shedding their μ-like appearance by oscillating into a different type of neutrino. While earlier experiments had pointed to the possibility of neutrino oscillations, the disappearance of μ-like neutrinos in the Super-K experiment provided solid evidence.
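The scale of that deficit can be sketched with a minimal two-flavor oscillation formula. The oscillation parameters and the 1 GeV reference energy below are rough present-day values, assumed here only for illustration; they were not known in 1998 and do not come from the slide itself.

```python
# Two-flavor survival probability P(nu_mu -> nu_mu), with
# dm2 in eV^2, L in km and E in GeV (standard approximation).
import math

def survival(L_km, E_GeV, dm2=2.5e-3, sin2_2theta=1.0):
    phase = 1.267 * dm2 * L_km / E_GeV
    return 1.0 - sin2_2theta * math.sin(phase) ** 2

# Downward-going neutrinos cross ~15 km of atmosphere;
# upward-going ones cross the Earth, roughly 13,000 km.
print("downward (L ~ 15 km), E = 1 GeV:", round(survival(15, 1.0), 3))

energies = [0.5 + 0.1 * i for i in range(26)]          # 0.5 - 3.0 GeV
upward = sum(survival(12800, e) for e in energies) / len(energies)
print("upward (L ~ 12,800 km), averaged over 0.5-3 GeV:", round(upward, 3))
# The downward flux is essentially unchanged (P ~ 1), while the upward
# flux averages to roughly one half -- the deficit visible in the
# mu-like panel of the slide.
```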
***



The Borexino Collaboration announced the observation of geo-neutrinos at the underground Gran Sasso National Laboratory of the Italian Institute for Nuclear Physics (INFN), Italy. The data reveal, for the first time, a definite anti-neutrino signal, well above background, with the energy spectrum expected from radioactive decays of U and Th in the Earth.

The International Borexino Collaboration, with institutions from Italy, US, Germany, Russia, Poland and France, operates a 300-ton liquid-scintillator detector designed to observe and study low-energy solar neutrinos. The low background of the Borexino detector has been key to the detection of geo-neutrinos. Technologies developed by Borexino Collaborators have achieved very low background levels. The central core of the Borexino scintillator is now the lowest background detector available for these observations. The ultra-low background of Borexino was developed to make the first measurements of solar neutrinos below 1 MeV and has now produced this first, firm observation of geo-neutrinos.

Geo-neutrinos are anti-neutrinos produced in radioactive decays of naturally occurring Uranium, Thorium, Potassium, and Rubidium. Decays from these radioactive elements are believed to contribute a significant but unknown fraction of the heat generated inside our planet. The heat generates convective movements in the Earth's mantle that influence volcanic activity and tectonic plate movements inducing seismic activity, and the geo-dynamo that creates the Earth's magnetic field.
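For a sense of the numbers behind that "significant but unknown fraction," here is a back-of-the-envelope estimate of the radiogenic heat. The specific heat productions are commonly quoted values; the bulk-silicate-Earth abundances and mass are rounded assumptions, so the total is only indicative of the roughly 20 TW usually attributed to U, Th and K.

```python
# Rough radiogenic heat budget that geo-neutrino measurements probe.
heat_per_kg = {"U": 98.1e-6, "Th": 26.4e-6, "K": 3.5e-9}   # W per kg of element
abundance   = {"U": 20e-9,   "Th": 80e-9,   "K": 240e-6}   # assumed mass fractions, bulk silicate Earth
M_bse = 4.0e24                                              # kg, assumed mass of the silicate Earth

total = 0.0
for el in heat_per_kg:
    power = abundance[el] * M_bse * heat_per_kg[el]
    total += power
    print(f"{el}: {power / 1e12:5.1f} TW")
print(f"total radiogenic heat: ~{total / 1e12:.0f} TW "
      "(of the ~40-50 TW total surface heat flow)")
```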

***
Links borrowed from here

Browsing experiments
 • auger (7 photos)
 • borexino (6 photos)
 • cobra (6 photos)
 • cresst (5 photos)
 • cryostem (2 photos)
 • cuore (5 photos)
 • cuoricino (3 photos)
 • dama (9 photos)
 • eastop (4 photos)
 • ermes (2 photos)
 • genius (3 photos)
 • gerda (1 photos)
 • gigs (3 photos)
 • gno (6 photos)
 • hdms (2 photos)
 • hmbb (1 photos)
 • icarus (19 photos)
 • lisa (1 photos)
 • luna (5 photos)
 • lvd (4 photos)
 • macro (4 photos)
 • mibeta (1 photos)
 • opera (26 photos)
 • tellus (1 photos)
 • underseis (8 photos)
 • vip (1 photos)
 • warp (10 photos)
 • xenon (4 photos)
 • zoo (3 photos)


*** 
See Also:

The Law of Octaves


Dmitri Ivanovich Mendeleev (also romanized Mendeleyev or Mendeleef; Russian: Дми́трий Ива́нович Менделе́ев) (8 February [O.S. 27 January] 1834 – 2 February [O.S. 20 January] 1907) was a Russian chemist and inventor. He is credited as being the creator of the first version of the periodic table of elements. Using the table, he predicted the properties of elements yet to be discovered.

This post was inspired by the "Poll: Do you believe in extraterrestrial life?"

I was thinking about Hoyle and CNO and of course about Lee Smolin. As I get older it is harder to retain all that preceded each discussion, so linking helps to refresh.

Carbon is the 15th most abundant element in the Earth's crust, and the fourth most abundant element in the universe by mass after hydrogen, helium, and oxygen. It is present in all known lifeforms, and in the human body carbon is the second most abundant element by mass (about 18.5%) after oxygen.[14] This abundance, together with the unique diversity of organic compounds and their unusual polymer-forming ability at the temperatures commonly encountered on Earth, makes this element the chemical basis of all known life. See: Carbon

It was necessary to recall the links from one to the other, to show how one's perception "about Carbon was drawn" into the discussion about what life in the cosmos is based on.



I am partial to the allotropic expression, on a physical scale, of something that was understood as the Law of Octaves.

Carbon forms the backbone of biology for all life on Earth. Complex molecules are made up of carbon bonded with other elements, especially oxygen, hydrogen and nitrogen. It is these elements that living organisms need, among others, and carbon is able to bond with all of these because of its four valence electrons. Since no life has been observed that is not carbon-based, it is sometimes assumed in astrobiology that life elsewhere in the universe will also be carbon-based. This assumption is referred to by critics as carbon chauvinism, as it may be possible for life to form that is not based on carbon, even though it has never been observed. See: Carbon-based life


For sure, each round of discussion on the topic leads to... and there are ideas about which one can ask questions. I am in no way advocating anything here in terms of the Anthropic Principle (AP), other than to remember discussions about this very poll question before.

The triple alpha process is highly dependent on carbon-12 having a resonance with the same energy as helium-4 and beryllium-8, and before 1952 no such energy level was known. It was astrophysicist Fred Hoyle who used the fact that carbon-12 is so abundant in the universe (and that our existence depends upon it - the Anthropic Principle) as evidence for the existence of the carbon-12 resonance. Fred suggested the idea to nuclear physicist Willy Fowler, who conceded that it was possible that this energy level had been missed in previous work on carbon-12. After a brief undertaking by his research group, they discovered a resonance near to 7.65 MeV. See: Triple-alpha process
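The number Hoyle was after can be checked with a few lines of arithmetic: the 7.65 MeV level sits just above the energy of three free alpha particles. A quick sketch, using standard atomic masses:

```python
# How far the 7.65 MeV level of carbon-12 sits above the 3-alpha threshold.
u_to_MeV = 931.494   # MeV per atomic mass unit
m_He4 = 4.002602     # u, helium-4 atomic mass
m_C12 = 12.0         # u, carbon-12 (exact by definition)

threshold = (3 * m_He4 - m_C12) * u_to_MeV   # energy of 3 alphas above the C-12 ground state
hoyle_state = 7.654                          # MeV, the resonance found near 7.65 MeV

print(f"3-alpha threshold: {threshold:.3f} MeV above the C-12 ground state")
print(f"Hoyle state sits {hoyle_state - threshold:.3f} MeV above that threshold")
# -> roughly 7.27 MeV and 0.38 MeV: the resonance lies just above
#    threshold, which is what makes the triple-alpha process work.
```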

*** 
 

Mass spectrometers are analytical instruments that determine atomic and molecular masses with great accuracy. Low-pressure vapors of elements or molecules are hit by a beam of rapidly moving electrons. The collision knocks an electron off the sample atom or molecule, leaving it positively charged.
These newly-formed ions are accelerated out of the ionization chamber by an electric field. The speeds to which the ions can be accelerated by the electric field are determined by their masses. Lighter ions can go faster than heavier ones.


Ion's path bent by external magnetic field (Courtesy: McREL)
The positively charged ions, moving through an externally applied magnetic field, experience a magnetic (Lorentz) force. The net result is that the trajectory of a charged particle is curved to an extent that depends on its speed (which, for a fixed accelerating voltage, is determined by its mass). When a beam containing a mixture of isotopes of different masses falls on a photographic plate, the different isotopes converge at different points, corresponding to the different radii of their semicircular paths.
The mathematical equation that describes this phenomenon is m/e = H²r²/2V, where m is the mass of the ion, e is the charge of the ion, H is the magnetic field strength, r is the radius of the semicircle, and V is the accelerating potential.
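As a sanity check of that relation, here is a short numerical sketch. The ion mass, accelerating voltage and field strength are arbitrary example values, not taken from any particular instrument.

```python
# Illustrating m/e = H^2 r^2 / (2V), solved for the radius r = sqrt(2Vm/e) / H.
import math

u = 1.6605e-27        # kg, atomic mass unit
e = 1.602e-19         # C, elementary charge

m = 40 * u            # a singly charged ion of mass 40 u (e.g. argon), assumed
V = 2000.0            # V, accelerating potential (assumed)
H = 0.5               # T, magnetic field strength (assumed)

r = math.sqrt(2 * V * m / e) / H
print(f"radius of the semicircular path: {r * 100:.1f} cm")

# Inverting the same relation recovers the mass-to-charge ratio from a
# measured radius, which is how the plate positions are read off:
m_over_e = H**2 * r**2 / (2 * V)
print(f"recovered m/e: {m_over_e:.3e} kg/C")
```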
***
See Also:

Friday, March 19, 2010

Neutrinoless Double Beta Decay

“You don’t see what you’re seeing until you see it,” Dr. Thurston said, “but when you do see it, it lets you see many other things.” - Elusive Proof, Elusive Prover: A New Mathematical Mystery


The Enriched Xenon Observatory is an experiment in particle physics aiming to detect "neutrino-less double beta decay" using large amounts of xenon isotopically enriched in the isotope 136. A 200-kg detector using liquid Xe is currently being installed at the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico. Many research and development efforts are underway for a ton-scale experiment, with the goal of probing new physics and the mass of the neutrino. The Enriched Xenon Observatory

***
Feynman diagram of neutrinoless double-beta decay, with two neutrons decaying to two protons. The only emitted products in this process are two electrons, which can only occur if the neutrino and antineutrino are the same particle (i.e. Majorana neutrinos) so the same neutrino can be emitted and absorbed within the nucleus. In conventional double-beta decay, two antineutrinos - one arising from each W vertex - are emitted from the nucleus, in addition to the two electrons. The detection of neutrinoless double-beta decay is thus a sensitive test of whether neutrinos are Majorana particles.
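To connect this to detector scale, the standard relation between the decay rate and the effective Majorana mass can be evaluated numerically. The phase-space factor and nuclear matrix element below are representative literature-style values for Xe-136, assumed here only for an order-of-magnitude sketch, not taken from the EXO collaboration.

```python
# Rough map from effective Majorana mass to 0vbb half-life:
#   1 / T_half = G0v * |M0v|^2 * (m_bb / m_e)^2
G0v = 1.5e-14      # 1/yr, phase-space factor (assumed representative value)
M0v = 3.0          # dimensionless nuclear matrix element (assumed value)
m_e = 0.511e6      # eV, electron mass

def half_life_years(m_bb_eV):
    return 1.0 / (G0v * M0v**2 * (m_bb_eV / m_e)**2)

for m_bb in (0.05, 0.1, 0.5):   # effective Majorana mass in eV
    print(f"m_bb = {m_bb:4.2f} eV  ->  T1/2 ~ {half_life_years(m_bb):.1e} yr")
# Half-lives of order 10^25 to 10^27 years: hence the need for ton-scale,
# ultra-low-background detectors such as EXO.
```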


Neutrinoless double-beta decay experiments

Numerous experiments have been carried out to search for neutrinoless double-beta decay. Some recent and proposed future experiments include:

See: Direct Dark Matter Detection


 See Also: South Dakota's LUX will join the dark matter wars

Tuesday, March 16, 2010

Frogs, Foam and Fuel: UC Researchers Convert Solar Energy to Sugars



Illustration by Megan Gundrum, fifth-year DAAP student

For decades, farmers have been trying to find ways to get more energy out of the sun.

In natural photosynthesis, plants take in solar energy and carbon dioxide and then convert it to oxygen and sugars. The oxygen is released to the air and the sugars are dispersed throughout the plant — like that sweet corn we look for in the summer. Unfortunately, the allocation of light energy into products we use is not as efficient as we would like. Now engineering researchers at the University of Cincinnati are doing something about that.
See: Frogs, Foam and Fuel: UC Researchers Convert Solar Energy to Sugars

***

I get very excited when I see ideas like this.


See: When It Comes to Photosynthesis, Plants Perform Quantum Computation
Plants soak up some of the 10^17 joules of solar energy that bathe Earth each second, harvesting as much as 95 percent of it from the light they absorb. The transformation of sunlight into carbohydrates takes place in one million billionths of a second, preventing much of that energy from dissipating as heat. But exactly how plants manage this nearly instantaneous trick has remained elusive. Now biophysicists at the University of California, Berkeley, have shown that plants use the basic principle of quantum computing—the exploration of a multiplicity of different answers at the same time—to achieve near-perfect efficiency.

Biophysicist Gregory Engel and his colleagues cooled a green sulfur bacterium—Chlorobium tepidum, one of the oldest photosynthesizers on the planet—to 77 kelvins [–321 degrees Fahrenheit] and then pulsed it with extremely short bursts of laser light. By manipulating these pulses, the researchers could track the flow of energy through the bacterium's photosynthetic system. "We always thought of it as hopping through the system, the same way that you or I might run through a maze of bushes," Engel explains. "But, instead of coming to an intersection and going left or right, it can actually go in both directions at once and explore many different paths most efficiently."

In other words, plants are employing the basic principles of quantum mechanics to transfer energy from chromophore (photosynthetic molecule) to chromophore until it reaches the so-called reaction center where photosynthesis, as it is classically defined, takes place. The particles of energy are behaving like waves. "We see very strong evidence for a wavelike motion of energy through these photosynthetic complexes," Engel says. The results appear in the current issue of Nature.

QUANTUM CHLOROPHYLL: Sunlight triggers wave-like motion in green chlorophyll, embedded in a protein structure, depicted in gray here, that guides its function. GREGORY ENGEL

Employing this process allows the near-perfect efficiency of plants in harvesting energy from sunlight and is likely to be used by all of them, Engel says. It might also be copied usefully by researchers attempting to create artificial photosynthesis, such as that in photovoltaic cells for generating electricity. "This can be a much more efficient energy transfer than a classical hopping one," Engel says. "Exactly how to implement that is a very difficult question."

It also remains unclear exactly how a plant's structure permits this quantum effect to take place. "[The protein structure] of the plant has to be tuned to allow transfer among chromophores but not to allow transfers into [heat]," Engel says. "How that tuning works and how it is controlled, we don't know." Inside every spring leaf is a system capable of performing a speedy and efficient quantum computation, and therein lies the key to much of the energy on Earth.

***

Wednesday, March 03, 2010

Neutron interferometer

Lubos Motl:
You have completely misunderstood the neutron gravitational interference experiment. They showed that the force acting on the neutron is simply not negligible. Quite on the contrary, these interference experiments could measure and did measure the gravitational acceleration - and even the tidal forces - on the phase shift of the neutron's wave function. It's the very point of these experiments.

So whatever theory predicts that such forces are "negligible" is instantly falsified.
***
From Wikipedia
In physics, a neutron interferometer is an interferometer capable of diffracting neutrons, allowing the wave-like nature of neutrons, and other related phenomena, to be explored.
Interferometry inherently depends on the wave nature of the object. As pointed out by de Broglie in his PhD thesis, particles, including neutrons, can behave like waves (the so-called wave-particle duality, now explained in the general framework of quantum mechanics). The wave functions of the individual interferometer paths are created and recombined coherently, which requires the application of the dynamical theory of diffraction. Neutron interferometers are the counterpart of X-ray interferometers and are used to study quantities or effects related to thermal neutron radiation.

Neutron interferometers are used to determine minute quantum-mechanical effects on the neutron wave, such as studies of the
  • Aharonov-Bohm effect
  • gravity acting on an elementary particle, the neutron
  • rotation of the earth acting on a quantum system
Like X-ray interferometers, neutron interferometers are typically carved from a single large crystal of silicon, often 10 to 30 or more centimeters in diameter and 20 to 60 or more centimeters in length. Modern semiconductor technology allows large single-crystal silicon boules to be easily grown. Since the boule is a single crystal, the atoms in the boule are precisely aligned, to within small fractions of a nanometer or an angstrom, over the entire boule. The interferometer is created by carving away all but three slices of silicon, held in perfect alignment by a base. Neutrons impinge on the first slice, where, by diffraction from the crystalline lattice, they separate into two beams. At the second slice, they are diffracted again, with two beams continuing on to the third slice. At the third slice, the beams recombine, interfering constructively or destructively, completing the interferometer. Without the precise, angstrom-level alignment of the three slices, the interference results would not be meaningful.

More recently, a neutron interferometer for cold and ultracold neutrons was designed and successfully operated. In this case the neutron-optical components are three artificial gratings produced holographically, i.e., by means of a light-optic two-wave interference setup illuminating a photo-neutron-refractive polymer.

References

V. F. Sears, Neutron Optics, Oxford University Press (1998).
H. Rauch and S. A. Werner, Neutron Interferometry, Clarendon Press, Oxford (2000).

***

On the Origin of Gravity and the Laws of Newton

Starting from first principles and general assumptions Newton's law of gravitation is shown to arise naturally and unavoidably in a theory in which space is emergent through a holographic scenario. Gravity is explained as an entropic force caused by changes in the information associated with the positions of material bodies. A relativistic generalization of the presented arguments directly leads to the Einstein equations. When space is emergent even Newton's law of inertia needs to be explained. The equivalence principle leads us to conclude that it is actually this law of inertia whose origin is entropic.
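A compressed sketch of how the argument recovers Newton's law (my own condensation of the ingredients named in the abstract, namely the holographic screen, equipartition, and an entropy change; it is not a quotation from the paper):

```latex
% Entropy change when a mass m approaches a holographic screen of radius R:
\Delta S = 2\pi k_B \,\frac{m c}{\hbar}\,\Delta x .
% The screen of area A = 4\pi R^2 carries N bits, and the enclosed
% energy E = M c^2 is distributed over them by equipartition:
N = \frac{A c^3}{G \hbar}, \qquad
M c^2 = \tfrac{1}{2} N k_B T
\;\Rightarrow\;
T = \frac{G \hbar M}{2\pi k_B c R^2} .
% The entropic force F \,\Delta x = T \,\Delta S then gives Newton's law:
F = T \,\frac{\Delta S}{\Delta x} = \frac{G M m}{R^2} .
```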

***


 


The Neutron Interferometry and Optics Facility (NIOF) located in the NIST Center for Neutron Research Guide Hall is one of the world's premier user facilities for neutron interferometry and related neutron optical measurements. A neutron interferometer (NI) splits, then recombines neutron waves. This gives the NI its unique ability to experimentally access the phase of neutron waves. Phase measurements are used to study the magnetic, nuclear, and structural properties of materials, as well as fundamental questions in quantum physics. Related, innovative neutron optical techniques for use in condensed matter and materials science research are being developed.
 ***
 


Neutron Interferometer. A three blade neutron interferometer, machined from a single crystal silicon ingot is shown in two views. A monoenergetic neutron beam is split by the first blade and recombined in the third blade. If a sample is introduced in one of the paths, a phase difference in the wave function is produced, and interference between the recombined beams causes count rate shifts of opposite sign in the two detectors.
The Neutron Interferometer Facility in the Cold Neutron Guide Hall became operational in April 1994. It became available as a National User Facility in September 1996. Phase contrast of up to 88 percent and phase stability of better than five milliradians per day were observed. These performance indications are primarily the result of the advanced vibration isolation and environmental control systems. The interferometer operates inside a double walled enclosure, with the inner room built on a 40,000 kg slab which floats on pneumatic pads above an isolated foundation.
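To put a number on the "not negligible" point in the quote at the top of this post, here is an order-of-magnitude estimate of the gravitationally induced phase shift in such an interferometer (the classic COW-type measurement). The wavelength and enclosed beam area are assumed typical values, not the actual parameters of the NIST instrument.

```python
# Gravitational phase shift in a neutron interferometer,
# phi = 2*pi * m_n^2 * g * lambda * A / h^2  (interferometer tilted fully vertical).
import math

m_n = 1.675e-27      # kg, neutron mass
g = 9.81             # m/s^2, gravitational acceleration
h = 6.626e-34        # J s, Planck constant
lam = 1.9e-10        # m, thermal-neutron wavelength (assumed)
A = 8e-4             # m^2, area enclosed by the two beam paths (assumed)

phase = 2 * math.pi * m_n**2 * g * lam * A / h**2
print(f"gravitational phase shift ~ {phase:.0f} radians")
# ~60 radians with these assumed numbers: tens of radians,
# which is anything but negligible.
```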



See Also: Gravity is Entropy is Gravity is...

Sunday, February 28, 2010

Trivium: Three Roads


 
Logic is the art of thinking; grammar, the art of inventing symbols and combining them to express thought; and rhetoric, the art of communicating thought from one mind to another, the adaptation of language to circumstance. - Sister Miriam Joseph

 Painting by Cesare Maccari (1840-1919), Cicero Denounces Catiline.

In medieval universities, the trivium comprised the three subjects taught first: grammar, logic, and rhetoric. The word is a Latin term meaning “the three ways” or “the three roads” forming the foundation of a medieval liberal arts education. This study was preparatory for the quadrivium. The trivium is implicit in the De nuptiis of Martianus Capella, although the term was not used until the Carolingian era when it was coined in imitation of the earlier quadrivium.[1] It was later systematized in part by Petrus Ramus as an essential part of Ramism.


Formal grammar

A formal grammar (sometimes simply called a grammar) is a set of rules of a specific kind, for forming strings in a formal language. The rules describe how to form strings from the language's alphabet that are valid according to the language's syntax. A grammar does not describe the meaning of the strings or what can be done with them in whatever context —only their form.

Formal language theory, the discipline which studies formal grammars and languages, is a branch of applied mathematics. Its applications are found in theoretical computer science, theoretical linguistics, formal semantics, mathematical logic, and other areas.

A formal grammar is a set of rules for rewriting strings, along with a "start symbol" from which rewriting must start. Therefore, a grammar is usually thought of as a language generator. However, it can also sometimes be used as the basis for a "recognizer"—a function in computing that determines whether a given string belongs to the language or is grammatically incorrect. To describe such recognizers, formal language theory uses separate formalisms, known as automata theory. One of the interesting results of automata theory is that it is not possible to design a recognizer for certain formal languages.

Parsing is the process of recognizing an utterance (a string in natural languages) by breaking it down to a set of symbols and analyzing each one against the grammar of the language. Most languages have the meanings of their utterances structured according to their syntax—a practice known as compositional semantics. As a result, the first step to describing the meaning of an utterance in language is to break it down part by part and look at its analyzed form (known as its parse tree in computer science, and as its deep structure in generative grammar).
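To make the generator/recognizer distinction concrete, here is a toy recursive-descent recognizer in Python for the grammar S -> 'a' S 'b' | empty, which generates the strings a^n b^n. The grammar and code are my own illustration, not something taken from the quoted text.

```python
# A minimal recognizer built directly from a context-free grammar:
#   S -> 'a' S 'b'  |  (empty)
# It accepts exactly the strings a^n b^n, n >= 0.

def recognize(s: str) -> bool:
    """Return True if s belongs to the language generated by S."""
    def parse_S(i: int) -> int:
        # Try the production S -> 'a' S 'b'
        if i < len(s) and s[i] == 'a':
            j = parse_S(i + 1)
            if j < len(s) and s[j] == 'b':
                return j + 1
        # Fall back to the production S -> empty
        return i
    return parse_S(0) == len(s)

if __name__ == "__main__":
    for w in ["", "ab", "aabb", "aab", "ba"]:
        print(repr(w), recognize(w))
    # True for "", "ab", "aabb"; False for "aab", "ba".
```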
Logic

As a discipline, logic dates back to Aristotle, who established its fundamental place in philosophy. The study of logic is part of the classical trivium.

Averroes defined logic as "the tool for distinguishing between the true and the false"[4]; Richard Whately, "the Science, as well as the Art, of reasoning"; and Frege, "the science of the most general laws of truth". The article Definitions of logic provides citations for these and other definitions.

Logic is often divided into two parts, inductive reasoning and deductive reasoning. The first is drawing general conclusions from specific examples, the second drawing logical conclusions from definitions and axioms. A similar dichotomy, used by Aristotle, is analysis and synthesis. Here the first takes an object of study and examines its component parts, the second considers how parts can be combined to form a whole.
Logic is also studied in argumentation theory.[5]


Tuesday, February 23, 2010

Calorimetric Equivalence Principle Test

With Stefan shutting down the blog temporary I thought to gather my thoughts here.

Gravitomagnetism

This approximate reformulation of gravitation as described by general relativity makes a "fictitious force" appear in a frame of reference different from a moving, gravitating body. By analogy with electromagnetism, this fictitious force is called the gravitomagnetic force, since it arises in the same way that a moving electric charge creates a magnetic field, the analogous "fictitious force" in special relativity. The main consequence of the gravitomagnetic force, or acceleration, is that a free-falling object near a massive rotating object will itself rotate. This prediction, often loosely referred to as a gravitomagnetic effect, is among the last basic predictions of general relativity yet to be directly tested.
Indirect validations of gravitomagnetic effects have been derived from analyses of relativistic jets. Roger Penrose had proposed a frame dragging mechanism for extracting energy and momentum from rotating black holes.[2] Reva Kay Williams, University of Florida, developed a rigorous proof that validated Penrose's mechanism.[3] Her model showed how the Lense-Thirring effect could account for the observed high energies and luminosities of quasars and active galactic nuclei; the collimated jets about their polar axis; and the asymmetrical jets (relative to the orbital plane).[4] All of those observed properties could be explained in terms of gravitomagnetic effects.[5] Williams’ application of Penrose's mechanism can be applied to black holes of any size.[6] Relativistic jets can serve as the largest and brightest form of validations for gravitomagnetism.
A group at Stanford University is currently analyzing data from the first direct test of GEM, the Gravity Probe B satellite experiment, to see if they are consistent with gravitomagnetism.



While I am not as progressed in terms of the organization of your thought process (inexperience in terms of the education), I am holding the ideas of Mendeleev in mind as I look at this topic you've gathered. And Newton as well, but not in the way one might have deferred to as the basis of gravity research.

It is more on the idea of what we can create in reality given all the elements at our disposal. This is also the same idea in mathematics: that all the information is there and only has to be discovered. Such a hierarchy in thinking is also the idea of geometrical presence stretched to higher dimensions, as one would point to matter assumptions as to a higher order present in the development of the material of Earth as a planet.

***

Uncle Al,

Overview: A parity calorimetry test offers a 33,000-fold improvement in EP anomaly sensitivity in only two days of measurements.

We are not so different... and this quest may not be apparent to many, yet it is a simple question about what is contracted to help understand "principles of formation." These had been theoretically developed in terms of the genus figures (Stanley Mandelstam), and we understand that this progression, mathematically, has been slow.

So we scientifically build this experimental progression.

But indeed, it's a method in terms of moving from "the false vacuum to the true?" What is the momentum called toward materialization?

Such an emergent feature, while discussing some building-block model, gives some indication of a "higher order principle" that is not clearly understood, while from a condensed matter theorist's point of view, this is an emergent feature?

Best,

Bordeaux, France is 44.83 N

http://www.mazepath.com/uncleal/lajos.htm#b7
***

According to general relativity, the gravitational field produced by a rotating object (or any rotating mass-energy) can, in a particular limiting case, be described by equations that have the same form as the magnetic field in classical electromagnetism. Starting from the basic equation of general relativity, the Einstein field equation, and assuming a weak gravitational field or reasonably flat spacetime, the gravitational analogs to Maxwell's equations for electromagnetism, called the "GEM equations", can be derived. GEM equations compared to Maxwell's equations in SI are:[7] [8][9][10]

GEM equations:
\nabla \cdot \mathbf{E}_\text{g} = -4 \pi G \rho
\nabla \cdot \mathbf{B}_\text{g} = 0
\nabla \times \mathbf{E}_\text{g} = -\frac{\partial \mathbf{B}_\text{g}}{\partial t}
\nabla \times \mathbf{B}_\text{g} = -\frac{4 \pi G}{c^2} \mathbf{J} + \frac{1}{c^2} \frac{\partial \mathbf{E}_\text{g}}{\partial t}

Maxwell's equations:
\nabla \cdot \mathbf{E} = \frac{\rho_\text{em}}{\epsilon_0}
\nabla \cdot \mathbf{B} = 0
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}
\nabla \times \mathbf{B} = \frac{1}{\epsilon_0 c^2} \mathbf{J}_\text{em} + \frac{1}{c^2} \frac{\partial \mathbf{E}}{\partial t}

where E_g and B_g are the gravitoelectric and gravitomagnetic fields, E and B the ordinary electric and magnetic fields, ρ and J the mass density and mass current density, ρ_em and J_em their electromagnetic counterparts, G the gravitational constant, ε₀ the vacuum permittivity, and c the speed of light.
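As a rough illustration of the kind of number Gravity Probe B is after, here is a back-of-the-envelope estimate of the orbit-averaged frame-dragging (Lense-Thirring) precession of a gyroscope in a circular polar orbit, taken as <Ω> ≈ GJ/(2c²a³) with J the Earth's spin angular momentum. The moment of inertia and orbit radius below are assumed round numbers, so the result is indicative only.

```python
# Rough orbit-averaged frame-dragging drift for a GP-B-like polar orbit.
import math

G = 6.674e-11            # m^3 kg^-1 s^-2
c = 2.998e8              # m/s
I_earth = 8.0e37         # kg m^2, approximate moment of inertia of the Earth (assumed)
omega_earth = 7.292e-5   # rad/s, sidereal rotation rate
a = 7.03e6               # m, ~640 km altitude circular orbit (assumed)

J = I_earth * omega_earth                 # Earth's spin angular momentum
omega_lt = G * J / (2 * c**2 * a**3)      # rad/s, orbit-averaged drift

mas_per_year = math.degrees(omega_lt) * 3600.0e3 * 3.156e7
print(f"frame-dragging drift ~ {mas_per_year:.0f} milliarcseconds per year")
# Prints a value of roughly 40 mas/yr, the same order as the ~39 mas/yr
# drift Gravity Probe B set out to measure.
```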

Monday, February 22, 2010

Physicists Discover How to Entangle at High Temperatures

While I do not just like to echo in the world of information it is important to me to see how we can use entanglement to give us information about quantum gravity. Is it possible?


Entanglement is the weird quantum process in which two objects share the same existence. So a measurement on one object immediately influences the other, no matter how far apart they may be.
Entanglement is a strange and fragile thing. Sneeze and it vanishes. The problem is that entanglement is destroyed by any interaction with the environment and these interactions are hard to prevent. So physicists have only ever been able to study and exploit entanglement in systems that do not interact easily with the environment, such as photons, or at temperatures close to absolute zero where the environment becomes more benign.

In fact, physicists believe that there is a fundamental limit to the thermal energies at which entanglement can be usefully exploited. And this limit is tiny, comparable to the very lowest temperatures.
Today, Fernando Galve at the University of the Balearic Islands in Spain and a few buddies show how this limit can be dramatically increased. The key behind their idea is the notion of a squeezed state.
In quantum mechanics, Heisenberg's uncertainty principle places important limits on how well certain pairs of complementary properties can be observed. For example, the more accurately you measure position, the less well you can determine momentum. The same is true of energy and time and also of the phase and amplitude of a quantum state.

Physicists have learnt how to play around with these complementary observables to optimise the way they make measurements. They've discovered that they can trade their knowledge of one complementary observable for an improvement in the other. See more here: Physicists Discover How to Entangle at High Temperatures
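A small numerical sketch of that squeezed-state trade-off (using the convention in which the vacuum quadrature variances are 1/2, so their product is bounded below by 1/4):

```python
# Squeezing trades uncertainty in one quadrature for the other,
# while the uncertainty product stays at the Heisenberg bound.
import math

def quadrature_variances(r):
    """Variances of the two quadratures of a squeezed vacuum state
    with squeezing parameter r."""
    var_x = 0.5 * math.exp(-2 * r)   # squeezed quadrature
    var_p = 0.5 * math.exp(+2 * r)   # anti-squeezed quadrature
    return var_x, var_p

for r in (0.0, 0.5, 1.0):
    vx, vp = quadrature_variances(r)
    print(f"r = {r:3.1f}:  Var(X) = {vx:.3f}, Var(P) = {vp:.3f}, product = {vx * vp:.3f}")
# The product is always 0.250: knowledge of one observable is traded
# for the other, which is the handle used to push useful entanglement
# to higher temperatures.
```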

Saturday, February 20, 2010

Economy, as Science


A shift in paradigm can lead, via the theory-dependence of observation, to a difference in one's experiences of things and thus to a change in one's phenomenal world. - On Thomas Kuhn
 
Control the information, you control the people?:) Again, as heartfelt and idealistic as you can become in your efforts, it's not enough to cry out in political verbiage, because you'll always end up with another person saying that it is only a political perspective. That it is the progressive conservative you don't like, and their leader? It's not enough.

So what do you do?

Do you succumb to the frustration that what is moving as a sub-culture working from the inside/out, is the idea that you can build a better consensus from what is moving the fabric of society to know that we can change the outcome as to what Canada shall become as well?

They (a conspiratorial thought) act as a force undermining public perception, while society does not grasp the full understanding of what has been done to it. Society having been cast to fighting at the "local level to advance a larger agenda?"

Does it not seem that once you occupy the mind in such close quarter conflagrations that mind has been circumvented from the larger picture?

Pain, and emotional turmoil does this.

Historically, once the fire has been started, like some phoenix, a new cultural idealism manifests as to what the individual actually wants when they are in full recognition that "as a force" moved forward in a democratic compunction as a government in waiting to advance the principles by which it can stand as the public mind.


However, the incommensurability thesis is not Kuhn's only positive philosophical thesis. Kuhn himself tells us that “The paradigm as shared example is the central element of what I now take to be the most novel and least understood aspect of [The Structure of Scientific Revolutions]” (1970a, 187). Nonetheless, Kuhn failed to develop the paradigm concept in his later work beyond an early application of its semantic aspects to the explanation of incommensurability. The explanation of scientific development in terms of paradigms was not only novel but radical too, insofar as it gives a naturalistic explanation of belief-change. Naturalism was not in the early 1960s the familiar part of philosophical landscape that it has subsequently become. Kuhn's explanation contrasted with explanations in terms of rules of method (or confirmation, falsification etc.) that most philosophers of science took to be constitutive of rationality. Furthermore, the relevant disciplines (psychology, cognitive science, artificial intelligence) were either insufficiently progressed to support Kuhn's contentions concerning paradigms, or were antithetical to them (in the case of classical AI). Now that naturalism has become an accepted component of philosophy, there has recently been interest in reassessing Kuhn's work in the light of developments in the relevant sciences, many of which provide corroboration for Kuhn's claim that science is driven by relations of perceived similarity and analogy to existing problems and their solutions (Nickles 2003b, Nersessian 2003). It may yet be that a characteristically Kuhnian thesis will play a prominent part in our understanding of science.
I would advance that the word "science" in the quote above be changed to "economy."

What paradigmatic solution has been advanced that such a thing can turn over the present equatorial function assigned to the public mind, that we will be in better control of our destinies as Canadians?

Precursors to such changes are revolutions in the thought patterns established as functionary pundits of money-orientated societies. They have become "fixed to a particular agenda." Rote systems, assumed and brought up in, extolled as the highest moral obligation: to live well, and on the way, fix ourselves to debt-written obligations that shall soon overcome the sensibility of what it shall take to live?

Forced upon them is the understanding that we have become a slave to our reason, and a slave to a master disguised as what is healthy and knows no boundaries? A capitalistic dream.

Update:

Money Supply and Energy: Is The Economy Inherently Unstable?