Friday, April 23, 2010

Solar Dynamics Observatory


Spacecraft
  • The total mass of SDO at launch was 3000 kg (6620 lb); instruments 300 kg (660 lb), spacecraft 1300 kg (2870 lb), and fuel 1400 kg (3090 lb).
  • Its overall length along the sun-pointing axis is 4.5 m, and each side is 2.22 m.
  • The span of the extended solar panels is 6.25 m.
  • Total available power is 1450 W from 6.6 m² of solar arrays operating at an efficiency of 16% (a quick sanity check follows this list).
  • The high-gain antennas rotate once each orbit to follow the Earth.
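For a sense of how these numbers fit together, the array power can be sanity-checked against the solar constant. A minimal sketch, assuming the standard value of about 1361 W/m² at Earth's distance (a value not stated in the list above):

```python
# Available power = solar constant x array area x conversion efficiency.
# The solar constant is an assumed textbook value, not from the spec sheet.
solar_constant = 1361.0   # W/m^2 at 1 AU
area = 6.6                # m^2 of solar arrays
efficiency = 0.16         # 16% cell efficiency

power = solar_constant * area * efficiency
print(f"{power:.0f} W")   # ~1437 W, consistent with the quoted ~1450 W
```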


***
April 21, 2010: Warning, the images you are about to see could take your breath away.
At a press conference today in Washington DC, researchers unveiled "First Light" images from NASA's Solar Dynamics Observatory, a space telescope designed to study the sun.


"SDO is working beautifully," reports project scientist Dean Pesnell of the Goddard Space Flight Center. "This is even better than we could have dreamed."


Launched on February 11th from Cape Canaveral, the observatory has spent the past two months moving into a geosynchronous orbit and activating its instruments. As soon as SDO's telescope doors opened, the spacecraft began beaming back scenes so beautiful and puzzlingly complex that even seasoned observers were stunned.
Source for story here


***
NASA's New Eye on the Sun Delivers Stunning First Images
04.21.10
View related briefing materials here.

NASA's recently launched Solar Dynamics Observatory, or SDO, is returning early images that confirm an unprecedented new capability for scientists to better understand our sun’s dynamic processes. These solar activities affect everything on Earth.

Some of the images from the spacecraft show never-before-seen detail of material streaming outward and away from sunspots. Others show extreme close-ups of activity on the sun’s surface. The spacecraft also has made the first high-resolution measurements of solar flares in a broad range of extreme ultraviolet wavelengths.

"These initial images show a dynamic sun that I had never seen in more than 40 years of solar research,” said Richard Fisher, director of the Heliophysics Division at NASA Headquarters in Washington. "SDO will change our understanding of the sun and its processes, which affect our lives and society. This mission will have a huge impact on science, similar to the impact of the Hubble Space Telescope on modern astrophysics.”




(From NASA:) A full-disk multiwavelength extreme ultraviolet image of the sun taken by SDO on March 30, 2010. False colors trace different gas temperatures. Reds are relatively cool (about 60,000 Kelvin, or 107,540 F); blues and greens are hotter (greater than 1 million Kelvin, or 1,799,540 F). Credit: NASA
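The caption's Fahrenheit figures can be verified with the usual Kelvin-to-Fahrenheit conversion, F = (K − 273.15) × 9/5 + 32:

```python
# Check the caption's temperature conversions.
for kelvin in (6.0e4, 1.0e6):
    fahrenheit = (kelvin - 273.15) * 9 / 5 + 32
    print(f"{kelvin:,.0f} K = {fahrenheit:,.0f} F")
# 60,000 K = 107,540 F and 1,000,000 K = 1,799,540 F, matching the caption.
```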

Source of picture here

Monday, March 29, 2010

Linking Experiments

Illustration: Sandbox Studio

 

The first round of physics

Nine proposals are under consideration for the initial suite of physics experiments at DUSEL, and scientists have received $21 million in NSF funding to refine them. The proposals cover four areas of research:
  • What is the nature of dark matter? (Proposals for LZ3, COUPP, GEODM, and MAX)
  • Are neutrinos their own antiparticles? (Majorana, EXO)
  • How do stars create the heavy elements? (DIANA)
  • What role did neutrinos play in the evolution of the universe? (LBNE)
In addition, scientists propose to build a generic underground facility (FAARM) that will monitor the mine's naturally occurring radioactivity, which can interfere with the search for dark matter. The facility also would measure particle emissions from various materials, and help develop and refine technologies for future underground physics experiments.
But why are there four separate proposals for how to search for dark matter? Because scientists do not yet know the nature of dark-matter particles or how they interact with ordinary matter, they would like to use a variety of detector materials to look for the particles and study their interactions with atoms of different sizes. The use of different technologies would also provide an independent cross check of the experimental results.
"We strongly feel we need two or more experiments," says Bernard Sadoulet of UC Berkeley, an expert on dark-matter searches. "If money were not an issue, you would build at least three experiments."
The largest experiment intended for DUSEL is the Long-Baseline Neutrino Experiment (see graphic), a project that involves both the DOE and NSF. Scientists would use the LBNE to explore whether neutrinos break one of the most fundamental laws of physics: the symmetry between matter and antimatter. In 1980, James Cronin and Val Fitch received the Nobel Prize for the observation that quarks can violate this symmetry. But the effect is too small to explain the dominance of matter over antimatter in our universe. Neutrinos might be the answer.
The LBNE scientists would generate a high-intensity neutrino beam at DOE's Fermi National Accelerator Laboratory, 800 miles east of Homestake, and aim it straight through the Earth at two or more enormous neutrino detectors in the DUSEL mine, each containing the equivalent of 100,000 tons of water.
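To put "straight through the Earth" in perspective, simple chord geometry shows how deep such a beam actually dives. A sketch assuming a spherical Earth with the standard mean radius (values assumed here, not taken from the article):

```python
import math

R = 6371.0        # Earth's mean radius, km (assumed standard value)
L = 800 * 1.609   # the ~800-mile Fermilab-to-Homestake baseline, in km

# A straight chord of length L reaches a maximum depth of R - sqrt(R^2 - (L/2)^2).
depth = R - math.sqrt(R**2 - (L / 2)**2)
print(f"maximum beam depth ~ {depth:.0f} km")   # ~33 km, within the crust and upper mantle
```

No tunnel is needed at that depth; the neutrinos stream through rock essentially unimpeded, which is what makes such a baseline possible.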
Studies have shown that the rock at the 4850-foot level of the mine would support the safe construction of these caverns. In January, the LBNE experiment received first-stage approval, also known as Mission Need, from the DOE.
Lesko and his team now are combining all engineering studies and science proposals into an overall proposal for review.
"By the end of this summer, we hope to complete a preliminary design of the DUSEL facility and then integrate it with a generic suite of experiments," Lesko says. "While formal selection of the experiments will not have been made by that time, we know enough about them now that we can move forward with the preliminary design. The experiments themselves will be selected through a peer-review process, as is common in the NSF."
If all goes well, Lesko says, scientists and engineers could break ground on the major DUSEL excavations in 2013, marking the start of a new era for deep underground research in the United States. See: Big Plans for Deep Science


Thursday, March 25, 2010

Mapping the Internet Brain and Consciousness

Partial map of the Internet based on the January 15, 2005 data found on opte.org. Each line is drawn between two nodes, representing two IP addresses. The length of each line is indicative of the delay between those two nodes. This graph represents less than 30% of the Class C networks reachable by the data collection program in early 2005. Lines are color-coded according to their corresponding RFC 1918 allocation as follows:
  • Dark blue: net, ca, us
  • Green: com, org
  • Red: mil, gov, edu
  • Yellow: jp, cn, tw, au, de
  • Magenta: uk, it, pl, fr
  • Gold: br, kr, nl
  • White: unknown

I asked a couple of my workmates what they thought this picture was, and right away they thought it was some galaxy. At first look it did not seem any less likely to me either, until, of course, you read through the design description listed underneath the picture.


What are Mind Maps?
A mind map is a diagram used to represent words, ideas, tasks, or other items linked to and arranged around a central key word or idea. Mind maps are used to generate, visualize, structure, and classify ideas, and as an aid in study, organization, problem solving, decision making, and writing.

The elements of a given mind map are arranged intuitively according to the importance of the concepts, and are classified into groupings, branches, or areas, with the goal of representing semantic or other connections between portions of information. Mind maps may also aid recall of existing memories.
By presenting ideas in a radial, graphical, non-linear manner, mind maps encourage a brainstorming approach to planning and organizational tasks. Though the branches of a mind map represent hierarchical tree structures, their radial arrangement disrupts the prioritizing of concepts typically associated with hierarchies presented with more linear visual cues. This orientation towards brainstorming encourages users to enumerate and connect concepts without a tendency to begin within a particular conceptual framework.
The mind map can be contrasted with the similar idea of concept mapping. The former is based on radial hierarchies and tree structures denoting relationships with a central governing concept, whereas concept maps are based on connections between concepts in more diverse patterns.
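To make that structural contrast concrete, here is a minimal sketch, with purely illustrative names, of a mind map as a tree radiating from one central idea, versus a concept map as a graph whose labeled links may connect any pair of concepts:

```python
# A mind map is a strict hierarchy: every node hangs off one parent,
# radiating out from a single central concept.
mind_map = {
    "Brain": {
        "Memory": {"Recall": {}, "Recognition": {}},
        "Mapping": {"Neurons": {}, "Connections": {}},
    }
}

# A concept map is a general graph: labeled links between any two concepts,
# with no single governing root (cycles are allowed).
concept_map = [
    ("Brain", "contains", "Neurons"),
    ("Neurons", "form", "Connections"),
    ("Connections", "enable", "Memory"),
    ("Memory", "shapes", "Brain"),
]
```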
This is amazing to me in the context of mind mapping as it is spoken of here on this site. I thought it appropriate to title the post above in terms of how we are mapping the human brain's neurological connections, so as to see the comparative relation to how we might map the Internet.

See: Creating a Science of the Web

The Web is the largest human information construct in history. The Web is transforming society. In order to understand what the Web is, engineer its future, and ensure its social benefit...

...we need a new interdisciplinary field that we call Web Science.

The Web Science Research Initiative brings together academics, scientists, sociologists, entrepreneurs and decision makers from around the world. These people will create the first multidisciplinary research body to examine the World Wide Web and offer the practical solutions needed to help guide its future use and design.

As you go through the mind map label you will discover something about mind mapping as I have written about it, something I see as inherent in the very nature of our makeup. To me this was revealed through the quieter times of my introspection, times when one might say I was indeed asleep and dreaming.

A contemporary mandala made from a photograph of tree fungi. See: Mandala

So in a way this product of my introspection was about understanding that such structures were, in my view, immortal: that they could be transmitted in consciousness from a time before our birth, to be exploded within our consciousness as life began to unfold for us.

Such parcels of thought were given in this context and formed to allow a time for thought to be expelled from the very understanding that this was somehow "a seed to germinate," the longer one thought about its structure and content.

So too, as a natural consequence, we could see how the subconscious mind organizes a "load of information" so as to keep these things in the soul's memory for all time. This was the idea that all the life experience around us was able to form this distillate view, as in a kaleidoscope image fractionated down to its simplest form.




Monday, March 22, 2010

A first look at the Earth interior from the Gran Sasso underground laboratory

The Gran Sasso National Laboratory (LNGS) is one of four INFN national laboratories.
It is the largest underground laboratory in the world for experiments in particle physics, particle astrophysics and nuclear astrophysics. It is used as a worldwide facility by scientists, presently 750 in number, from 22 different countries, working at about 15 experiments in their different phases.

It is located between the towns of L'Aquila and Teramo, about 120 km from Rome.
The underground facilities are located on one side of the ten-kilometre-long freeway tunnel crossing the Gran Sasso Mountain. They consist of three large experimental halls, each about 100 m long, 20 m wide and 18 m high, and service tunnels, for a total volume of about 180,000 cubic metres.
***
Slide by Takaaki Kajita
In June 1998 the Super-Kamiokande collaboration revealed its eagerly anticipated results on neutrino interactions to 400 physicists at the Neutrino ’98 conference in Takayama, Japan. A hearty round of applause marked the end of a memorable presentation by Takaaki Kajita of the University of Tokyo that included this slide. He presented strong evidence that neutrinos behave differently than predicted by the Standard Model of particles: The three known types of neutrinos apparently transform into each other, a phenomenon known as oscillation.

Super-K’s detector, located 1000 meters underground, had collected data on neutrinos produced by a steady stream of cosmic rays hitting the Earth’s atmosphere. The data allowed scientists to distinguish between two types of atmospheric neutrinos: those that produce an electron when interacting with matter (e-like), and those that produce a muon (μ-like). The graph in this slide shows the direction the neutrinos came from (represented by cos theta, on the x-axis); the number of neutrinos observed (points marked with crosses); and the number expected according to the Standard Model (shaded boxes).

In the case of the μ-like neutrinos, the number coming straight down from the sky into the detector agreed well with theoretical prediction. But the number coming up through the ground was much lower than anticipated. These neutrinos, which originated in the atmosphere on the opposite side of the globe, travelled 13,000 kilometers through the Earth before reaching the detector. The long journey gave a significant fraction of them enough time to “disappear”—shedding their μ-like appearance by oscillating into a different type of neutrino. While earlier experiments had pointed to the possibility of neutrino oscillations, the disappearance of μ-like neutrinos in the Super-K experiment provided solid evidence.
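The "disappearance" can be illustrated with the standard two-flavor oscillation formula, P(survival) = 1 − sin²(2θ) · sin²(1.27 Δm² L / E). A sketch using modern atmospheric best-fit parameters, which are assumptions here rather than numbers from the slide:

```python
import numpy as np

def survival(L_km, E_GeV, dm2_eV2=2.4e-3, sin2_2theta=1.0):
    """Two-flavor muon-neutrino survival probability (L in km, E in GeV)."""
    return 1.0 - sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

E = np.linspace(0.5, 5.0, 1000)   # typical atmospheric-neutrino energies, GeV
for L in (20.0, 13000.0):         # down-going vs. up-going path lengths
    print(f"L = {L:>7.0f} km: mean survival ~ {survival(L, E).mean():.2f}")
# Down-going (~20 km): ~1.0, matching prediction.
# Up-going (~13,000 km): ~0.5, so half the mu-like neutrinos "disappear".
```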
***
Click on this post's title link for the source.



The Borexino Collaboration announced the observation of geo-neutrinos at the underground Gran Sasso National Laboratory of the Italian Institute for Nuclear Physics (INFN). The data reveal, for the first time, a definite anti-neutrino signal with the expected energy spectrum due to radioactive decays of U and Th in the Earth, well above background.

The International Borexino Collaboration, with institutions from Italy, US, Germany, Russia, Poland and France, operates a 300-ton liquid-scintillator detector designed to observe and study low-energy solar neutrinos. The low background of the Borexino detector has been key to the detection of geo-neutrinos. Technologies developed by Borexino Collaborators have achieved very low background levels. The central core of the Borexino scintillator is now the lowest background detector available for these observations. The ultra-low background of Borexino was developed to make the first measurements of solar neutrinos below 1 MeV and has now produced this first, firm observation of geo-neutrinos.

Geo-neutrinos are anti-neutrinos produced in radioactive decays of naturally occurring uranium, thorium, potassium, and rubidium. Decays from these radioactive elements are believed to contribute a significant but unknown fraction of the heat generated inside our planet. The heat generates convective movements in the Earth's mantle, which influence volcanic activity and tectonic plate movements that induce seismic activity, and it drives the geo-dynamo that creates the Earth's magnetic field.
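For context, Borexino detects these anti-neutrinos through inverse beta decay on protons in the scintillator, ν̄ + p → n + e⁺. A quick check of that reaction's threshold, using standard particle masses (a well-known number, though not stated in the press release), shows why the signal probes uranium and thorium but not potassium:

```python
# Inverse-beta-decay threshold: E_min = ((m_n + m_e)^2 - m_p^2) / (2 * m_p).
m_p, m_n, m_e = 938.272, 939.565, 0.511   # masses in MeV/c^2

E_threshold = ((m_n + m_e)**2 - m_p**2) / (2 * m_p)
print(f"threshold ~ {E_threshold:.3f} MeV")   # ~1.806 MeV
# Anti-neutrinos from the U and Th decay chains exceed this energy;
# those from K-40 (max ~1.3 MeV) fall below it and are invisible to this channel.
```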

***
Links borrowed from here

Browsing experiments
 • auger (7 photos)
 • borexino (6 photos)
 • cobra (6 photos)
 • cresst (5 photos)
 • cryostem (2 photos)
 • cuore (5 photos)
 • cuoricino (3 photos)
 • dama (9 photos)
 • eastop (4 photos)
 • ermes (2 photos)
 • genius (3 photos)
 • gerda (1 photo)
 • gigs (3 photos)
 • gno (6 photos)
 • hdms (2 photos)
 • hmbb (1 photo)
 • icarus (19 photos)
 • lisa (1 photo)
 • luna (5 photos)
 • lvd (4 photos)
 • macro (4 photos)
 • mibeta (1 photo)
 • opera (26 photos)
 • tellus (1 photo)
 • underseis (8 photos)
 • vip (1 photo)
 • warp (10 photos)
 • xenon (4 photos)
 • zoo (3 photos)


*** 

The Law of Octaves


Dmitri Ivanovich Mendeleev (also romanized Mendeleyev or Mendeleef; Russian: Дми́трий Ива́нович Менделе́ев) (8 February [O.S. 27 January] 1834 – 2 February [O.S. 20 January] 1907) was a Russian chemist and inventor. He is credited as being the creator of the first version of the periodic table of elements. Using the table, he predicted the properties of elements yet to be discovered.

This post was inspired by the "Poll: Do you believe in extraterrestrial life?"

I was thinking about Hoyle and CNO and of course about Lee Smolin. As I get older it is harder to retain all that preceded each discussion, so linking helps to refresh.

Carbon is the 15th most abundant element in the Earth's crust, and the fourth most abundant element in the universe by mass after hydrogen, helium, and oxygen. It is present in all known lifeforms, and in the human body carbon is the second most abundant element by mass (about 18.5%) after oxygen.[14] This abundance, together with the unique diversity of organic compounds and their unusual polymer-forming ability at the temperatures commonly encountered on Earth, make this element the chemical basis of all known life. See: Carbon

It was necessary to recall the links from one to the other, to show how one's perception "about Carbon was drawn" into the discussion about what life in the cosmos is based on.



I am partial to carbon's allotropic expression on a physical scale, by way of something that was understood as the Law of Octaves.

Carbon forms the backbone of biology for all life on Earth. Complex molecules are made up of carbon bonded with other elements, especially oxygen, hydrogen and nitrogen. It is these elements that living organisms need, among others, and carbon is able to bond with all of these because of its four valence electrons. Since no life has been observed that is not carbon-based, it is sometimes assumed in astrobiology that life elsewhere in the universe will also be carbon-based. This assumption is referred to by critics as carbon chauvinism, as it may be possible for life to form that is not based on carbon, even though it has never been observed. See: Carbon-based life


For sure each round of discussion on the topic leads onward, and there are ideas about which one can ask. I am in no way advocating anything here in terms of the Anthropic Principle (AP), other than to remember discussions about this very poll question before.

The triple alpha process is highly dependent on carbon-12 having a resonance with the same energy as helium-4 and beryllium-8, and before 1952 no such energy level was known. It was astrophysicist Fred Hoyle who used the fact that carbon-12 is so abundant in the universe (and that our existence depends upon it - the Anthropic Principle) as evidence for the existence of the carbon-12 resonance. Hoyle suggested the idea to nuclear physicist Willy Fowler, who conceded that it was possible that this energy level had been missed in previous work on carbon-12. After a brief undertaking by his research group, they discovered a resonance near 7.65 MeV. See: Triple-alpha process
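A short arithmetic check of where that level sits, using standard atomic masses (assumed textbook values), makes the resonance argument concrete:

```python
# Energy of three free alpha particles relative to the carbon-12 ground state.
m_He4 = 4.002602      # atomic mass of helium-4, in u
m_C12 = 12.0          # carbon-12 defines the atomic mass unit
u_to_MeV = 931.494    # energy equivalent of 1 u

three_alpha = (3 * m_He4 - m_C12) * u_to_MeV
print(f"3 alphas - C-12: {three_alpha:.2f} MeV")   # ~7.27 MeV
# The 7.65 MeV level therefore lies only ~0.38 MeV above the three-alpha
# threshold -- close enough in energy for the triple-alpha process to proceed.
```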

*** 
 

Mass spectrometers are analytical instruments that determine atomic and molecular masses with great accuracy. Low-pressure vapors of elements or molecules are hit by a beam of rapidly moving electrons. The collision knocks an electron off the sample atom or molecule, leaving it positively charged.
These newly-formed ions are accelerated out of the ionization chamber by an electric field. The speeds to which the ions can be accelerated by the electric field are determined by their masses. Lighter ions can go faster than heavier ones.


Ion's path bent by external magnetic field
Courtesy: McREL
The positively charged ions then pass through an externally applied magnetic field, which deflects them. The net result is that the trajectory of a charged particle is curved to an extent that depends on its speed (determined by its mass). When the beam of a mixture of isotopes of different masses falls on a photographic plate, the different isotopes converge at different points, corresponding to the different radii of their semicircular paths.
The mathematical equation that describes this phenomenon is m/e = H²r²/(2V), where m is the mass of the ion, e is the charge of the ion, H is the magnetic field strength, r is the radius of the semicircle, and V is the accelerating potential.
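Rearranging for the radius gives r = √(2Vm/e) / H: heavier ions trace wider semicircles, which is how isotopes land at different points on the plate. A small sketch with illustrative field and voltage values (assumed, not from the text):

```python
import math

e = 1.602e-19   # charge of a singly ionized atom, C
u = 1.661e-27   # atomic mass unit, kg
V = 2000.0      # accelerating potential, V (illustrative)
H = 0.25        # magnetic field strength, T (illustrative)

# r = sqrt(2 * V * m / e) / H, rearranged from m/e = H^2 r^2 / (2V)
for label, mass_u in (("Ne-20", 19.992), ("Ne-22", 21.991)):
    r = math.sqrt(2 * V * mass_u * u / e) / H
    print(f"{label}: r = {100 * r:.2f} cm")
# Ne-20 lands at ~11.5 cm, Ne-22 at ~12.1 cm: a measurable separation.
```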
***

Friday, March 19, 2010

Neutrinoless Double Beta Decay

"You don’t see what you’re seeing until you see it," Dr. Thurston said, "but when you do see it, it lets you see many other things." See: Elusive Proof, Elusive Prover: A New Mathematical Mystery


The Enriched Xenon Observatory is an experiment in particle physics aiming to detect "neutrino-less double beta decay" using large amounts of xenon isotopically enriched in the isotope 136. A 200-kg detector using liquid Xe is currently being installed at the Waste Isolation Pilot Plant (WIPP) near Carlsbad, New Mexico. Many research and development efforts are underway for a ton-scale experiment, with the goal of probing new physics and the mass of the neutrino. See: The Enriched Xenon Observatory

***
Feynman diagram of neutrinoless double-beta decay, with two neutrons decaying to two protons. The only emitted products in this process are two electrons, which can only occur if the neutrino and antineutrino are the same particle (i.e. Majorana neutrinos) so the same neutrino can be emitted and absorbed within the nucleus. In conventional double-beta decay, two antineutrinos - one arising from each W vertex - are emitted from the nucleus, in addition to the two electrons. The detection of neutrinoless double-beta decay is thus a sensitive test of whether neutrinos are Majorana particles.


Neutrinoless double-beta decay experiments

Numerous experiments have been carried out to search for neutrinoless double-beta decay. Recent and proposed future experiments include Majorana, EXO, GERDA, and CUORE.

See: Direct Dark Matter Detection


 See Also: South Dakota's LUX will join the dark matter wars

Tuesday, March 16, 2010

Frogs, Foam and Fuel: UC Researchers Convert Solar Energy to Sugars



Illustration by Megan Gundrum, fifth-year DAAP student

For decades, farmers have been trying to find ways to get more energy out of the sun.

In natural photosynthesis, plants take in solar energy and carbon dioxide and then convert it to oxygen and sugars. The oxygen is released to the air and the sugars are dispersed throughout the plant — like that sweet corn we look for in the summer. Unfortunately, the allocation of light energy into products we use is not as efficient as we would like. Now engineering researchers at the University of Cincinnati are doing something about that.
See: Frogs, Foam and Fuel: UC Researchers Convert Solar Energy to Sugars

***

I get very excited when I see ideas like this.


See: When It Comes to Photosynthesis, Plants Perform Quantum Computation
Plants soak up some of the 10¹⁷ joules of solar energy that bathe Earth each second, harvesting as much as 95 percent of it from the light they absorb. The transformation of sunlight into carbohydrates takes place in one million billionths of a second, preventing much of that energy from dissipating as heat. But exactly how plants manage this nearly instantaneous trick has remained elusive. Now biophysicists at the University of California, Berkeley, have shown that plants use the basic principle of quantum computing—the exploration of a multiplicity of different answers at the same time—to achieve near-perfect efficiency.

Biophysicist Gregory Engel and his colleagues cooled a green sulfur bacterium—Chlorobium tepidum, one of the oldest photosynthesizers on the planet—to 77 kelvins [–321 degrees Fahrenheit] and then pulsed it with extremely short bursts of laser light. By manipulating these pulses, the researchers could track the flow of energy through the bacterium's photosynthetic system. "We always thought of it as hopping through the system, the same way that you or I might run through a maze of bushes," Engel explains. "But, instead of coming to an intersection and going left or right, it can actually go in both directions at once and explore many different paths most efficiently."

In other words, plants are employing the basic principles of quantum mechanics to transfer energy from chromophore (photosynthetic molecule) to chromophore until it reaches the so-called reaction center where photosynthesis, as it is classically defined, takes place. The particles of energy are behaving like waves. "We see very strong evidence for a wavelike motion of energy through these photosynthetic complexes," Engel says. The results appear in the current issue of Nature.

QUANTUM CHLOROPHYLL: Sunlight triggers wave-like motion in green chlorophyll, embedded in a protein structure, depicted in gray here, that guides its function. GREGORY ENGEL

Employing this process allows the near-perfect efficiency of plants in harvesting energy from sunlight and is likely to be used by all of them, Engel says. It might also be copied usefully by researchers attempting to create artificial photosynthesis, such as that in photovoltaic cells for generating electricity. "This can be a much more efficient energy transfer than a classical hopping one," Engel says. "Exactly how to implement that is a very difficult question."

It also remains unclear exactly how a plant's structure permits this quantum effect to take place. "[The protein structure] of the plant has to be tuned to allow transfer among chromophores but not to allow transfers into [heat]," Engel says. "How that tuning works and how it is controlled, we don't know." Inside every spring leaf is a system capable of performing a speedy and efficient quantum computation, and therein lies the key to much of the energy on Earth.

***

Wednesday, March 03, 2010

Neutron interferometer

Lubos Motl:
You have completely misunderstood the neutron gravitational interference experiment. They showed that the force acting on the neutron is simply not negligible. Quite on the contrary, these interference experiments could measure and did measure the gravitational acceleration - and even the tidal forces - on the phase shift of the neutron's wave function. It's the very point of these experiments.

So whatever theory predicts that such forces are "negligible" is instantly falsified.
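The experiments Motl refers to are the classic Colella-Overhauser-Werner (COW) measurements. A rough sketch of the size of the effect, using the standard COW phase-shift formula with typical interferometer parameters (the area, wavelength, and tilt angle below are assumptions for illustration):

```python
import math

# COW gravitational phase shift:
#   delta_phi = 2*pi * m^2 * g * A * lam * sin(alpha) / h^2
m = 1.675e-27    # neutron mass, kg
g = 9.81         # gravitational acceleration, m/s^2
A = 1.0e-3       # area enclosed by the two beam paths, m^2 (typical, assumed)
lam = 1.8e-10    # thermal-neutron de Broglie wavelength, m
h = 6.626e-34    # Planck constant, J*s
alpha = math.radians(30)   # tilt of the interferometer plane (assumed)

delta_phi = 2 * math.pi * m**2 * g * A * lam * math.sin(alpha) / h**2
print(f"phase shift ~ {delta_phi:.0f} rad")   # tens of radians: anything but negligible
```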
***
From Wikipedia
In physics, a neutron interferometer is an interferometer capable of diffracting neutrons, allowing the wave-like nature of neutrons, and other related phenomena, to be explored.
Interferometry inherently depends on the wave nature of the object. As pointed out by de Broglie in his PhD thesis, particles, including neutrons, can behave like waves (the so-called wave-particle duality, now explained in the general framework of quantum mechanics). The wave functions of the individual interferometer paths are created and recombined coherently, which requires the application of the dynamical theory of diffraction. Neutron interferometers are the counterpart of X-ray interferometers and are used to study quantities and effects related to thermal neutron radiation.
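A quick illustration of why crystal diffraction works for thermal neutrons: their de Broglie wavelength is comparable to atomic lattice spacings. A sketch using the conventional 2200 m/s thermal reference speed (an assumed standard value):

```python
# De Broglie wavelength: lambda = h / (m * v).
h = 6.626e-34   # Planck constant, J*s
m = 1.675e-27   # neutron mass, kg
v = 2200.0      # conventional thermal-neutron speed, m/s

lam = h / (m * v)
print(f"lambda ~ {lam * 1e10:.2f} angstroms")   # ~1.80 A, on the order of a silicon lattice spacing
```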

Neutron interferometers are used to determine minute quantum-mechanical effects on the neutron wave, such as studies of the
  • Aharonov-Bohm effect
  • gravity acting on an elementary particle, the neutron
  • rotation of the Earth acting on a quantum system
Like X-ray interferometers, neutron interferometers are typically carved from a single large crystal of silicon, often 10 to 30 or more centimeters in diameter and 20 to 60 or more centimeters in length. Modern semiconductor technology allows large single-crystal silicon boules to be easily grown. Since the boule is a single crystal, the atoms in the boule are precisely aligned, to within small fractions of a nanometer or an angstrom, over the entire boule. The interferometer is created by carving away all but three slices of silicon, held in perfect alignment by a base. Neutrons impinge on the first slice, where, by diffraction from the crystalline lattice, they separate into two beams. At the second slice, they are diffracted again, with two beams continuing on to the third slice. At the third slice, the beams recombine, interfering constructively or destructively, completing the interferometer. Without the precise, angstrom-level alignment of the three slices, the interference results would not be meaningful.

Only recently, a neutron interferometer for cold and ultracold neutrons was designed and successfully operated. In this case the neutron optical components are three artificial gratings produced holographically, i.e., by illuminating a photo-neutron-refractive polymer with a light-optic two-wave interference setup.


***

On the Origin of Gravity and the Laws of Newton

Starting from first principles and general assumptions Newton's law of gravitation is shown to arise naturally and unavoidably in a theory in which space is emergent through a holographic scenario. Gravity is explained as an entropic force caused by changes in the information associated with the positions of material bodies. A relativistic generalization of the presented arguments directly leads to the Einstein equations. When space is emergent even Newton's law of inertia needs to be explained. The equivalence principle leads us to conclude that it is actually this law of inertia whose origin is entropic.
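A compact sketch of the step the abstract alludes to for inertia: combine the entropy change a mass induces as it approaches a holographic screen with the Unruh temperature of that screen, and Newton's second law drops out (equations as in the paper's argument; an outline, not the full derivation):

```latex
\Delta S = 2\pi k_B \frac{mc}{\hbar}\,\Delta x
\qquad
k_B T = \frac{\hbar a}{2\pi c}

F = T\,\frac{\Delta S}{\Delta x}
  = \frac{\hbar a}{2\pi c\,k_B}\cdot 2\pi k_B\,\frac{mc}{\hbar}
  = ma
```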

***


 


The Neutron Interferometry and Optics Facility (NIOF) located in the NIST Center for Neutron Research Guide Hall is one of the world's premier user facilities for neutron interferometry and related neutron optical measurements. A neutron interferometer (NI) splits, then recombines neutron waves. This gives the NI its unique ability to experimentally access the phase of neutron waves. Phase measurements are used to study the magnetic, nuclear, and structural properties of materials, as well as fundamental questions in quantum physics. Related, innovative neutron optical techniques for use in condensed matter and materials science research are being developed.
 ***
 


Neutron Interferometer. A three-blade neutron interferometer, machined from a single-crystal silicon ingot, is shown in two views. A monoenergetic neutron beam is split by the first blade and recombined in the third blade. If a sample is introduced in one of the paths, a phase difference in the wave function is produced, and interference between the recombined beams causes count-rate shifts of opposite sign in the two detectors.
The Neutron Interferometer Facility in the Cold Neutron Guide Hall became operational in April 1994. It became available as a National User Facility in September 1996. Phase contrast of up to 88 percent and phase stability of better than five milliradians per day were observed. These performance indications are primarily the result of the advanced vibration isolation and environmental control systems. The interferometer operates inside a double walled enclosure, with the inner room built on a 40,000 kg slab which floats on pneumatic pads above an isolated foundation.



See Also: Gravity is Entropy is Gravity is...

Sunday, February 28, 2010

Trivium: Three Roads


 
Logic is the art of thinking; grammar, the art of inventing symbols and combining them to express thought; and rhetoric, the art of communicating thought from one mind to another, the adaptation of language to circumstance. (Sister Miriam Joseph)

 Painting by Cesare Maccari (1840-1919), Cicero Denounces Catiline.

In medieval universities, the trivium comprised the three subjects taught first: grammar, logic, and rhetoric. The word is a Latin term meaning “the three ways” or “the three roads” forming the foundation of a medieval liberal arts education. This study was preparatory for the quadrivium. The trivium is implicit in the De nuptiis of Martianus Capella, although the term was not used until the Carolingian era when it was coined in imitation of the earlier quadrivium.[1] It was later systematized in part by Petrus Ramus as an essential part of Ramism.


Formal grammar

A formal grammar (sometimes simply called a grammar) is a set of rules of a specific kind, for forming strings in a formal language. The rules describe how to form strings from the language's alphabet that are valid according to the language's syntax. A grammar does not describe the meaning of the strings or what can be done with them in whatever context, only their form.

Formal language theory, the discipline which studies formal grammars and languages, is a branch of applied mathematics. Its applications are found in theoretical computer science, theoretical linguistics, formal semantics, mathematical logic, and other areas.

A formal grammar is a set of rules for rewriting strings, along with a "start symbol" from which rewriting must start. Therefore, a grammar is usually thought of as a language generator. However, it can also sometimes be used as the basis for a "recognizer"—a function in computing that determines whether a given string belongs to the language or is grammatically incorrect. To describe such recognizers, formal language theory uses separate formalisms, known as automata theory. One of the interesting results of automata theory is that it is not possible to design a recognizer for certain formal languages.
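Both roles, generator and recognizer, can be shown with the toy grammar S → "a" S "b" | ε, whose language {aⁿbⁿ} is the standard example of one that no finite automaton can recognize. A minimal sketch (the code and names are illustrative):

```python
def generate(n):
    """Apply the rewrite rule S -> a S b exactly n times, then S -> ''."""
    return "a" * n + "b" * n

def recognize(s):
    """Recursive-descent recognizer for S -> 'a' S 'b' | ''."""
    if s == "":
        return True
    return s.startswith("a") and s.endswith("b") and recognize(s[1:-1])

print(generate(3))           # aaabbb
print(recognize("aaabbb"))   # True
print(recognize("aabbb"))    # False
```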

Parsing is the process of recognizing an utterance (a string in natural languages) by breaking it down to a set of symbols and analyzing each one against the grammar of the language. Most languages have the meanings of their utterances structured according to their syntax—a practice known as compositional semantics. As a result, the first step to describing the meaning of an utterance in language is to break it down part by part and look at its analyzed form (known as its parse tree in computer science, and as its deep structure in generative grammar).
Logic

As a discipline, logic dates back to Aristotle, who established its fundamental place in philosophy. The study of logic is part of the classical trivium.

Averroes defined logic as "the tool for distinguishing between the true and the false";[4] Richard Whately, as "the Science, as well as the Art, of reasoning"; and Frege, as "the science of the most general laws of truth". The article Definitions of logic provides citations for these and other definitions.

Logic is often divided into two parts, inductive reasoning and deductive reasoning. The first is drawing general conclusions from specific examples, the second drawing logical conclusions from definitions and axioms. A similar dichotomy, used by Aristotle, is analysis and synthesis. Here the first takes an object of study and examines its component parts, the second considers how parts can be combined to form a whole.
Logic is also studied in argumentation theory.[5]