Saturday, December 30, 2006

N category and the Hydrogen spectrum


Picture of the 1913 Bohr model of the atom showing the Balmer transition from n=3 to n=2. The electronic orbitals (shown as dashed black circles) are drawn to scale, with 1 inch = 1 Angstrom; note that the radius of the orbital increases quadratically with n. The electron is shown in blue, the nucleus in green, and the photon in red. The frequency ν of the photon can be determined from Planck's constant h and the change in energy ΔE between the two orbitals. For the 3-2 Balmer transition depicted here, the wavelength of the emitted photon is 656 nm.
In atomic physics, the Bohr model depicts the atom as a small, positively charged nucleus surrounded by electrons that travel in circular orbits around the nucleus — similar in structure to the solar system, but with electrostatic forces providing attraction, rather than gravity.

Introduced by Niels Bohr in 1913, the model's key success was in explaining the Rydberg formula for the spectral emission lines of atomic hydrogen; while the Rydberg formula had been known experimentally, it did not gain a theoretical underpinning until the Bohr model was introduced.

The Bohr model is a primitive model of the hydrogen atom. As a theory, it can be derived as a first-order approximation of the hydrogen atom using the broader and much more accurate quantum mechanics, and thus may be considered to be an obsolete scientific theory. However, because of its simplicity, and its correct results for selected systems (see below for application), the Bohr model is still commonly taught to introduce students to quantum mechanics.
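The caption above quotes several specific numbers: orbital radii growing as n², the photon frequency following from ΔE = hν, and 656 nm for the 3→2 Balmer line. A minimal Python sketch, using rounded standard constants rather than numbers from the post, reproduces them from the textbook Bohr formulas:

```python
# Minimal sketch of the Bohr-model numbers quoted in the caption above:
# orbital radii grow as n^2, and the 3 -> 2 (Balmer-alpha) photon comes out
# near 656 nm.  Constants are rounded standard values, not data from the post.

h = 6.626e-34        # Planck's constant, J s
c = 2.998e8          # speed of light, m/s
a0 = 0.529e-10       # Bohr radius, m (0.529 Angstrom)
E_RYD = 13.606       # hydrogen ground-state binding energy, eV
EV = 1.602e-19       # joules per electron volt

def radius(n):
    """Radius of the nth Bohr orbit: r_n = n^2 * a0."""
    return n**2 * a0

def energy(n):
    """Energy of the nth level in eV: E_n = -13.6 eV / n^2."""
    return -E_RYD / n**2

def photon_wavelength_nm(n_upper, n_lower):
    """Wavelength (nm) of the photon emitted in an n_upper -> n_lower jump,
    using Delta E = h * nu = h * c / lambda."""
    delta_E = (energy(n_upper) - energy(n_lower)) * EV   # joules
    return h * c / delta_E * 1e9

for n in (1, 2, 3):
    print(f"n={n}: r = {radius(n) * 1e10:.2f} Angstrom, E = {energy(n):.2f} eV")

print(f"3 -> 2 transition: {photon_wavelength_nm(3, 2):.0f} nm")   # ~656 nm
```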


To picture events in the cosmos, it is important to have a spectral understanding of those events as they reveal themselves. So you look at these beautiful pictures, and the information taken from them allows us to see the elemental considerations of, let's say, a blue giant's demise. What was that blue giant made up of, in terms of its elemental structure?

The quantum leaps are explained on the basis of Bohr's theory of atomic structure. From the Lyman series to the Brackett series, it can be seen that the energy applied forces the hydrogen electrons to a higher energy level by a quantum leap. They remain at this level very briefly and, after about 10⁻⁸ s, they return to their initial or a lower level, emitting the excess energy in the form of photons (once again by a quantum leap).


Lyman series
Hydrogen atoms excited to luminescence emit characteristic spectra. On excitation, the electron of the hydrogen atom reaches a higher energy level. In this case, the electron is excited from the ground state, with a principal quantum number of n = 1, to a level with a principal quantum number of n = 4. After an average dwell time of only about 10⁻⁸ s, the electron returns to its initial state, releasing the excess energy in the form of a photon.
The various transitions result in characteristic spectral lines, with frequencies that can be calculated from f = R(1/n² − 1/m²), where R is the Rydberg constant.
The lines of the Lyman series (n = 1) are located in the ultraviolet range of the spectrum. In this example, m can reach values of 2, 3 and 4 in succession.


Balmer series
Hydrogen atoms excited to luminescence emit characteristic spectra. On excitation, the electron of the hydrogen atom reaches a higher energy level. In this case, the electron is excited from the ground state, with a principal quantum number of n = 1, to a level with a principal quantum number of n = 7. The Balmer series becomes visible if the electron first falls to an excited state with the principal quantum number of n = 2 before returning to its initial state.
The various transitions result in characteristic spectral lines, with frequencies that can be calculated from f = R(1/n² − 1/m²), where R is the Rydberg constant.
The lines of the Balmer series (n = 2) are located in the visible range of the spectrum. In this example, m can reach values of 3, 4, 5, 6 and 7 in succession.


Paschen series
Hydrogen atoms excited to luminescence emit characteristic spectra. On excitation, the electron of the hydrogen atom reaches a higher energy level. In this case, the electron is excited from the ground state, with a principal quantum number of n = 1, to a level with a principal quantum number of n = 7. The Paschen series becomes visible if the electron first falls to an excited state with the principal quantum number of n = 3 before returning to its initial state.
The various transitions result in characteristic spectral lines, with frequencies that can be calculated from f = R(1/n² − 1/m²), where R is the Rydberg constant.
The lines of the Paschen series (n = 3) are located in the near infrared range of the spectrum. In this example, m can reach values of 4, 5, 6 and 7 in succession.


Brackett series
Hydrogen atoms excited to luminescence emit characteristic spectra. On excitation, the electron of the hydrogen atom reaches a higher energy level. In this case, the electron is excited from the ground state, with a principal quantum number of n = 1, to a level with a principal quantum number of n = 8. The Brackett series becomes visible if the electron first falls to an excited state with the principal quantum number of n = 4 before returning to its initial state.
The lines of the Brackett series (n = 4) are located in the infrared range of the spectrum. In this example, m can reach values of 5, 6, 7 and 8 in succession.
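Each series above is governed by the same frequency formula, f = R(1/n² − 1/m²). Here is a short Python sketch that tabulates the four series with the n and m values quoted in the text; the numerical value of the Rydberg frequency constant is a standard one supplied here, not taken from the post:

```python
# Sketch of the series formula quoted above, f = R (1/n^2 - 1/m^2),
# applied to the four series discussed: Lyman, Balmer, Paschen, Brackett.
# R is the Rydberg constant expressed as a frequency (a standard value).

R = 3.2898e15   # Rydberg frequency constant, Hz
c = 2.998e8     # speed of light, m/s

series = {
    "Lyman":    (1, range(2, 5)),   # n = 1, m = 2..4 (ultraviolet)
    "Balmer":   (2, range(3, 8)),   # n = 2, m = 3..7 (visible)
    "Paschen":  (3, range(4, 8)),   # n = 3, m = 4..7 (near infrared)
    "Brackett": (4, range(5, 9)),   # n = 4, m = 5..8 (infrared)
}

for name, (n, m_values) in series.items():
    print(name)
    for m in m_values:
        f = R * (1.0 / n**2 - 1.0 / m**2)   # emitted photon frequency, Hz
        print(f"  m={m} -> n={n}: {c / f * 1e9:7.1f} nm")
```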

Friday, December 29, 2006

Wolf-Rayet star

While I have started off with the definition of the Wolf-Rayet star, the post ends in understanding the aspects of gravity and its effects, as we look at what has become of these Wolf-Rayet stars in the dissemination of their constituent properties.

Similar, "in my thinking" to the expansion of our universe?


Artist's impression of a Wolf-Rayet star
About 150 Wolf-Rayets are known in our own Milky Way Galaxy, about 100 are known in the Large Magellanic Cloud, while only 12 have been identified in the Small Magellanic Cloud. Wolf-Rayet stars were discovered spectroscopically in 1867 by the French astronomers Charles Wolf and Georges Rayet using visual spectrometry at Paris Observatory.


There are some thoughts manifesting about how one may have seen this energy of the blue giant. It's as if what began with great force can lose its momentum and dissipate very quickly (cosmic winds that blow the dust to different places)?


Illustration of Cosmic Forces-Credit: NASA, ESA, and A. Feild (STScI)
Scientists using NASA's Hubble Space Telescope have discovered that dark energy is not a new constituent of space, but rather has been present for most of the universe's history. Dark energy is a mysterious repulsive force that causes the universe to expand at an increasing rate.


What if the Wolf-Rayet star does not produce the jets that are exemplified in the ideas of how black hole creation begins? Is this part of black hole development somehow in its demise, so that we may see the 150 Wolf-Rayets known in our own Milky Way as examples of what they can become as black holes, or not?

Quark to quark Distance and the Metric

If this holds on such a grand scale, how is it that the thoughts held in my mind about microscopic proportions may not dominate as well, within the periods of time that the geometrics develop in the stars now known as Wolf-Rayet? So you use this cosmological model to exemplify micro-perspective views in relation to high-energy cosmological geometrics.



Plato:
"Lagrangian views" in relation may have been one result that comes quickly to my mind. Taking that chaldni plate and applying it to the universe today.


While I had talked in the previous post about how Lagrangian views could dominate "two aspects of the universe," it is not without linking the idea of what begins as a strong gravitational force holding the universe together, so that over time, as the universe became dominated by dark energy, the speeding up of the expansion could have become pronounced, revealed by the holes created in the distances between the planets and their moons. Between galaxies.



I have some fun above with the idea of satellites travelling in our current universe in relation to planets and moons, as well as galaxies. Taking this view down to WMAP proportions is just part of what I am trying to convey, using very simplistic examples of how one may look at the universe when gravity dominated its expansion versus what has happened to the universe today in terms of speeding up.


LOOP-DE-LOOP. The Genesis spacecraft's superhighway path took it to the Earth-sun gravitational-equilibrium point L1, where it made five "halo" orbits before swinging around L2 and heading home. Ross


If the distances between galaxies have become greater, then what says that the ease with which the speeding up occurs is not without understanding that an equilibrium has been attained, from what was once dominant in gravity, to what becomes rapid expansion?

This book describes a revolutionary new approach to determining low energy routes for spacecraft and comets by exploiting regions in space where motion is very sensitive (or chaotic). It also represents an ideal introductory text to celestial mechanics, dynamical systems, and dynamical astronomy. Bringing together wide-ranging research by others with his own original work, much of it new or previously unpublished, Edward Belbruno argues that regions supporting chaotic motions, termed weak stability boundaries, can be estimated. Although controversial until quite recently, this method was in fact first applied in 1991, when Belbruno used a new route developed from this theory to get a stray Japanese satellite back on course to the moon. This application provided a major verification of his theory, representing the first application of chaos to space travel.

Since that time, the theory has been used in other space missions, and NASA is implementing new applications under Belbruno's direction. The use of invariant manifolds to find low energy orbits is another method here addressed. Recent work on estimating weak stability boundaries and related regions has also given mathematical insight into chaotic motion in the three-body problem. Belbruno further considers different capture and escape mechanisms, and resonance transitions.

Providing a rigorous theoretical framework that incorporates both recent developments such as Aubry-Mather theory and established fundamentals like Kolmogorov-Arnold-Moser theory, this book represents an indispensable resource for graduate students and researchers in the disciplines concerned as well as practitioners in fields such as aerospace engineering.

Thursday, December 28, 2006

First Stars Behind the Scene

There are several recognized processes from the early universe that leave relic effects setting the stage for galaxy formation and evolution. We deal here with the first generation of stars, primordial nucleosynthesis, the epoch of recombination, and the thermal history of various cosmic backgrounds.


Part of understanding the time line is first knowing where the Pregalactic Universe exists in that time line.

Plato:
So given the standard information, one would have to postulate something different from what is currently classified?

A new Type III (whatever definition one shall attribute to this), versus Type I, Type Ia?


The idea is to place the distance measure in relation to what is assumed of Type I and Type Ia. It assumes all these things, but it has to be defined further to be a Type III. That's the point of setting the values of where this measure can be taken from.

I wrote the thought generated above someplace else. It is nice that the world of scientists is not so arrogant in some places, and that some have been willing to allow the speculation to continue, even amidst their understanding that I am less the scientist than they are, yet recognizing that I am deeply motivated to understand this strange world of cosmology and its physics.

When I wrote this title above I was actually thinking of two scenarios that are challenging the way I am seeing it.


Credit: NASA/WMAP Science Team
WMAP has produced a new, more detailed picture of the infant universe. Colors indicate "warmer" (red) and "cooler" (blue) spots. The white bars show the "polarization" direction of the oldest light. This new information helps to pinpoint when the first stars formed and provides new clues about events that transpired in the first trillionth of a second of the universe.


The first of these was in terms of the timeline and what we know of the WMAP demonstration given to us of that early universe. I of course inject some of what I know from past research to help the general public understand what is being demonstrated from another perspective.

This is what happens as you move through different scientists' (Wayne Hu's) thoughts to see the world in the way they may see it. This can be quite revealing, sometimes having a profound effect in moving the mind to consider the universe in new ways.



"Lagrangian views" in relation may have been one result that comes quickly to my mind. Taking that chaldni plate and applying it to the universe today.



Even though, in the context of this post, we may see the universe in a "simple experiment" not just demonstrating the "early universe," but the universe in its "gravitational effect" from that evolution to the matter defined now.

The Time Line


Credit: NASA/WMAP Science Team
The expansion of the universe over most of its history has been relatively gradual. The notion that a rapid period of "inflation" preceded the Big Bang expansion was first put forth 25 years ago. The new WMAP observations favor specific inflation scenarios over other long held ideas.


Looking to the "far left" of the image we see the place where the cosmic background is being demonstrated, while to the "far right" we see the satellite which has helped measure what we know of the early universe. So this "distant measure" has allowed us to understand what is behind the scene of what we know of cosmology today of events, galaxies and such.

Second, what comes to mind is the massive blue star of 100 solar masses that would have been further out, in terms of the billions of years that we may have sought in terms of our measures. So this would be of value, I would assume, in relation to model perspective and measures?

So the distance measure has been defined, then, by understanding the location of the cosmic background and the place where the blue giants will have unfolded in their demise, to the creation of black holes.


The processes in the Universe after the Big Bang. The radio waves are much older than the light of galaxies. From the distortion of the images (curved lines) - caused by the gravitation of material between us and the light sources - it is possible to calculate and map the entire foreground mass. Image: Max Planck Institute of Astrophysics
We don't have to wait for the giant telescope to get unparalleled results from this technique, however. One of the most pressing issues in current physics is to gain a better understanding of the mysterious Dark Energy which currently drives the accelerated expansion of the Universe. Metcalf and White show that mass maps of a large fraction of the sky made with an instrument like SKA could measure the properties of Dark Energy more precisely than any previously suggested method, more than 10 times as accurately as mass maps of similar size based on gravitational distortions of the optical images of galaxies.

Wednesday, December 27, 2006

The Geometrics Behind the Supernova and its History



It is not always easy for people to see what lies behind the wonderful beauty of images that we take from the satellite measures of space, and its dynamical events illustrated in Cassiopeia A. There before you is this majestic image of beauty, as we wonder about its dynamics.


These Spitzer Space Telescope images, taken one year apart, show the supernova remnant Cassiopeia A (yellow ball) and surrounding clouds of dust (reddish orange). The pictures illustrate that a blast of light from Cassiopeia A is waltzing outward through the dusty skies. This dance, called an "infrared echo," began when the remnant erupted about 50 years ago. Image credit: NASA/JPL-Caltech/Univ. of Ariz.
An enormous light echo etched in the sky by a fitful dead star was spotted by the infrared eyes of NASA's Spitzer Space Telescope.

The surprising finding indicates Cassiopeia A, the remnant of a star that died in a supernova explosion 325 years ago, is not resting peacefully. Instead, this dead star likely shot out at least one burst of energy as recently as 50 years ago.



How is it that such information arrives to us? We would have to consider the impulses behind such geometrical explanations, which we are lucky to see in other ways. So, of course, we needed to see the impulse as dynamically driven by the geometrical inclinations of that collapse, and all its information spread outward by the description in the images painted.


Credit: Weiqun Zhang and Stan Woosley
This image is from a computer simulation of the beginning of a gamma-ray burst. Here we see the jet 9 seconds after its creation at the center of a Wolf Rayet star by the newly formed, accreting black hole within. The jet is now just erupting through the surface of the Wolf Rayet star, which has a radius comparable to that of the sun. Blue represents regions of low mass concentration, red is denser, and yellow denser still. Note the blue and red striations behind the head of the jet. These are bounded by internal shocks.


If I had approached you early on and suggested that you look at "bubble geometrodynamics," would it have seemed so real that I would have presented an experiment to you that would help, "by analogies," to see what is happening? Might I then be called the one spreading information that was not of value for scientists to consider, though I was seeing in ways that I can only now give to you as example, of what science has done so far in using the physics with cosmological views?


Image Credit: NASA/JPL-Caltech/STScI/CXC/SAO
This stunning false-color picture shows off the many sides of the supernova remnant Cassiopeia A, which is made up of images taken by three of NASA's Great Observatories, using three different wavebands of light. Infrared data from the Spitzer Space Telescope are colored red; visible data from the Hubble Space Telescope are yellow; and X-ray data from the Chandra X-ray Observatory are green and blue.

Located 10,000 light-years away in the northern constellation Cassiopeia, Cassiopeia A is the remnant of a once massive star that died in a violent supernova explosion 325 years ago. It consists of a dead star, called a neutron star, and a surrounding shell of material that was blasted off as the star died. The neutron star can be seen in the Chandra data as a sharp turquoise dot in the center of the shimmering shell.


In the image above we learn of what manifests in "jet production lines," and such are beautiful examples to me of what the geometrics are doing. You needed some way to be able to explain this within the context of the universe's incidences "as events." We say this action is one with which we may speak to this "corner of the universe." Yet it is very dynamical in its expression as we see it multiplied from various perspectives.


The structure of Model J32 as the jet nears the surface 7820 seconds after core collapse.


So by experiment(?) I saw such relations, but what use are such analogies if they are laid waste as speculation, when what initiated such ideas had been the inclination of geometrics, detailed as underlying the basis of all expression: an example of non-Euclidean views in the Riemann perspective leading the shapes and dynamics of our universe, by comparison, within the local actions of stars and galaxies?

Gamma Rays?



So we get this information in one way or another, and it was from such a geometrical impulse that such examples are spread throughout the universe in ways that were not understood too well.


X-ray image of the gamma-ray burst GRB 060614 taken by the XRT instrument on Swift. The burst glowed in X-ray light for more than a week following the gamma-ray burst. This so-called "afterglow" gave an accurate position of the burst on the sky and enabled the deep optical observations made by ground-based observatories and the Hubble Space Telescope. Credit: NASA/Swift Team
A year ago scientists thought they had figured out the nature of gamma-ray bursts. They signal the birth of black holes and traditionally, fall into one of two categories: long or short. A newly discovered hybrid burst has properties of both known classes of gamma-ray bursts yet possesses features that remain unexplained.

The long bursts are those that last more than two seconds. It is believed that they are ejected by massive stars at the furthest edge of the universe as they collapse to form black holes.


So, looking back to this timeline, it is important to locate the ideas spread out before us, to have "some place" inclusive in the reality of that distance from the origins of the stars of our earliest times. 13.7 billion years, imagine!


Fig. 1: Sketchy supernova classification scheme
A supernova is the most luminous event known. Its luminosity matches those of whole galaxies. The name derives from the works of Walter Baade and Fritz Zwicky who studied supernovae intensively in the early 1930s and used the term supernova therein.
Nowadays supernova is a collective term for different classes of objects, that exhibit a sudden rise in luminosity that drops again on a timescale of weeks.
Those objects are subdivided into two classes, supernovae of type I or II (SNe I and SNe II). The distinguishing feature is the absence or presence of spectral lines of hydrogen: SNe I show no such lines, whereas SNe II do. The class of SNe I is further subdivided into the classes a, b and c. This time the distinguishing features are spectral lines of helium and silicon: SN Ia show silicon features, SN Ib show helium but no silicon features, and SN Ic show neither silicon nor helium spectral features.
The class of SN II is further subdivided into two classes. These are distinguished by the decline of the lightcurve: those SN II that show a linear decline are named SN II-L, and those that pass through a plateau phase are referred to as SN II-P.
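The quoted scheme is essentially a small decision tree, and a toy Python sketch makes the branching explicit; the boolean inputs are hypothetical spectrum and lightcurve descriptors used only for illustration, not any real astronomical interface:

```python
# Toy encoding of the classification scheme described above.  The flags are
# hypothetical descriptors of the observed spectrum and lightcurve; this is
# an illustration of the decision tree, not an astronomical tool.

def classify_supernova(has_hydrogen, has_silicon=False, has_helium=False,
                       lightcurve_plateau=False):
    if has_hydrogen:
        # Type II: subdivided by the shape of the lightcurve decline
        return "SN II-P" if lightcurve_plateau else "SN II-L"
    # Type I: no hydrogen lines; subdivide on silicon and helium features
    if has_silicon:
        return "SN Ia"
    if has_helium:
        return "SN Ib"
    return "SN Ic"

print(classify_supernova(has_hydrogen=False, has_silicon=True))        # SN Ia
print(classify_supernova(has_hydrogen=True, lightcurve_plateau=True))  # SN II-P
```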



So given the standard information, one would have to postulate something different from what is currently classified?

A new Type III (whatever definition one shall attribute to this), versus Type I, Type Ia?


ssc2006-22b: Brief History of the Universe
Credit: NASA/JPL-Caltech/A. Kashlinsky (GSFC)
This artist's timeline chronicles the history of the universe, from its explosive beginning to its mature, present-day state.

Our universe began in a tremendous explosion known as the Big Bang about 13.7 billion years ago (left side of strip). Observations by NASA's Cosmic Background Explorer and Wilkinson Microwave Anisotropy Probe revealed microwave light from this very early epoch, about 400,000 years after the Big Bang, providing strong evidence that our universe did blast into existence. Results from the Cosmic Background Explorer were honored with the 2006 Nobel Prize for Physics.

A period of darkness ensued, until about a few hundred million years later, when the first objects flooded the universe with light. This first light is believed to have been captured in data from NASA's Spitzer Space Telescope. The light detected by Spitzer would have originated as visible and ultraviolet light, then stretched, or redshifted, to lower-energy infrared wavelengths during its long voyage to reach us across expanding space. The light detected by the Cosmic Background Explorer and the Wilkinson Microwave Anisotropy Probe from our very young universe traveled farther to reach us, and stretched to even lower-energy microwave wavelengths.

Astronomers do not know if the very first objects were either stars or quasars. The first stars, called Population III stars (our star is a Population I star), were much bigger and brighter than any in our nearby universe, with masses about 1,000 times that of our sun. These stars first grouped together into mini-galaxies. By about a few billion years after the Big Bang, the mini-galaxies had merged to form mature galaxies, including spiral galaxies like our own Milky Way. The first quasars ultimately became the centers of powerful galaxies that are more common in the distant universe.

NASA's Hubble Space Telescope has captured stunning pictures of earlier galaxies, as far back as ten billion light-years away.


That would sort of set up the challenge?

Thursday, December 21, 2006

Hubble Finds Evidence for Dark Energy in the Young Universe



I had to go back to the article for some further reading.


These snapshots, taken by NASA's Hubble Space Telescope, reveal five supernovae, or exploding stars, and their host galaxies.

The arrows in the top row of images point to the supernovae. The bottom row shows the host galaxies before or after the stars exploded. The supernovae exploded between 3.5 and 10 billion years ago.

Astronomers used the supernovae to measure the expansion rate of the universe and determine how the expansion rate is affected by the repulsive push of dark energy, a mysterious energy force that pervades space. Supernovae provide reliable measurements because their intrinsic brightness is well understood. They are therefore reliable distance markers, allowing astronomers to determine how far away they are from Earth.

Pinpointing supernovae in the faraway universe is similar to watching fireflies in your back yard. All fireflies glow with about the same brightness. So, you can judge how the fireflies are distributed in your back yard by noting their comparative faintness or brightness, depending on their distance from you.

Only Hubble can measure these supernovae because they are too distant, and therefore too faint, to be studied by the largest ground-based telescopes.
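The firefly analogy is the usual standard-candle argument: a known intrinsic brightness plus the inverse-square law gives the distance. Here is a minimal sketch of that arithmetic, with purely illustrative magnitudes (the absolute magnitude of roughly −19.3 for a Type Ia supernova is a commonly quoted textbook value, not a number from this article):

```python
# Standard-candle arithmetic behind the firefly analogy above: if the
# intrinsic brightness (absolute magnitude M) is known, the apparent
# magnitude m gives the distance through the distance modulus
#   m - M = 5 * log10(d / 10 pc).
# All numbers below are illustrative, not taken from the article.

def distance_parsecs(apparent_mag, absolute_mag=-19.3):
    """Luminosity distance in parsecs from the distance modulus."""
    return 10 ** ((apparent_mag - absolute_mag) / 5.0 + 1.0)

m_observed = 23.0          # a very faint, very distant supernova (illustrative)
d_pc = distance_parsecs(m_observed)
print(f"{d_pc:.2e} pc  (~{d_pc * 3.26 / 1e9:.1f} billion light-years)")
```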

These Hubble observations show for the first time that dark energy has been a present force for most of the universe's history. A spectral analysis also shows that the supernovae used to measure the universe's expansion rate today look remarkably similar to those that exploded nine billion years ago and are just now seen by Hubble.

These latest results are based on an analysis of the 24 most distant known supernovae, most of them discovered within the last three years by the Higher-z SN Search Team. The images were taken between 2003 and 2005 with Hubble's Advanced Camera for Surveys.



Illustration of Cosmic Forces-Credit: NASA, ESA, and A. Feild (STScI)
Scientists using NASA's Hubble Space Telescope have discovered that dark energy is not a new constituent of space, but rather has been present for most of the universe's history. Dark energy is a mysterious repulsive force that causes the universe to expand at an increasing rate.

Investigators used Hubble to find that dark energy was already boosting the expansion rate of the universe as long as nine billion years ago. This picture of dark energy is consistent with Albert Einstein's prediction of nearly a century ago that a repulsive form of gravity emanates from empty space.

Data from Hubble provides supporting evidence that help astrophysicists to understand the nature of dark energy. This will allow scientists to begin ruling out some competing explanations that predict that the strength of dark energy changes over time.

Researchers also have found that the class of ancient exploding stars, or supernovae, used to measure the expansion of space today look remarkably similar to those that exploded nine billion years ago and are just now being seen by Hubble. This important finding gives additional credibility to the use of these supernovae for tracking the cosmic expansion over most of the universe's lifetime.

"Although dark energy accounts for more than 70 percent of the energy of the universe, we know very little about it, so each clue is precious," said Adam Riess, of the Space Telescope Science Institute and Johns Hopkins University in Baltimore. Riess led one of the first studies to reveal the presence of dark energy in 1998 and is the leader of the current Hubble study. "Our latest clue is that the stuff we call dark energy was relatively weak, but starting to make its presence felt nine billion years ago."

To study the behavior of dark energy of long ago, Hubble had to peer far across the universe and back into time to detect supernovae. Supernovae can be used to trace the universe's expansion. This is analogous to seeing fireflies on a summer night. Fireflies glow with about the same brightness, so you can judge how they are distributed in the backyard by their comparative faintness or brightness, depending on their distance from you. Only Hubble can measure these ancient supernovae because they are too distant, and therefore too faint, to be studied by the largest ground-based telescopes.

Einstein first conceived of the notion of a repulsive force in space in his attempt to balance the universe against the inward pull of its own gravity, which he thought would ultimately cause the universe to implode.

His "cosmological constant" remained a curious hypothesis until 1998, when Riess and the members of the High-z Supernova Team and the Supernova Cosmology Project used ground-based telescopes and Hubble to detect the acceleration of the expansion of space from observations of distant supernovae. Astrophysicists came to the realization that Einstein may have been right after all: there really was a repulsive form of gravity in space that was soon after dubbed "dark energy."

Over the past eight years astrophysicists have been trying to uncover two of dark energy's most fundamental properties: its strength and its permanence. These new observations reveal that dark energy was present and obstructing the gravitational pull of the matter in the universe even before it began to win this cosmic "tug of war."

Previous Hubble observations of the most distant supernovae known revealed that the early universe was dominated by matter whose gravity was slowing down the universe's expansion rate, like a ball rolling up a slight incline. The observations also confirmed that the expansion rate of the cosmos began speeding up about five to six billion years ago. That is when astronomers believe that dark energy's repulsive force overtook gravity's attractive grip.

The latest results are based on an analysis of the 24 most distant supernovae known, most found within the last two years.

By measuring the universe's relative size over time, astrophysicists have tracked the universe's growth spurts, much as a parent may witness the growth spurts of a child by tracking changes in height on a doorframe. Distant supernovae provide the doorframe markings read by Hubble. "After we subtract the gravity from the known matter in the universe, we can see the dark energy pushing to get out," said Lou Strolger, astronomer and Hubble science team member at Western Kentucky University in Bowling Green, Ky. Further observations are presently underway with Hubble by Riess and his team which should continue to offer new clues to the nature of dark energy.




Credit: NASA, ESA, and A. Feild (STScI)

Tuesday, December 19, 2006

Cosmic ray spallation


As this NASA chart indicates, 70 percent or more of the universe consists of dark energy, about which we know next to nothing
Other explanations of dark energy, called "quintessence," originate from theoretical high-energy physics. In addition to baryons, photons, neutrinos, and cold dark matter, quintessence posits a fifth kind of matter (hence the name), a sort of universe-filling fluid that acts like it has negative gravitational mass. The new constraints on cosmological parameters imposed by the HST supernova data, however, strongly discourage at least the simplest models of quintessence.


Of course my mind is thinking about the cosmic triangle of an event in the cosmos. So I am wondering what is causing the "negative pressure" as "dark energy," and why this has caused the universe to speed up.


SNAP-Supernova / Acceleration Probe-Studying the Dark Energy of the Universe
The discovery by the Supernova Cosmology Project (SCP) and the High-Z Supernova team that the expansion of the universe is accelerating poses an exciting mystery — for if the universe were governed by gravitational attraction, its rate of expansion would be slowing. Acceleration requires a strange "dark energy" opposing this gravity. Is this Einstein's cosmological constant, or more exotic new physics? Whatever the explanation, it will lead to new discoveries in astrophysics, particle physics, and gravitation.


By defining the context of particle collisions, it was evident that such a place, where such a fluid could have been dominated by such energy in stars, is always interesting as to what is ejected from those same stars. What do those stars provide for the expression of this universe, while we are cognizant of the "arrow of time" explanation?


This diagram reveals changes in the rate of expansion since the universe's birth 15 billion years ago. The more shallow the curve, the faster the rate of expansion.


So of course these thoughts are shared by the perspective of educators to help us along. But if one did not understand the nature of the physical attributes of superfluids, how would one know to think of the relativistic conditions that high energy provides for us?


NASA/WMAP Scientific Team: Expanding Universe



So recognizing where these conditions are evident would be one way in which we might think about what is causing a negative pressure in the cosmos.

Given the assumption that the matter in the universe is homogeneous and isotropic (The Cosmological Principle) it can be shown that the corresponding distortion of space-time (due to the gravitational effects of this matter) can only have one of three forms, as shown schematically in the picture at left. It can be "positively" curved like the surface of a ball and finite in extent; it can be "negatively" curved like a saddle and infinite in extent; or it can be "flat" and infinite in extent - our "ordinary" conception of space. A key limitation of the picture shown here is that we can only portray the curvature of a 2-dimensional plane of an actual 3-dimensional space! Note that in a closed universe you could start a journey off in one direction and, if allowed enough time, ultimately return to your starting point; in an infinite universe, you would never return.


Of course it is difficult for me to understand this process, but I am certainly trying. If one had found, in the relativistic conditions of high-energy scenarios, a "similarity to a flattening out" associated with an accelerating universe, what would this say about information travelling from the "origins of our universe" quite freely? How would this affect dark energy?

In physics, a perfect fluid is a fluid that can be completely characterized by its rest frame energy density ρ and isotropic pressure p.

Real fluids are "sticky" and contain (and conduct) heat. Perfect fluids are idealized models in which these possibilities are neglected. Specifically, perfect fluids have no shear stresses, viscosity, or heat conduction.

In tensor notation, the energy-momentum tensor of a perfect fluid can be written in the form

[tex]T^{\mu\nu} = (\rho + p)\, U^\mu U^\nu + p\, \eta^{\mu\nu}[/tex]



where U is the velocity vector field of the fluid and where ημν is the metric tensor of Minkowski spacetime.

Perfect fluids admit a Lagrangian formulation, which allows the techniques used in field theory to be applied to fluids. In particular, this enables us to quantize perfect fluid models. This Lagrangian formulation can be generalized, but unfortunately, heat conduction and anisotropic stresses cannot be treated in these generalized formulations.

Perfect fluids are often used in general relativity to model idealized distributions of matter, such as in the interior of a star.
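As a quick check on the quoted formula, here is a minimal numpy sketch that builds T^{μν} in the fluid's rest frame and confirms it reduces to diag(ρ, p, p, p); it assumes the (−,+,+,+) signature and units with c = 1, and the density and pressure values are arbitrary examples:

```python
# Sketch of the perfect-fluid stress-energy tensor quoted above,
#   T^{mu nu} = (rho + p) U^mu U^nu + p eta^{mu nu},
# evaluated in the fluid rest frame.  Assumes signature (-,+,+,+) and c = 1.

import numpy as np

eta = np.diag([-1.0, 1.0, 1.0, 1.0])   # Minkowski metric eta^{mu nu}
U = np.array([1.0, 0.0, 0.0, 0.0])     # 4-velocity of the fluid at rest

rho, p = 2.5, 0.8                      # arbitrary example density and pressure

T = (rho + p) * np.outer(U, U) + p * eta
print(T)                                          # diag(rho, p, p, p)
print(np.allclose(T, np.diag([rho, p, p, p])))    # True
```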


So events in the cosmos ejected the particles; what geometrical natures imbued such actions, to have these particles out in space interacting with other forms of matter, to create conditions that would seem conducive, to me, for that negative pressure?

Cosmic ray spallation is a form of naturally occurring nuclear fission and nucleosynthesis. It refers to the formation of elements from the impact of cosmic rays on an object. Cosmic rays are energetic particles outside of Earth ranging from a stray electron to gamma rays. These cause spallation when a fast-moving particle, usually a proton that is part of a cosmic ray, impacts matter, including other cosmic rays. The result of the collision is the expulsion of large numbers of nucleons (protons and neutrons) from the object hit. This process goes on not only in deep space, but in our upper atmosphere due to the impact of cosmic rays.

Cosmic ray spallation produces some light elements such as lithium and boron. This process was discovered somewhat by accident during the 1970s. Models of big bang nucleosynthesis suggested that the amount of deuterium was too large to be consistent with the expansion rate of the universe and there was therefore great interest in processes that could generate deuterium after the big bang.

Cosmic ray spallation was investigated as a possible process to generate deuterium. As it turned out, spallation could not generate much deuterium, and the excess deuterium in the universe could be explained by assuming the existence of non-baryonic dark matter. However, studies of spallation showed that it could generate lithium and boron. Isotopes of aluminum, beryllium, carbon (carbon-14), chlorine, iodine and neon are also formed through cosmic ray spallation.



Talk about getting tongue-tied. Can you imagine: "these fluctuations can generate their own big bangs in tiny areas of the universe." Read on.


Photo credit: Lloyd DeGrane/University of Chicago News Office
Carroll and Chen’s scenario of infinite entropy is inspired by the finding in 1998 that the universe will expand forever because of a mysterious force called “dark energy.” Under these conditions, the natural configuration of the universe is one that is almost empty. “In our current universe, the entropy is growing and the universe is expanding and becoming emptier,” Carroll said.

But even empty space has faint traces of energy that fluctuate on the subatomic scale. As suggested previously by Jaume Garriga of Universitat Autonoma de Barcelona and Alexander Vilenkin of Tufts University, these fluctuations can generate their own big bangs in tiny areas of the universe, widely separated in time and space. Carroll and Chen extend this idea in dramatic fashion, suggesting that inflation could start “in reverse” in the distant past of our universe, so that time could appear to run backwards (from our perspective) to observers far in our past.

Monday, December 18, 2006

Gottfried Wilhelm von Leibniz

This is a historical reference as well as leading to a conclusion. I won't say it for you; I just present the idea, the "written word," and then you decide what that message is. You might have thought it disjointed, but it's really not, as you move through it.


Internet Philosophy - Gottfried Wilhelm Leibniz (1646-1716) Metaphysics


There are reasons why this article is being put up, and again, developing a little history to the "line up Lee Smolin prepared" is an important step in discerning why he may have gone down a certain route for comparative relations in terms of "against symmetry."


Click on link Against symmetry (Paris, June 06)

I have no one telling me this, just that any argument has to have its "foundational logic of approach," and learning to interpret why someone did something is sometimes just as important as the science they currently pursue, or adopt, in light of other models and methods. It does not necessarily make them right. Just that they are delving into model apprehension and devising the reasons why the model they choose to use "is" the desired one, from their current philosophical development and understanding.

So they have to present their logic.

The Identity of Indiscernibles

The Identity of Indiscernibles (hereafter called the Principle) is usually formulated as follows: if, for every property F, object x has F if and only if object y has F, then x is identical to y. Or in the notation of symbolic logic:

∀F(Fx ↔ Fy) → x=y

This formulation of the Principle is equivalent to the Dissimilarity of the Diverse as McTaggart called it, namely: if x and y are distinct then there is at least one property that x has and y does not, or vice versa.

The converse of the Principle, x=y → ∀F(Fx ↔ Fy), is called the Indiscernibility of Identicals. Sometimes the conjunction of both principles, rather than the Principle by itself, is known as Leibniz's Law.
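A toy illustration of the Principle: the real statement quantifies over every property F, but restricting it to a small hand-picked list of predicates (an illustrative simplification, not part of Leibniz's formulation) shows how agreement on too few properties fails to force identity:

```python
# Toy, finite-domain illustration of the Identity of Indiscernibles.
# The actual Principle quantifies over *every* property F; the finite list
# of predicates here is only an illustrative simplification.

def indiscernible(x, y, predicates):
    """True if x and y agree on every predicate F in the list: F(x) <-> F(y)."""
    return all(F(x) == F(y) for F in predicates)

predicates = [
    lambda n: n % 2 == 0,    # "is even"
    lambda n: n > 10,        # "is greater than ten"
    lambda n: n % 3 == 0,    # "is divisible by three"
]

print(indiscernible(4, 4, predicates))   # True: identicals are indiscernible
print(indiscernible(4, 8, predicates))   # True, yet 4 != 8: this finite list
                                         # is too coarse to discern them
print(indiscernible(4, 9, predicates))   # False: they differ on "is even"
```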


It is almost as if, for the computerized world to be developed further, "this logic" had to be based on some philosophical approach? Had to be derived from some developmental model beyond the scope of "the approach to quantum gravity," so that it had its basis designed in the area of research a university could be exploiting itself?


In 1671 Gottfried Wilhelm von Leibniz (1646-1716) invented a calculating machine which was a major advance in mechanical calculating. The Leibniz calculator incorporated a new mechanical feature, the stepped drum — a cylinder bearing nine teeth of different lengths which increase in equal amounts around the drum. Although the Leibniz calculator was not developed for commercial production, the stepped drum principle survived for 300 years and was used in many later calculating systems.


This is not to say the developmental program disavows current research in all the areas to be considered. Just that its approach is based on "some method" that is not easily discernible even to the vast array of scientists currently working in so many research fields.

Why Quantum Computers?

On the atomic scale matter obeys the rules of quantum mechanics, which are quite different from the classical rules that determine the properties of conventional logic gates. So if computers are to become smaller in the future, new, quantum technology must replace or supplement what we have now. The point is, however, that quantum technology can offer much more than cramming more and more bits onto silicon and multiplying the clock-speed of microprocessors. It can support an entirely new kind of computation with qualitatively new algorithms based on quantum principles!
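To make the "qualitatively new" point a little more concrete, here is a minimal numpy sketch (no quantum library assumed) of a single qubit put into an equal superposition by a Hadamard gate, a state no classical one-bit logic gate can represent:

```python
# Minimal sketch of the quantum resource the quote points at: a qubit can be
# placed in a superposition of 0 and 1.  Plain numpy, no quantum library.

import numpy as np

ket0 = np.array([1.0, 0.0])                     # the definite state |0>
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)      # Hadamard gate

psi = H @ ket0                                  # (|0> + |1>) / sqrt(2)
probabilities = np.abs(psi) ** 2                # Born rule: measurement odds
print(psi)             # [0.707 0.707]
print(probabilities)   # [0.5 0.5] -- equal chance of reading 0 or 1
```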


Increasing complexity makes it very hard to describe complex systems. Imagine, if you were going from the top down, what constituent descriptors of reality we would have to manufacture if we wanted to speak about all those forms and the complexity that makes them up?

Moore's Law

Moore's law is the empirical observation that the complexity of integrated circuits, with respect to minimum component cost, doubles every 24 months[1].
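The doubling claim is just a compounding formula; here is a one-line Python sketch with illustrative starting numbers:

```python
# Moore's law as stated above: complexity doubles every 24 months.
# The starting count and time span are illustrative numbers, not data.

def component_count(initial_count, months, doubling_period_months=24):
    return initial_count * 2 ** (months / doubling_period_months)

print(f"{component_count(10_000, 120):,.0f}")   # ten years later: 320,000
```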

Friday, December 15, 2006

Johannes Kepler: The Birth of the Universe

I measured the skies, now the shadows I measure,
Sky-bound was the mind, earth-bound the body rests
Kepler's epitaph for his own tombstone


I always like to go back as well and learn the historical, for it seems to pave the way for how our good scientists of the day use these times to begin their talks.

From the outset, then, symmetry was closely related to harmony, beauty, and unity, and this was to prove decisive for its role in theories of nature. In Plato's Timaeus, for example, the regular polyhedra are afforded a central place in the doctrine of natural elements for the proportions they contain and the beauty of their forms: fire has the form of the regular tetrahedron, earth the form of the cube, air the form of the regular octahedron, water the form of the regular icosahedron, while the regular dodecahedron is used for the form of the entire universe. The history of science provides another paradigmatic example of the use of these figures as basic ingredients in physical description: Kepler's 1596 Mysterium Cosmographicum presents a planetary architecture grounded on the five regular solids.


Perhaps on an "asymmetrical recognition" of what becomes the "matter distinctions" of form, from "another world perspective" to what beauty and harmony mean and housed within the definitions of symmetry.

So while you may have been fast-tracked by Lee Smolin in his lecture talk in Paris of 2006, think carefully about what the Platonic tradition means, and what is revealed of the "asymmetrical/entropically challenged" views developed from the high-energy sector.


Johannes Kepler (December 27, 1571 – November 15, 1630)
For instance, Kepler was explicit about the intellectual safeguards that, in his view, the Christian faith provided for scientific speculation. In connection with the apriorism of the world view of antiquity (a good example is the Platonic dictum Ex nihilo nihil fit—nothing is made from nothing), he wrote: "Christian religion has put up some fences around false speculation in order that error may not rush headlong" (Introduction to Book IV of Epitome astronomae copernicanae, c1620, in Werke Vol. VII p. 254).


So even though the Platonic contrasts with the Pythagorean views, Plato had an idea about what existed before all things manifested. So, to think such solids could have made their way into the various forms, what were these descriptions, if not the very idea of the birth of the universe in Kepler's time?


Kepler's Platonic solid model of the Solar system from Mysterium Cosmographicum (1596)


So, in speaking to the information based on symmetries, how could one have formed their perspective and then lined up one line of thought with another?

Philosophically, permutation symmetry has given rise to two main sorts of questions. On the one side, seen as a condition of physical indistinguishability of identical particles (i.e. particles of the same kind in the same atomic system), it has motivated a rich debate about the significance of the notions of identity, individuality, and indistinguishability in the quantum domain. Does it mean that the quantum particles are not individuals? Does the existence of entities which are physically indistinguishable although “numerically distinct” (the so-called problem of identical particles) imply that the Leibniz's Principle of the Identity of Indiscernibles should be regarded as violated in quantum physics? On the other side, what is the theoretical and empirical status of this symmetry principle? Should it be considered as an axiom of quantum mechanics or should it be taken as justified empirically? It is currently taken to explain the nature of fermionic and bosonic quantum statistics, but why do there appear to be only bosons and fermions in the world when the permutation symmetry group allows the possibility of many more types? French and Rickles (2003) offers an excellent and updated overview of the above and related issues.

Thursday, December 14, 2006

Against Symmetry

The term “symmetry” derives from the Greek words sun (meaning ‘with’ or ‘together’) and metron (‘measure’), yielding summetria, and originally indicated a relation of commensurability (such is the meaning codified in Euclid's Elements for example). It quickly acquired a further, more general, meaning: that of a proportion relation, grounded on (integer) numbers, and with the function of harmonizing the different elements into a unitary whole. From the outset, then, symmetry was closely related to harmony, beauty, and unity, and this was to prove decisive for its role in theories of nature. In Plato's Timaeus, for example, the regular polyhedra are afforded a central place in the doctrine of natural elements for the proportions they contain and the beauty of their forms: fire has the form of the regular tetrahedron, earth the form of the cube, air the form of the regular octahedron, water the form of the regular icosahedron, while the regular dodecahedron is used for the form of the entire universe. The history of science provides another paradigmatic example of the use of these figures as basic ingredients in physical description: Kepler's 1596 Mysterium Cosmographicum presents a planetary architecture grounded on the five regular solids.





The basic difference that I see is the way in which Lee Smolin adopts his views of what science is in relation to: "Two traditions in the search for fundamental Physics."

It is strange indeed to see the perfection of Lee Smolin's comparison, and, having a look further down, we understand the opening basis of his philosophical thoughts in regard to the title "against symmetry?"

Some reviews on the "Trouble With Physics," by Lee Smolin

  • Seed Magazine, August 2006
  • Time magazine August 21, 2006
  • Discover Magazine, September 2006 &
  • Scientific American, September 2006
  • Wired September 2006:15 :
  • The Economist, Sept 14, 2006
  • The New York Times Book review, Sep 17, 2006 by Tom Siegfried
  • The Boston Globe, Sept 17, 2006
  • USA Today, Sept 19, 2006
  • The New York Sun, by Michael Shermer, Sept 27, 2006
  • The New Yorker,  by Jim Holt Sept 25,2006
  • The LA Times, by K C Cole, Oct 8, 2006
  • Nature, by George Ellis (Nature 443, 482, 5 Oct. 2006)
  • San Francisco Chronicle, by Keay Davidson, Oct 13, 2006
  • Dallas Morning News, by FRED BORTZ, Oct 15, 2006
  • Toronto Star, by PETER CALAMAI, Oct 15, 2006


    But before I begin in that direction, I wanted people to understand something that is held in the mind of the "condensed matter theorist," in terms of the building blocks of nature. This is an important basis of understanding: since any building block could be emergent from anything, we had to identify where this symmetry existed before it manifested in the "matter states of reality."

    Everyone knows that human societies organize themselves. But it is also true that nature organizes itself, and that the principles by which it does this is what modern science, and especially modern physics, is all about. The purpose of my talk today is to explain this idea.


    So is it important to understand what is emergent and what exists in the "theory of everything" if it did not consider the context of symmetry? As a layman trying to get underneath the thinking process of any book's development, it is important to me.

    Symmetry considerations dominate modern fundamental physics, both in quantum theory and in relativity. Philosophers are now beginning to devote increasing attention to such issues as the significance of gauge symmetry, quantum particle identity in the light of permutation symmetry, how to make sense of parity violation, the role of symmetry breaking, the empirical status of symmetry principles, and so forth. These issues relate directly to traditional problems in the philosophy of science, including the status of the laws of nature, the relationships between mathematics, physical theory, and the world, and the extent to which mathematics dictates physics.


    The idea here, then, is to find superstrings' place within the context of the evolving universe, in terms of "the microseconds" and not the "first three minutes" of Steven Weinberg.

    So it is important to see the context in which this discussion is taking place, in terms of the high energy, and from that state of existence to what entropically manifests into the universe now.

    Confronting A Position Adopted By Lee Smolin


    A sphere with three handles (and three holes), i.e., a genus-3 torus.

    This is only "one point of contention" that was being addressed at Clifford Johnson's Asymptotia.

    Jacques Distler :

    This is false. The proof of finiteness, to all orders, is in quite solid shape. Explicit formulæ are currently known only up to 3-loop order, and the methods used to write down those formulæ clearly don’t generalize beyond 3 loops.

    What’s certainly not clear (since you asked a very technical question, you will forgive me if my response is rather technical) is that, beyond 3 loops, the superstring measure over supermoduli space can be “pushed forward” to a measure over the moduli space of ordinary Riemann surfaces. It was a nontrivial (and, to many of us, somewhat surprising) result of d’Hoker and Phong that this does hold true at genus-2 and -3.


    There is no doubt that the "timeliness of statements" can further define, support or not, problems that are being discussed. I don't mind being deleted on the point of the post above, because our good scientist's are getting into the heat of things. I am glad Arun stepped up to the plate.

    Part of finally coming to some head-on debate was seeing how Peter Woit, along with Lee Smolin, were being challenged for their views, while there had been this groundswell created against a model that was developed, like loop quantum gravity was developed. One of the two traditions in the search for the fundamental physics: loop quantum gravity and string theory (must make sure there is the modification to M theory?). Shall this be included?


    Click on link Against symmetry (Paris, June 06)

    But as they are having this conversation, it is through this openness that they have given of themselves that we learn of the intricacies of the basis of the arguments, so the public is better informed as to what follows and what has to take place.


    Against symmetry (Paris, June 06)

    So while this issue is much more complex than just the exchange there, I have not forgotten what it is all about, or why one may move from a certain position after they have summarized the views they had accumulated with regard to the subject of string/M theory as a model that has outlived its usefulness, in terms of not providing an experimental framework around it.

    Wednesday, December 13, 2006

    Visual Abstraction to Equations

    Sylvester's models lay hidden away for a long time, but recently the Mathematical Institute received a donation to rescue some of them. Four of these were carefully restored by Catherine Kimber of the Ashmolean Museum and now sit in an illuminated glass cabinet in the Institute Common Room.


    Some of you might have noticed the reference to the Ashmolean Museum?


    Photo by Graham Challifour. Reproduced from Critchlow, 1979, p. 132.


    It seems only the good scientist John Baez had epitomized the construction of the Platonic solids? A revision, then, of the "timeline of history," and the correction he himself had to make? Let's not be too arrogant to know that once we understand more and look at "the anomalies," it forces us to revise our assessments.

    The Art form

    I relayed this image and quote below on Clifford's site to encourage the thinking of young people into an art form that is truly amazing to me. Yes, I get excited about it, after having learnt of Gauss and Riemann's exceptional abilities to move into the non-Euclidean world.

    Some think me a crackpot here? If you did not follow the history, then how would you know to also include the "physics of approach" as well? Also, some might ask what use is "this ability to see the visual abstraction," and I think this art form is in a way destined to what was kept in glass cabinets and such, even while the glass cabinet, in analogy, is held in the brain/space of those who have developed such artistic abilities.

    It's as if you move past the layers of the evolution of the human being (brain casings), its evolution, and the field that surrounds them. Having accomplished the intellect (your equations and such), one has now moved into the world of imagery. Closest to this is the emotive field, which circumvents our perspective on the greater potential of the world in the amazing thought forms of imagery. This move outward varies for each of us from time to time. Some who are focused in whichever area can move beyond them. This paragraph just written is what would be considered crackpot (I dislike that word) because of the long years of research I had gone through to arrive at this point.

    Of course, those views above are different.

    Mapping



    Is it illusionary or delusional, and having looked at the Clebsch Diagonal Surface below, how is that "abstraction" written?



    The enthusiasm that characterized such collections was captured by Francis Bacon [1, p. 247], who ironically advised "learned gentlemen" of the era to assemble within "a small compass a model of the universal made private", building

    ... a goodly, huge cabinet, wherein whatsoever the hand of man by exquisite art or engine has made rare in stuff, form or motion; whatsoever singularity, chance, and the shuffle of things hath produced; whatsoever Nature has wrought in things that want life and may be kept; shall be sorted and included.


    There is no doubt that the long road to understanding science is the prerequisite to mapping the images from an equation's signs and symbols. While not sitting in the classrooms of the teachers, it was necessary to try and move into the fifth-dimensional referencing of our computer screen to see what is being extolled here, not just in image development, but in what the physics is doing in relation.

    As early as 1849, the British mathematicians Salmon ([Sal49]) and Cayley ([Cay49]) published the results of their correspondence on the number of straight lines on a smooth cubic surface. In a letter, Cayley had told Salmon that there could only exist a finite number - and Salmon answered that the number should be exactly 27.



    So of course the historical journey was established, like most things, with Mandelstam currently, and what is happening there, as an interlude, as well as helping to establish some understanding of the abstractions that had been developed.



    But yes, before moving to current day imagery and abstraction, I had to understand how these developments were being tackled in today's theoretical sciences.

    Sunday, December 10, 2006

    Universal Library

    Commerce is of trivial import; love, faith, truth of character, the aspiration of man, these are sacred.
    Ralph Waldo Emerson




    "It is perhaps the oldest university in the world."


    Can you imagine if one might have been restricted from the museums of history, based on what another might have thought of the person? To encourage such ideas to blossom, it is understood the garden has to provide a source from which things can grow. Why not circumvent all views other than one's own, and you shall own those persons too.

    If we are to keep one in "ignorance of life," then why not confine them to what the world is for them in "their sections and houses on earth"? Keep them to the culture, and not allow for the greater dialogue between these cultures?

    While the historical blend here is being extolled, I of course have current thoughts about this in today's world of the internet.


    Reconstruction of one of the storage rooms of the Library of Alexandria. From Carl Sagan's Cosmos (1980),
    The Royal Library of Alexandria in Alexandria, Egypt, was once the largest library in the world. It is generally thought to have been founded at the beginning of the 3rd century BC, during the reign of Ptolemy II of Egypt. It was likely created after his father had built what would become the first part of the library complex, the temple of the Muses — the Musaion (from which is derived the modern English word museum).

    It has been reasonably established that the library, or parts of the collection, were destroyed by fire on a number of occasions (library fires were common enough and replacement of handwritten manuscripts was very difficult, expensive and time-consuming). To this day the details of the destruction (or destructions) remain a lively source of controversy. The Bibliotheca Alexandrina was inaugurated in 2003 near the site of the old library.


    Now you know that I believe that the resource for such potentials is very capable in anyone's hands; that if they would like to draw from such a resource, maybe it has to be physical for them. So, they may go to the library. Yet there is the "subtlety of the intangible" that is not accepted by those who are "deeply physical" about what they can accept, so they can accept such libraries.

    Then again, one might think twice about what is in the library of the internet? Yet it is not without the "subtleness of the intangible" that we see where the "good thoughts/ideas" can issue from the expert and the layperson alike, and that such things become part of the library of the internet.

    How do we know in our heart when such information is true? That we can rest assured that such dangers of misleading do not take us into their world? Do they somehow control you by what they like to hear?

    Innatism is a philosophical doctrine introduced by Plato in the Socratic dialogue Meno which holds that the mind is born with ideas/knowledge, and that therefore the mind is not a tabula rasa at birth. It asserts therefore that not all knowledge is obtained from experience and the senses. Innatism is the opposite of empiricism.

    Plato claimed that humans are born with ideas/forms in the mind that are in a dormant state. He claimed that we have acquired these ideas prior to our birth when we existed as souls in the world of Forms. To access these, humans need to be reminded of them through proper education and experience.


    Or are we gifted with this innatism about what is good in all people, while there are those who would become rich by such restrictions of a "software selection"?

    The French librarian Gabriel Naudé wrote:

    And therefore I shall ever think it extreamly necessary, to collect for this purpose all sorts of books, (under such precautions, yet, as I shall establish) seeing a Library which is erected for the public benefit, ought to be universal; but which it can never be, unlesse it comprehend all the principal authors, that have written upon the great diversity of particular subjects, and chiefly upon all the arts and sciences; [...] For certainly there is nothing which renders a Library more recommendable, then when every man findes in it that which he is in search of


    I mean, if we were restricted in the ability to retrieve from the massive amounts of data being presented, do you think it a good thing to restrict people from being able to develop their intellect? To learn more?