Saturday, February 20, 2010

Economy, as Science


A shift in paradigm can lead, via the theory-dependence of observation, to a difference in one's experiences of things and thus to a change in one's phenomenal world.

On Thomas Kuhn
 
Control the information and you control the people?:) However heartfelt and idealistic you become in your efforts, it's not enough to cry out in political verbiage, because you'll always end up with another person saying that it is only a political perspective: that it is just the progressive conservative you don't like, and their leader? It's not enough.

So what do you do?

Do you succumb to the frustration, or trust that what is moving as a sub-culture, working from the inside out, is the idea that a better consensus can be built from what is already moving through the fabric of society, so that we can change the outcome of what Canada shall become as well?

"They" (a conspiratorial thought) act as a force undermining public perception, while society has not grasped the full understanding of what has been done to it, having been cast into fighting at the "local level to advance a larger agenda."

Does it not seem that once the mind is occupied in such close-quarter conflagrations, it has been diverted from the larger picture?

Pain and emotional turmoil do this.

Historically, once the fire has been started, a new cultural idealism rises like some phoenix out of what the individual actually wants, once they fully recognize that "as a force" they have moved forward, with democratic conviction, as a government in waiting, to advance the principles by which it can stand as the public mind.


However, the incommensurability thesis is not Kuhn's only positive philosophical thesis. Kuhn himself tells us that “The paradigm as shared example is the central element of what I now take to be the most novel and least understood aspect of [The Structure of Scientific Revolutions]” (1970a, 187). Nonetheless, Kuhn failed to develop the paradigm concept in his later work beyond an early application of its semantic aspects to the explanation of incommensurability. The explanation of scientific development in terms of paradigms was not only novel but radical too, insofar as it gives a naturalistic explanation of belief-change. Naturalism was not in the early 1960s the familiar part of philosophical landscape that it has subsequently become. Kuhn's explanation contrasted with explanations in terms of rules of method (or confirmation, falsification etc.) that most philosophers of science took to be constitutive of rationality. Furthermore, the relevant disciplines (psychology, cognitive science, artificial intelligence) were either insufficiently progressed to support Kuhn's contentions concerning paradigms, or were antithetical to them (in the case of classical AI). Now that naturalism has become an accepted component of philosophy, there has recently been interest in reassessing Kuhn's work in the light of developments in the relevant sciences, many of which provide corroboration for Kuhn's claim that science is driven by relations of perceived similarity and analogy to existing problems and their solutions (Nickles 2003b, Nersessian 2003). It may yet be that a characteristically Kuhnian thesis will play a prominent part in our understanding of science.
I would suggest that the word "science" in the quote above be changed to "economy."

What paradigmatic solution has been advanced that could overturn the present function assigned to the public mind, so that we will be in better control of our destinies as Canadians?

The precursors to such changes are revolutions in the thought patterns established by the functionary pundits of money-oriented societies. These have become "fixed to a particular agenda": rote systems we are brought up in and take for granted, in which the highest moral obligation extolled is to live well and, along the way, bind ourselves to written debt obligations that will soon overcome any sense of what it actually takes to live.

Forced upon us is the understanding that we have become slaves to our reason, and slaves to a master disguised as something healthy that knows no boundaries: a capitalistic dream.

Update:

Money Supply and Energy: Is The Economy Inherently Unstable?

 

Tuesday, February 16, 2010

Article From New York Times and More




Brookhaven National Laboratory

A computer rendition of 4-trillion-degree Celsius quark-gluon plasma created in a demonstration of what scientists suspect shaped cosmic history.

In Brookhaven Collider, Scientists Briefly Break a Law of Nature

The Brookhaven scientists and their colleagues discussed their latest results from RHIC in talks and a news conference at a meeting of the American Physical Society Monday in Washington, and in a pair of papers submitted to Physical Review Letters. “This is a view of what the world was like at 2 microseconds,” said Jack Sandweiss of Yale, a member of the Brookhaven team, calling it, “a seething cauldron.”

Among other things, the group announced it had succeeded in measuring the temperature of the quark-gluon plasma as 4 trillion degrees Celsius, “by far the hottest matter ever made,” Dr. Vigdor said. That is 250,000 times hotter than the center of the Sun and well above the temperature at which theorists calculate that protons and neutrons should melt, but the quark-gluon plasma does not act the way theorists had predicted.
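
As a quick sanity check on that comparison (a back-of-the-envelope sketch, assuming a solar-core temperature of roughly 15.7 million kelvin):

```python
# Rough check of the "250,000 times hotter than the center of the Sun" figure.
# At these scales, degrees Celsius and kelvin are effectively interchangeable.
T_qgp = 4.0e12        # K, quark-gluon plasma temperature reported at RHIC
T_sun_core = 1.57e7   # K, approximate temperature at the center of the Sun

print(f"ratio ~ {T_qgp / T_sun_core:,.0f}")   # ~255,000, consistent with the quote
```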

Instead of behaving like a perfect gas, in which every quark goes its own way independent of the others, the plasma seemed to act like a liquid. “It was a very big surprise,” Dr. Vigdor said, when it was discovered in 2005. Since then, however, theorists have revisited their calculations and found that the quark soup can be either a liquid or a gas, depending on the temperature, he explained. “This is not your father’s quark-gluon plasma,” said Barbara V. Jacak, of the State University at Stony Brook, speaking for the team that made the new measurements.

It is now thought that the plasma would have to be a million times more energetic to become a perfect gas. That is beyond the reach of any conceivable laboratory experiment, but the experiments colliding lead nuclei in the Large Hadron Collider outside Geneva next winter should reach energies high enough to see some evolution from a liquid to a gas.
See more at above link.

***

Violating Parity with Quarks and Gluons
by Sean Carroll of Cosmic Variance
This new result from RHIC doesn’t change that state of affairs, but shows how quarks and gluons can violate parity spontaneously if they are in the right environment — namely, a hot plasma with a magnetic field.

So, okay, no new laws of physics. Just a much better understanding of how the existing ones work! Which is most of what science does, after all.

***

Quark–gluon plasma

From Wikipedia, the free encyclopedia

A QGP is formed at the collision point of two relativistically accelerated gold ions in the center of the STAR detector at the Relativistic Heavy Ion Collider at Brookhaven National Laboratory.


A quark-gluon plasma (QGP) or quark soup[1] is a phase of quantum chromodynamics (QCD) which exists at extremely high temperature and/or density. This phase consists of (almost) free quarks and gluons, which are the basic building blocks of matter. Experiments at CERN's Super Proton Synchrotron (SPS) first tried to create the QGP in the 1980s and 1990s: the results led CERN to announce indirect evidence for a "new state of matter"[2] in 2000. Current experiments at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC) are continuing this effort.[3] Three new experiments running on CERN's Large Hadron Collider (LHC), ALICE,[4] ATLAS and CMS, will continue studying properties of QGP.


General introduction

The quark-gluon plasma contains quarks and gluons, just as normal (baryonic) matter does. The difference between these two phases of QCD is that in normal matter each quark either pairs up with an anti-quark to form a meson or joins with two other quarks to form a baryon (such as the proton and the neutron). In the QGP, by contrast, these mesons and baryons lose their identities and dissolve into a fluid of quarks and gluons.[5] In normal matter quarks are confined; in the QGP quarks are deconfined.
Although the experimental high temperatures and densities predicted as producing a quark-gluon plasma have been realized in the laboratory, the resulting matter does not behave as a quasi-ideal state of free quarks and gluons, but, rather, as an almost perfect dense fluid.[6] Actually the fact that the quark-gluon plasma will not yet be "free" at temperatures realized at present accelerators had been predicted already in 1984 [7] as a consequence of the remnant effects of confinement. 

Why this is referred to as "plasma"

A plasma is matter in which charges are screened due to the presence of other mobile charges; for example: Coulomb's Law is modified to yield a distance-dependent charge. In a QGP, the color charge of the quarks and gluons is screened. The QGP has other analogies with a normal plasma. There are also dissimilarities because the color charge is non-abelian, whereas the electric charge is abelian. Outside a finite volume of QGP the color electric field is not screened, so that volume of QGP must still be color-neutral. It will therefore, like a nucleus, have integer electric charge.
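
As a rough illustration of the abelian (electromagnetic) case used as the analogy here, the sketch below computes a Debye screening length and the screened Coulomb potential; the plasma parameters are arbitrary laboratory-scale values, not QGP numbers.

```python
import math

# Debye screening in an ordinary electromagnetic plasma:
#   lambda_D = sqrt(eps0 * kB * T / (n * e^2))
#   V(r)     = q / (4*pi*eps0*r) * exp(-r / lambda_D)   (screened Coulomb potential)
eps0 = 8.854e-12   # F/m
kB   = 1.381e-23   # J/K
e    = 1.602e-19   # C

def debye_length(T_kelvin, n_per_m3):
    return math.sqrt(eps0 * kB * T_kelvin / (n_per_m3 * e**2))

def screened_potential(q, r, lam):
    return q / (4 * math.pi * eps0 * r) * math.exp(-r / lam)

lam = debye_length(T_kelvin=1.0e4, n_per_m3=1.0e18)   # illustrative values only
print(f"Debye length ~ {lam:.2e} m")
print(f"V(one Debye length) ~ {screened_potential(e, lam, lam):.2e} V")
```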

How the QGP is studied theoretically

One consequence of this difference is that the color charge is too large for perturbative computations, which are the mainstay of QED. As a result, the main theoretical tool to explore the theory of the QGP is lattice gauge theory. The transition temperature (approximately 175 MeV) was first predicted by lattice gauge theory. Since then lattice gauge theory has been used to predict many other properties of this kind of matter. The AdS/CFT correspondence is an interesting new conjecture allowing insights into the QGP.

How it is created in the lab

The QGP can be created by heating matter up to a temperature of 2×1012 kelvin, which amounts to 175 MeV per particle. This can be accomplished by colliding two large nuclei at high energy (note that 175 MeV is not the energy of the colliding beam). Lead and gold nuclei have been used for such collisions at CERN SPS and BNL RHIC, respectively. The nuclei are accelerated to ultrarelativistic speeds and slammed into each other while Lorentz contracted. They largely pass through each other, but a resulting hot volume called a fireball is created after the collision. Once created, this fireball is expected to expand under its own pressure, and cool while expanding. By carefully studying this flow, experimentalists hope to put the theory to test.
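
A quick check of the conversion quoted above, assuming "175 MeV per particle" refers to the thermal energy k_B T:

```python
# Convert the QGP temperature to a thermal energy per particle via E = kB * T.
kB_eV_per_K = 8.617e-5            # Boltzmann constant in eV/K

T = 2.0e12                        # K, temperature quoted above
print(f"kB*T ~ {kB_eV_per_K * T / 1e6:.0f} MeV")   # ~172 MeV, close to the quoted 175 MeV
```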

How the QGP fits into the general scheme of physics

QCD is one part of the modern theory of particle physics called the Standard Model. Other parts of this theory deal with electroweak interactions and neutrinos. The theory of electrodynamics has been tested and found correct to a few parts in a trillion. The theory of weak interactions has been tested and found correct to a few parts in a thousand. Perturbative aspects of QCD have been tested to a few percent. In contrast, non-perturbative aspects of QCD have barely been tested. The study of the QGP is part of this effort to consolidate the grand theory of particle physics.
The study of the QGP is also a testing ground for finite temperature field theory, a branch of theoretical physics which seeks to understand particle physics under conditions of high temperature. Such studies are important to understand the early evolution of our universe: the first hundred microseconds or so. While this may seem esoteric, this is crucial to the physics goals of a new generation of observations of the universe (WMAP and its successors). It is also of relevance to Grand Unification Theories or 'GUTS' which seek to unify the four fundamental forces of nature.

Expected properties

Thermodynamics

The cross-over temperature from the normal hadronic to the QGP phase is about 175 MeV, corresponding to an energy density of a little less than 1 GeV/fm3. For relativistic matter, pressure and temperature are not independent variables, so the equation of state is a relation between the energy density and the pressure. This has been found through lattice computations, and compared to both perturbation theory and string theory. This is still a matter of active research. Response functions such as the specific heat and various quark number susceptibilities are currently being computed.

Flow

The equation of state is an important input into the flow equations. The speed of sound is currently under investigation in lattice computations. The mean free path of quarks and gluons has been computed using perturbation theory as well as string theory. Lattice computations have been slower here, although the first computations of transport coefficients have recently been concluded. These indicate that the mean free time of quarks and gluons in the QGP may be comparable to the average interparticle spacing: hence the QGP is a liquid as far as its flow properties go. This is very much an active field of research, and these conclusions may evolve rapidly. The incorporation of dissipative phenomena into hydrodynamics is another recent development that is still in an active stage.

Excitation spectrum

Does the QGP really contain (almost) free quarks and gluons? The study of thermodynamic and flow properties would indicate that this is an over-simplification. Many ideas are currently being evolved and will be put to test in the near future. It has been hypothesized recently that some mesons built from heavy quarks (such as the charm quark) do not dissolve until the temperature reaches about 350 MeV. This has led to speculation that many other kinds of bound states may exist in the plasma. Some static properties of the plasma (similar to the Debye screening length) constrain the excitation spectrum.

Experimental situation

Those aspects of the QGP which are easiest to compute are not the ones which are the easiest to probe in experiments. While the balance of evidence points towards the QGP being the origin of the detailed properties of the fireball produced in the RHIC, this is the main barrier which prevents experimentalists from declaring a sighting of the QGP. For a summary see 2005 RHIC Assessment.
The important classes of experimental observations are

Formation of quark matter

In April 2005, formation of quark matter was tentatively confirmed by results obtained at Brookhaven National Laboratory's Relativistic Heavy Ion Collider (RHIC). The consensus of the four RHIC research groups was that they had created a quark-gluon liquid of very low viscosity. However, contrary to what was at that time still the widespread assumption, it is yet unknown from theoretical predictions whether the QCD "plasma", especially close to the transition temperature, should behave like a gas or liquid[8]. Authors favoring the weakly interacting interpretation derive their assumptions from the lattice QCD calculation, where the entropy density of quark-gluon plasma approaches the weakly interacting limit. However, since both energy density and correlation shows significant deviation from the weakly interacting limit, it has been pointed out by many authors that there is in fact no reason to assume a QCD "plasma" close to the transition point should be weakly interacting, like electromagnetic plasma (see, e.g., [9]).


Friday, February 12, 2010

The Last Question by Isaac Asimov

The problem of heat can be a frustrating one if one considers computer chips, and how it may have resulted in a reboot of the machine (or its death) into a better state of existence than the working model that was previously used.

So the perfection sought is the very defining model of a super race, devoid of all the trappings of the human form: no longer ruled by the mistakes of combining body parts in the Frankenstein sense, but by what the new Terminator models have taken over... and yet they are not human?

Multivac is an advanced computer that solves many of the world’s problems. The story opens on May 14, 2061 when Multivac has built a space station to harness the power of the sun – effectively giving humans access to a nearly unlimited source of power. Ah – and that’s the key, it is nearly unlimited. In fact two of Multivac’s technicians argue about this very idea – how long will humankind be able to glean energy from the universe? They decide to ask Multivac for the answer, and all it can say is “INSUFFICIENT DATA FOR MEANINGFUL ANSWER.” Oh well, it was a good idea, and through several smaller stories we see that many more people ask Multivac the same question. Multivac has a difficult time answering – it is a hard question after all! But when do we (and Multivac) finally learn the answer? As you’ve probably guessed – not until the very end of the story.
“You ask Multivac. I dare you. Five dollars says it can’t be done.”
“Adell was just drunk enough to try, just sober enough to be able to phrase the necessary symbols and operations into a question which, in words, might have corresponded to this: Will mankind one day without the net expenditure of energy be able to restore the sun to its full youthfulness even after it had died of old age?
Or maybe it could be put more simply like this: How can the net amount of entropy of the universe be massively decreased?
Multivac fell dead and silent. The slow flashing of lights ceased, the distant sounds of clicking relays ended.
Then, just as the frightened technicians felt they could hold their breath no longer, there was a sudden springing to life of the teletype attached to that portion of Multivac. Five words were printed: INSUFFICIENT DATA FOR MEANINGFUL ANSWER.

***

Timeframe for heat death

From the Big Bang through the present day and well into the future, matter and dark matter in the universe is concentrated in stars, galaxies, and galaxy clusters. Therefore, the universe is not in thermodynamic equilibrium and objects can do physical work.[11], §VID. The decay time of a roughly galaxy-mass (1011 solar masses) supermassive black hole due to Hawking radiation is on the order of 10100 years,[12], so entropy can be produced until at least that time. After that time, the universe enters the so-called dark era, and is expected to consist chiefly of a dilute gas of photons and leptons.[11], §VIA. With only very diffuse matter remaining, activity in the universe will have tailed off dramatically, with very low energy levels and very large time scales. Speculatively, it is possible that the Universe may enter a second inflationary epoch, or, assuming that the current vacuum state is a false vacuum, the vacuum may decay into a lower-energy state.[11], §VE. It is also possible that entropy production will cease and the universe will achieve heat death.[11], §VID.
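
The 10^100-year figure for a galaxy-mass black hole can be sanity-checked with the standard Hawking evaporation-time estimate, t ≈ 5120 π G² M³ / (ħ c⁴); a sketch that ignores any further accretion:

```python
import math

# Hawking evaporation time of a black hole: t ~ 5120 * pi * G^2 * M^3 / (hbar * c^4)
G     = 6.674e-11   # m^3 kg^-1 s^-2
hbar  = 1.055e-34   # J s
c     = 2.998e8     # m/s
M_sun = 1.989e30    # kg
year  = 3.156e7     # s

M = 1e11 * M_sun    # a roughly galaxy-mass (10^11 solar masses) black hole
t = 5120 * math.pi * G**2 * M**3 / (hbar * c**4)
print(f"evaporation time ~ 10^{math.log10(t / year):.0f} years")   # on the order of 10^100
```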

*** 

Creating the Perfect Human Being or Maybe.....

..... a Frankenstein? :)

Artificial Intelligence (AI) is the intelligence of machines and the branch of computer science which aims to create it. Textbooks define the field as "the study and design of intelligent agents,"[1] where an intelligent agent is a system that perceives its environment and takes actions which maximize its chances of success.[2] John McCarthy, who coined the term in 1956,[3][4] defines it as "the science and engineering of making intelligent machines."
The field was founded on the claim that a central property of humans, intelligence—the sapience of Homo sapiens—can be so precisely described that it can be simulated by a machine.[5] This raises philosophical issues about the nature of the mind and limits of scientific hubris, issues which have been addressed by myth, fiction and philosophy since antiquity.[6] Artificial intelligence has been the subject of breathtaking optimism,[7] has suffered stunning setbacks[8][9] and, today, has become an essential part of the technology industry, providing the heavy lifting for many of the most difficult problems in computer science.
AI research is highly technical and specialized, deeply divided into subfields that often fail to communicate with each other.[10] Subfields have grown up around particular institutions, the work of individual researchers, the solution of specific problems, longstanding differences of opinion about how AI should be done and the application of widely differing tools. The central problems of AI include such traits as reasoning, knowledge, planning, learning, communication, perception and the ability to move and manipulate objects.[11] General intelligence (or "strong AI") is still a long-term goal of (some) research.[12]

Wednesday, February 10, 2010

GOCE delivering data for best gravity map ever



30 September 2009
Following the launch and in-orbit testing of the most sophisticated gravity mission ever built, ESA’s GOCE satellite is now in ‘measurement mode’, mapping tiny variations in Earth’s gravity in unprecedented detail.
 



The ‘Gravity field and steady-state Ocean Circulation Explorer’ (GOCE) satellite was launched on 17 March from northern Russia. The data now being received will lead to a better understanding of Earth’s gravity, which is important for understanding how our planet works.



It is often assumed that gravity exerts an equal force everywhere on Earth. However, owing to factors such as the rotation of the planet, the effects of mountains and ocean trenches, and density variations in Earth’s interior, this fundamental force is not quite the same all over.
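
One well-known piece of that variation, the latitude dependence caused by rotation and the planet's flattening, can be estimated with the standard normal-gravity formula; a sketch that ignores the local anomalies (mountains, trenches, density variations) that GOCE is actually designed to map:

```python
import math

# Normal gravity on the WGS84 reference ellipsoid as a function of latitude
# (Somigliana formula); local gravity anomalies are not included.
def normal_gravity(lat_deg):
    s2 = math.sin(math.radians(lat_deg)) ** 2
    return 9.7803253359 * (1 + 0.00193185265241 * s2) / math.sqrt(1 - 0.00669437999013 * s2)

for lat in (0, 45, 90):
    print(f"latitude {lat:2d} deg: g ~ {normal_gravity(lat):.4f} m/s^2")
# Equator ~9.780 m/s^2, poles ~9.832 m/s^2: about a 0.5% difference before
# any of the much smaller anomalies GOCE measures are added in.
```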

Credit:ESA


Over two six-month uninterrupted periods, GOCE will map these subtle variations with extreme detail and accuracy. This will result in a unique model of the ‘geoid’ – the surface of an ideal global ocean at rest.


A precise knowledge of the geoid is crucial for accurate measurement of ocean circulation and sea-level change, both of which are influenced by climate. The data from GOCE are also much-needed to understand the processes occurring inside Earth. In addition, by providing a global reference to compare heights anywhere in the world, the GOCE-derived geoid will be used for practical applications in areas such as surveying and levelling. See more here and here

See: Plato's Nightlight Mining Company is claiming Aristarchus Crater and Surrounding Region, and the rest is history:)

Saturday, February 06, 2010

A New Time Travel Scenario?

Black Hole-Powered Jet of Electrons and Sub-Atomic Particles Streams From Center of Galaxy M87

NASA's Hubble Space Telescope Yields Clear View of Optical Jet in Galaxy M87

A NASA Hubble Space Telescope (HST) view of a 4,000 light-year long jet of plasma emanating from the bright nucleus of the giant elliptical galaxy M87. This ultraviolet light image was made with the European Space Agency's Faint Object Camera (FOC), one of two imaging systems aboard HST. This photo is being presented on Thursday, January 16th at the 179th meeting of the American Astronomical Society in Atlanta, Georgia.

M87 is a giant elliptical galaxy with an estimated mass of 300 billion suns. Located 52 million light-years away at the heart of the neighboring Virgo cluster of galaxies, M87 is the nearest example of an active galactic nucleus with a bright optical jet. The jet appears as a string of knots within a widening cone extending out from the core of M87. The FOC image reveals unprecedented detail in these knots, resolving some features as small as ten light-years across.

According to one theory, the jet is most likely powered by a 3 billion solar mass black hole at the nucleus of M87. Magnetic fields generated within a spinning accretion disk surrounding the black hole spiral around the edge of the jet. The fields confine the jet to a long narrow tube of hot plasma and charged particles. High speed electrons and protons which are accelerated near the black hole race along the tube at nearly the speed of light. When electrons are caught up in the magnetic field they radiate in a process called synchrotron radiation. The Faint Object Camera image clearly resolves these localized regions of electron acceleration, which seem to trace out the spiral pattern of the otherwise invisible magnetic field lines. A large bright knot located midway along the jet shows where the blue jet disrupts violently and becomes more chaotic. Farther out from the core the jet bends and dissipates as it rams into a wall of gas, invisible but present throughout the galaxy, which the jet has plowed in front of itself.

HST is ideally suited for studying extragalactic jets. The Telescope's UV sensitivity allows it to clearly separate a jet from the stellar background light of its host galaxy. What's more, the FOC's high angular resolution is comparable to sub arc second resolution achieved by large radio telescope arrays.
See: Hubble Site
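
For a sense of scale for the "3 billion solar mass" figure in the caption, here is the corresponding Schwarzschild radius (a back-of-the-envelope sketch):

```python
# Schwarzschild radius r_s = 2*G*M/c^2 for a 3-billion-solar-mass black hole.
G     = 6.674e-11    # m^3 kg^-1 s^-2
c     = 2.998e8      # m/s
M_sun = 1.989e30     # kg
AU    = 1.496e11     # m

M = 3e9 * M_sun
r_s = 2 * G * M / c**2
print(f"r_s ~ {r_s:.2e} m ~ {r_s / AU:.0f} AU")   # ~9e12 m, roughly 60 AU
```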




***




Willem Jacob van Stockum (November 20, 1910-June 10, 1944) was a mathematician who made an important contribution to the early development of general relativity.

Van Stockum was born in Hattem in the Netherlands. His father was a mechanically talented officer in the Dutch Navy. After the family (less the father) relocated to Ireland in the late 1920s, Willem studied mathematics at the Trinity College, Dublin, where he earned a gold medal. He went on to earn an M.A. from the University of Toronto and his Ph.D. from University of Edinburgh.

In the mid nineteen thirties, van Stockum became an early enthusiast of the then new theory of gravitation, general relativity. In 1937, he published a paper which contains one of the first exact solutions in general relativity which modeled the gravitational field produced by a configuration of rotating matter, the van Stockum dust, which remains an important example noted for its unusual simplicity. In this paper, van Stockum was apparently the first to notice the possibility of closed timelike curves, one of the strangest and most disconcerting phenomena in general relativity.
***
The chronology protection conjecture is a conjecture by the physicist Professor Stephen Hawking that the laws of physics are such as to prevent time travel on all but sub-microscopic scales. Mathematically, the permissibility of time travel is represented by the existence of closed timelike curves.

***



Tipler Cylinder

An Overview and Comparison by Dr. David Lewis Anderson

A Tipler Cylinder uses a massive and long cylinder spinning around its longitudinal axis. The rotation creates a frame-dragging effect and fields of closed time-like curves traversable in a way to achieve subluminal time travel to the past.

***

We see a pulsar, then, when one of its beams of radiation crosses our line-of-sight. In this way, a pulsar is like a lighthouse. The light from a lighthouse appears to be "pulsing" because it only crosses our line-of-sight once each time it spins. Similarly, a pulsar "pulses" because we see bright flashes every time the star spins.
See: Pulsars

Thursday, February 04, 2010

Perspective of the Theoretical Scientist

Most people think of "seeing" and "observing" directly with their senses. But for physicists, these words refer to much more indirect measurements involving a train of theoretical logic by which we can interpret what is "seen."- Lisa Randall

There are certain advantages to the theoretical perspective: it can best portray the concepts of the world theorists live in through what appears, however abstract, as images in the mind, an impressionism that helps the mind toward a state of acceptance. So it had to be explained first.

Cubist art revolted against the restrictions that perspective imposed. Picasso's art shows a clear rejection of the perspective, with women's faces viewed simultaneously from several angles. Picasso's paintings show multiple perspectives, as though they were painted by someone from the 4th dimension, able to see all perspectives simultaneously.


Cubist Art: Picasso's painting 'Portrait of Dora Maar'



P. Picasso Portrait of Ambrose Vollard (1910)


 M. Duchamp Nude Descending a Staircase, No. 2 (1912)


J. Metzinger Le Gouter/Teatime (1911)


The appearance of figures in cubist art --- which are often viewed from several directions simultaneously --- has been linked to ideas concerning extra dimensions:

It is as if we are looking at it from a larger perspective: standing outside the image, we see that it is capable of illuminating many angles of perspective. This helps us to see that it derives from a much larger understanding than what is solidified in the everyday world we live in.

For the artist it was a bold move toward understanding that perspective could help us see the Mona Lisa's smile as moving with us as we move around. The challenge, then, was to appreciate the value of this artistic push into how we see, and to understand that the road non-Euclidean geometry took was met by people as well, culminating in a geometrical transition of form.


Hyperspace: A Scientific Odyssey

A look at the higher dimensions, by Michio Kaku



"Why must art be clinically “realistic?” This Cubist “revolt against perspective” seized the fourth dimension because it touched the third dimension from all possible perspectives. Simply put, Cubist art embraced the fourth dimension. Picasso's paintings are a splendid example, showing a clear rejection of three dimensional perspective, with women's faces viewed simultaneously from several angles. Instead of a single point-of-view, Picasso's paintings show multiple perspectives, as if they were painted by a being from the fourth dimension, able to see all perspectives simultaneously. As art historian Linda Henderson has written, “the fourth dimension and non-Euclidean geometry emerge as among the most important themes unifying much of modern art and theory."







Then it quickly comes to mind that maybe what is given, let's say in the context of Lee Smolin's road to quantum gravity, will help us quickly see the value of describing "the space of an interior" by what is happening on the screen/label.

Spacetime in String Theory


More than just Bekenstein imagery to illustrate a conformal approach to describing the contents of the tomato soup can from its label.



Campbell's Soup Can by Andy Warhol Exhibited in New York (USA), Leo Castelli Gallery


It was necessary to see that the geometry used here was helping to shape perspective not only around "time travel" but also as a means to an end: using mathematical perspective to actually mean something in relation to understanding our world, a way to describe abstract concepts correlated with the progression of those mathematics. Klein's ordering of geometries then takes on a new meaning as we move deeper into the world we all know and love.

In 1919, Kaluza sent Albert Einstein a preprint --- later published in 1921 --- that considered the extension of general relativity to five dimensions. He assumed that the 5-dimensional field equations were simply the higher-dimensional version of the vacuum Einstein equation, and that all the metric components were independent of the fifth coordinate. The latter assumption came to be known as the cylinder condition. This resulted in something remarkable: the fifteen higher-dimension field equations naturally broke into a set of ten formulae governing a tensor field representing gravity, four describing a vector field representing electromagnetism, and one wave equation for a scalar field. Furthermore, if the scalar field was constant, the vector field equations were just Maxwell's equations in vacuo, and the tensor field equations were the 4-dimensional Einstein field equations sourced by an EM field. In one fell swoop, Kaluza had written down a single covariant field theory in five dimensions that yielded the four dimensional theories of general relativity and electromagnetism. Naturally, Einstein was very interested in this preprint.
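
In modern notation the 5-dimensional ansatz behind this decomposition is often written as below (a standard textbook parametrization, not Kaluza's original notation), where g_{μν} carries gravity, A_μ electromagnetism and φ the scalar field:

```latex
\hat{g}_{AB}
=
\begin{pmatrix}
 g_{\mu\nu} + \phi^{2} A_{\mu} A_{\nu} & \phi^{2} A_{\mu} \\
 \phi^{2} A_{\nu}                      & \phi^{2}
\end{pmatrix},
\qquad
\partial_{5}\,\hat{g}_{AB} = 0 \quad \text{(the cylinder condition)} .
```

With φ constant, the 5-dimensional vacuum Einstein equations then split into the 4-dimensional Einstein equations sourced by the electromagnetic field and Maxwell's equations, as described above.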

I quickly divert attention to the world of Thomas Banchoff, because it is an extraordinary move away from all that we know is safe. It is not lost in some computer-animator world in which one engages and loses the self in the process. It is also to show that what Lee Smolin tried to distance himself from was, in fact, seeking to find itself understood in this way: a concurrent agreement that theoretics was trying to arrive at a consensus of different approaches saying the same thing.

Monte Carlo methods are a class of computational algorithms that rely on repeated random sampling to compute their results. Monte Carlo methods are often used in simulating physical and mathematical systems. Because of their reliance on repeated computation of random or pseudo-random numbers, these methods are most suited to calculation by a computer and tend to be used when it is unfeasible or impossible to compute an exact result with a deterministic algorithm.[1]

Monte Carlo simulation methods are especially useful in studying systems with a large number of coupled degrees of freedom, such as fluids, disordered materials, strongly coupled solids, and cellular structures (see cellular Potts model). More broadly, Monte Carlo methods are useful for modeling phenomena with significant uncertainty in inputs, such as the calculation of risk in business, and for the evaluation of definite integrals, particularly multidimensional integrals with complicated boundary conditions. It is a widely successful method in risk analysis when compared with alternative methods or human intuition. When Monte Carlo simulations have been applied in space exploration and oil exploration, actual observations of failures, cost overruns and schedule overruns are routinely better predicted by the simulations than by human intuition or alternative "soft" methods.[2]
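
A minimal illustration of the method, estimating a definite integral (the area of a quarter circle, π/4) by random sampling:

```python
import math
import random

# Monte Carlo estimate of pi/4: the fraction of random points in the unit
# square that land inside the quarter circle x^2 + y^2 <= 1.
def quarter_circle_area(n_samples=1_000_000, seed=0):
    rng = random.Random(seed)
    hits = sum(1 for _ in range(n_samples)
               if rng.random() ** 2 + rng.random() ** 2 <= 1.0)
    return hits / n_samples

print(f"estimate = {quarter_circle_area():.4f}, exact pi/4 = {math.pi / 4:.4f}")
```

The error falls off like 1/sqrt(N) regardless of dimension, which is why the method scales to the multidimensional integrals mentioned above.
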
For me it had to make some sense that such a transference from artistic impressionism would help direct the mind to the ways and means by which quantum gravity was being inspected in terms of Monte Carlo methods. These had a surface value in my mind toward an accumulated acceptance of the geometry and methods used to model this understanding.




So you understand now how we arrived at an interpretation of the value of, let's say, Dyson's opinion about how we might view the Riemann Hypothesis?

Dyson, one of the most highly-regarded scientists of his time, poignantly informed the young man that his findings into the distribution of prime numbers corresponded with the spacing and distribution of energy levels of a higher-ordered quantum state. Mathematics Problem That Remains Elusive —And Beautiful By Raymond Petersen
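
That correspondence can be played with numerically: the nearest-neighbour spacings of eigenvalues of random Hermitian (GUE) matrices show the same "level repulsion" statistics conjectured for the nontrivial zeros of the Riemann zeta function. A small sketch using NumPy:

```python
import numpy as np

# Nearest-neighbour eigenvalue spacings of a GUE random matrix -- the statistics
# Dyson recognised in Montgomery's pair correlation of Riemann zeta zeros.
def gue_spacings(n=400, seed=0):
    rng = np.random.default_rng(seed)
    a = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    h = (a + a.conj().T) / 2                  # Hermitian (GUE-distributed) matrix
    eigs = np.sort(np.linalg.eigvalsh(h))
    bulk = eigs[n // 4: 3 * n // 4]           # stay away from the spectrum's edges
    s = np.diff(bulk)
    return s / s.mean()                       # normalise to unit mean spacing

s = gue_spacings()
print(f"fraction of spacings below 0.2: {(s < 0.2).mean():.3f}")
# Very small spacings are rare ("level repulsion"), unlike for independent random points.
```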



***

DNA Computing

DNA computing is a form of computing which uses DNA, biochemistry and molecular biology, instead of the traditional silicon-based computer technologies. DNA computing, or, more generally, molecular computing, is a fast developing interdisciplinary area. Research and development in this area concerns theory, experiments and applications of DNA computing. See: DNA computing
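
The classic example is Adleman's 1994 experiment, which encoded a small Hamiltonian-path problem in DNA strands and let the chemistry explore all candidate paths in parallel. The underlying combinatorial problem is easy to state in ordinary code; a brute-force sketch with a made-up graph (not Adleman's original 7-vertex instance):

```python
from itertools import permutations

# Brute-force Hamiltonian path search: visit every vertex exactly once.
edges = {(0, 1), (1, 2), (2, 3), (3, 4), (1, 3), (0, 2)}   # illustrative graph

def hamiltonian_path(n_vertices, start, end):
    for order in permutations(range(n_vertices)):
        if order[0] == start and order[-1] == end and \
           all(pair in edges for pair in zip(order, order[1:])):
            return order
    return None

print(hamiltonian_path(5, start=0, end=4))   # e.g. (0, 1, 2, 3, 4)
```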




***


Clifford of Asymptotia is hosting a guest post by Len Adleman: Quantum Mechanics and Mathematical Logic.

Today I’m pleased to announce that we have a guest post from a very distinguished colleague of mine, Len Adleman. Len is best known as the “A” in RSA and the inventor of DNA-computing. He is a Turing Award laureate. However, he considers himself “a rank amateur” (his words!) as a physicist.

Len Adleman-For a long time, physicists have struggled with perplexing “meta-questions” (my phrase): Does God play dice with the universe? Does a theory of everything exist? Do parallel universes exist? As the physics community is acutely aware, these are extremely difficult questions and one may despair of ever finding meaningful answers. The mathematical community has had its own meta-questions that are no less daunting: What is “truth”? Do infinitesimals exist? Is there a single set of axioms from which all of mathematics can be derived? In what many consider to be on the short list of great intellectual achievements, Frege, Russell, Tarski, Turing, Godel, and other logicians were able to clear away the fog and sort these questions out. The framework they created, mathematical logic, has put a foundation under mathematics, provided great insights and profound results. After many years of consideration, I have come to believe that mathematical logic, suitably extended and modified (perhaps to include complexity theoretic ideas), has the potential to provide the same benefits to physics. In the following remarks, I will explore this possibility.

Wednesday, February 03, 2010

Different Approaches to a 5d world

Smolin: And there are published predictions for observable Planck scale deviations from energy momentum relations[22, 23] that imply predictions for experiments in progress such as AUGER and GLAST. For those whose interest is more towards formal speculations concerning supersymmetry and higher dimensions than experiment, there are also results that show how the methods of loop quantum gravity may be extended to give background independent descriptions of quantum gravity in the higher and super realms[31]-[35]. It thus seems like a good time for an introduction to the whole approach that may help to make the basic ideas, results and methods accessible to a wider range of physicists.

Dealing With a 5d World

I was trying to understand that once you see how the equation leads you to an understanding of that 5d world, it allows you to entertain every possibility based on this position.



Extra dimensions sound like science fiction, but they could be part of the real world. And if so, they might help explain mysteries like why the universe is expanding faster than expected, and why gravity is weaker than the other forces of nature.
Three dimensions are all we see -- how could there be any more? Einstein's general theory of relativity tells us that space can expand, contract, and bend. If one direction were to contract down to an extremely tiny size, much smaller than an atom, it would be hidden from our view. If we could see on small enough scales, that hidden dimension might become visible.

Here are some thoughts to consider?:)


Klein's Ordering of Geometries

A theorem which is valid for a geometry in this sequence is automatically valid for the ones that follow. The theorems of projective geometry are automatically valid theorems of Euclidean geometry. We say that topological geometry is more abstract than projective geometry, which in turn is more abstract than Euclidean geometry.

A VIEW OF MATHEMATICS by Alain CONNES
Most mathematicians adopt a pragmatic attitude and see themselves as the explorers of this "mathematical world" whose existence they don't have any wish to question, and whose structure they uncover by a mixture of intuition, not so foreign from "poetical desire", and of a great deal of rationality requiring intense periods of concentration.

Each generation builds a "mental picture" of their own understanding of this world and constructs more and more penetrating mental tools to explore previously hidden aspects of that reality.


Nature's Greatest Puzzle







This is a torus (like a doughnut) on which several circles are located. Unlike on a Euclidean plane, on this surface it is impossible to determine which circle is inside of which, since if you go from the black circle to the blue, to the red, and to the grey, you can continuously come back to the initial black, and likewise if you go from the black to the grey, to the red, and to the blue, you can also come back to the black.

Reichenbach then invites us to consider a 3-dimensional case (spheres instead of circles).






Figure 8 [replaced by our Figure 2] is to be conceived three-dimensionally, the circles being cross-sections of spherical shells in the plane of the drawing. A man is climbing about on the huge spherical surface 1; by measurements with rigid rods he recognizes it as a spherical shell, i.e. he finds the geometry of the surface of a sphere. Since the third dimension is at his disposal, he goes to spherical shell 2. Does the second shell lie inside the first one, or does it enclose the first shell? He can answer this question by measuring 2. Assume that he finds 2 to be the smaller surface; he will say that 2 is situated inside of 1. He goes now to 3 and finds that 3 is as large as 1.

How is this possible? Should 3 not be smaller than 2? ...

He goes on to the next shell and finds that 4 is larger than 3, and thus larger than 1. ... 5 he finds to be as large as 3 and 1.

But here he makes a strange observation. He finds that in 5 everything is familiar to him; he even recognizes his own room which was built into shell 1 at a certain point. This correspondence manifests itself in every detail; ... He is quite dumbfounded since he is certain that he is separated from surface 1 by the intervening shells. He must assume that two identical worlds exist, and that every event on surface 1 happens in an identical manner on surface 5. (Reichenbach 1958, 63-64)





THOMAS BANCHOFF has been a professor of mathematics at Brown University in Providence, Rhode Island, since 1967. He has written two books and fifty articles on geometric topics, frequently incorporating interactive computer graphics techniques in the study of phenomena in the fourth and higher dimensions


Today, however, we do have the opportunity not only to observe phenomena in four and higher dimensions, but we can also interact with them. The medium for such interaction is computer graphics. Computer graphic devices produce images on two-dimensional screens. Each point on the screen has two real numbers as coordinates, and the computer stores the locations of points and lists of pairs of points which are to be connected by line segments or more complicated curves. In this way a diagram of great complexity can be developed on the screen and saved for later viewing or further manipulation


Current research says something about how the brain/mind can construe reality in terms of randomness, or end up realizing some chaotic function. Well, if such chaos is measured in the heat of thinking, I am surprised we do not end up in some brain/mind heat death?:)

Monday, January 25, 2010

Poincaré Hyperbolic Disk

"Poincaré Hyperbolic Disk" from the Wolfram Demonstrations Project

See also:Poincaré Hyperbolic Disk

***

Hyperbolic Geometry


Geometric models of hyperbolic geometry include the Klein-Beltrami model, which consists of an open disk in the Euclidean plane whose open chords correspond to hyperbolic lines. A two-dimensional model is the Poincaré hyperbolic disk.

Weisstein, Eric W. "Hyperbolic Geometry." From MathWorld--A Wolfram Web Resource. http://mathworld.wolfram.com/HyperbolicGeometry.html
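
To make the disk model concrete, here is the hyperbolic distance between two points of the open unit disk, d(z1, z2) = arcosh(1 + 2|z1 - z2|^2 / ((1 - |z1|^2)(1 - |z2|^2))); note how distances blow up near the rim:

```python
import math

# Hyperbolic distance between two points of the Poincare disk (|z| < 1).
def poincare_distance(z1: complex, z2: complex) -> float:
    num = 2 * abs(z1 - z2) ** 2
    den = (1 - abs(z1) ** 2) * (1 - abs(z2) ** 2)
    return math.acosh(1 + num / den)

print(poincare_distance(0, 0.5))       # ~1.10
print(poincare_distance(0, 0.99))      # ~5.29
print(poincare_distance(0, 0.999999))  # ~14.5 -- the boundary circle is infinitely far away
```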
 
***






A computer-generated image showing the pattern of a p-mode solar acoustic oscillation both in the interior and on the surface of the sun. (l=20, m=16 and n=14.) Note that the increase in the speed of sound as waves approach the center of the sun causes a corresponding increase in the acoustic wavelength.

Helioseismology is the study of the propagation of wave oscillations, particularly acoustic pressure waves, in the Sun.

***

SOHO Reads the Solar Flares



Measurements of the Sun's oscillations provide a window into the invisible interior of the Sun allowing scientists to infer the structure and composition as well as the rotation and dynamics of the solar interior.

EIT (Extreme ultraviolet Imaging Telescope) images the solar atmosphere at several wavelengths, and therefore shows solar material at different temperatures. In the images taken at 304 Angstroms the bright material is at 60,000 to 80,000 degrees Kelvin. In those taken at 171 Angstroms, at 1 million degrees. 195 Angstrom images correspond to about 1.5 million Kelvin. 284 Angstrom, to 2 million degrees. The hotter the temperature, the higher you look in the solar atmosphere.


p-Modes

The mysterious source of these oscillations was identified by way of theoretical arguments in 1970 and confirmed by observations in 1975. The oscillations we see on the surface are due to sound waves generated and trapped inside the sun. Sound waves are produced by pressure fluctuations in the turbulent convective motions of the sun's interior. As the waves move outward they reflect off of the sun's surface (the photosphere) where the density and pressure decrease rapidly.
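
The surface pattern of a p-mode such as the (l = 20, m = 16) one pictured above is a spherical harmonic Y_l^m; below is a sketch of evaluating that pattern with SciPy. The mode numbers come from the caption, everything else is illustrative, and the radial (n = 14) structure would need a solar interior model.

```python
import numpy as np
from scipy.special import sph_harm   # deprecated in favour of sph_harm_y in recent SciPy

l, m = 20, 16                                 # mode numbers from the caption above
azimuth = np.linspace(0, 2 * np.pi, 360)      # longitude
colat   = np.linspace(0, np.pi, 180)          # colatitude
azimuth, colat = np.meshgrid(azimuth, colat)

# SciPy's argument order is sph_harm(m, l, azimuth, colatitude).
pattern = sph_harm(m, l, azimuth, colat).real
print(pattern.shape, float(pattern.min()), float(pattern.max()))
# 'pattern' reproduces the bright/dark surface banding shown in the figure.
```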


*** 
Its Effect on Earth




The plots on this page show the current extent and position of the auroral oval at each pole, extrapolated from measurements taken during the most recent polar pass of the NOAA POES satellite. "Center time" is the calculated time halfway through the satellite's pass over the pole.

Today's Space Weather

Any threat to communications is always seriously assessed. What we want to see on the other side of the Sun is whether any outburst is coming, that could seriously affect those same communications.

See Also:Backreaction: Reflections on the Sun

Sunday, January 24, 2010

Interplanetary Transport Network




This stylized depiction of the ITN is designed to show its (often convoluted) path through the solar system. The green ribbon represents one path from among the many that are mathematically possible along the surface of the darker green bounding tube. Locations where the ribbon changes direction abruptly represent trajectory changes at Lagrange points, while constricted areas represent locations where objects linger in temporary orbit around a point before continuing on




This book describes a revolutionary new approach to determining low energy routes for spacecraft and comets by exploiting regions in space where motion is very sensitive (or chaotic). It also represents an ideal introductory text to celestial mechanics, dynamical systems, and dynamical astronomy. Bringing together wide-ranging research by others with his own original work, much of it new or previously unpublished, Edward Belbruno argues that regions supporting chaotic motions, termed weak stability boundaries, can be estimated. Although controversial until quite recently, this method was in fact first applied in 1991, when Belbruno used a new route developed from this theory to get a stray Japanese satellite back on course to the moon. This application provided a major verification of his theory, representing the first application of chaos to space travel.

Since that time, the theory has been used in other space missions, and NASA is implementing new applications under Belbruno's direction. The use of invariant manifolds to find low energy orbits is another method here addressed. Recent work on estimating weak stability boundaries and related regions has also given mathematical insight into chaotic motion in the three-body problem. Belbruno further considers different capture and escape mechanisms, and resonance transitions.

Providing a rigorous theoretical framework that incorporates both recent developments such as Aubry-Mather theory and established fundamentals like Kolmogorov-Arnold-Moser theory, this book represents an indispensable resource for graduate students and researchers in the disciplines concerned as well as practitioners in fields such as aerospace engineering.
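
The Lagrange points that anchor these low-energy routes can be located approximately with the Hill-sphere estimate r ≈ a (m / 3M)^(1/3); a sketch using standard mass values (nothing taken from the book itself):

```python
# Approximate distance from the secondary body to its L1/L2 points (Hill-sphere estimate).
def l1_l2_distance(a_km, m_secondary, m_primary):
    return a_km * (m_secondary / (3.0 * m_primary)) ** (1.0 / 3.0)

M_sun, M_earth, M_moon = 1.989e30, 5.972e24, 7.342e22   # kg

print(f"Sun-Earth  L1/L2: ~{l1_l2_distance(1.496e8, M_earth, M_sun):,.0f} km from Earth")
print(f"Earth-Moon L1/L2: ~{l1_l2_distance(3.844e5, M_moon, M_earth):,.0f} km from the Moon")
# ~1.5 million km and ~61,000 km: the junctions where the network's ribbons meet.
```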


See: Interplanetary Superhighway Makes Space Travel Simpler, July 17 2002