Thursday, August 30, 2012

Radiation Belt Storms Probes Launched

NASA hosted a two-day event for 50 social media followers on August 22-23, 2012, at NASA's Kennedy Space Center in Florida. NASA's twin Radiation Belt Storm Probes (RBSP) were scheduled to lift off aboard a United Launch Alliance Atlas V rocket at 4:08 a.m. on August 23. Designed for a two-year primary science mission in orbit around Earth, RBSP will provide insight into our planet's radiation belts and help scientists predict changes in this critical region of space.

NASA's Radiation Belt Storm Probes blasted off from Cape Canaveral on August 30th, 2012. Bristling with sensors, the heavily-shielded spacecraft are on a 2-year mission to discover what makes the radiation belts so dangerous and so devilishly unpredictable.
"We've known about the Van Allen Belts for decades, yet they continue to surprise us with unexpected storms of 'killer electrons' and other phenomena," says mission scientist David Sibeck. "The Storm Probes will help us understand what's going on out there."


Each of the two Storm Probes is bristling with sensors to count energetic particles, measure plasma waves, and detect electromagnetic radiation. Learn more
See: The Radiation Belt Storm Probes

See also

Tuesday, August 28, 2012

Grail At the Moon

Gravity Recovery and Interior Laboratory (GRAIL)
NASA's Gravity Recovery And Interior Laboratory (GRAIL)-A spacecraft successfully completed its planned main engine burn at 2 p.m. PST (5 p.m. EST) today. As of 3 p.m. PST (6 p.m. EST), GRAIL-A is in a 56-mile (90-kilometer) by 5,197-mile (8,363-kilometer) orbit around the moon that takes approximately 11.5 hours to complete.

Visualisation of the “Geoid” of the Moon
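As a rough cross-check of the quoted orbit, Kepler's third law recovers the ~11.5-hour period from the two altitudes. The lunar radius and gravitational parameter below are standard reference values, not figures from the article:

```python
import math

# Assumed constants (not from the article): mean lunar radius ~1737.4 km,
# lunar GM ~4902.8 km^3/s^2.
R_MOON = 1737.4        # km
GM_MOON = 4902.8       # km^3/s^2

peri_alt, apo_alt = 90.0, 8363.0           # km, altitudes from the article
a = (2 * R_MOON + peri_alt + apo_alt) / 2  # semi-major axis, km

# Kepler's third law: T = 2*pi*sqrt(a^3 / GM)
period_s = 2 * math.pi * math.sqrt(a**3 / GM_MOON)
print(f"Orbital period: {period_s / 3600:.1f} hours")  # ~11.5 hours
```

The result lands within a few minutes of the "approximately 11.5 hours" quoted above, which is a good sanity check on the altitude figures.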

Sunday, August 26, 2012

Radiation Belt Storm Probes (RBSP)

The launch of an Atlas V carrying NASA's Radiation Belt Storm Probes (RBSP) payload was scrubbed today due to weather conditions associated with lightning, as well as cumulus and anvil clouds. With the unfavorable weather forecast as a result of Tropical Storm Isaac, the leadership team has decided to roll the Atlas V vehicle back to the Vertical Integration Facility to ensure the launch vehicle and twin RBSP spacecraft are secured and protected from inclement weather. Pending approval from the range, the launch is rescheduled to Thursday, Aug. 30 at 4:05 a.m. Eastern Daylight Time. See: RBSP Launch Targeted for No Earlier Than Aug. 30

RBSP is being designed to help us understand the Sun’s influence on Earth and Near-Earth space by studying the Earth’s radiation belts on various scales of space and time. 

The instruments on NASA’s Living With a Star Program’s (LWS) Radiation Belt Storm Probes (RBSP) mission will provide the measurements needed to characterize and quantify the plasma processes that produce very energetic ions and relativistic electrons. The RBSP mission is part of the broader LWS program whose missions were conceived to explore fundamental processes that operate throughout the solar system and in particular those that generate hazardous space weather effects in the vicinity of Earth and phenomena that could impact solar system exploration. RBSP instruments will measure the properties of charged particles that comprise the Earth’s radiation belts, the plasma waves that interact with them, the large-scale electric fields that transport them, and the particle-guiding magnetic field. 

The two RBSP spacecraft will have nearly identical eccentric orbits. The orbits cover the entire radiation belt region, and the two spacecraft will lap each other several times over the course of the mission. The RBSP in situ measurements will discriminate between spatial and temporal effects, and compare the effects of various proposed mechanisms for charged particle acceleration and loss. See: RBSP

Credit: NASA/Johns Hopkins University Applied Physics Laboratory
Engineers at the Johns Hopkins University Applied Physics Laboratory in Laurel, Md., prepare to place Radiation Belt Storm Probes spacecraft "B" in a thermal-vacuum chamber, where they can make sure the propulsion system will stand up to the range of hot, cold and airless conditions RBSP will face in outer space. This round of testing took place in late October-early November 2010.

See Also:

Saturday, August 25, 2012


SAMPEX, the Solar Anomalous and Magnetospheric Particle Explorer, was successfully launched by a Scout rocket on July 3, 1992. It is investigating the composition of local interstellar matter and solar material and the transport of magnetospheric charged particles into the Earth's atmosphere.

SAMPEX is a momentum-biased, sun-pointed spacecraft that maintains the experiment-view axis in a zenith direction as much as possible, especially while traversing the polar regions of the Earth. It points its solar array at the Sun by aiming the momentum vector toward the Sun and rotating the spacecraft one revolution per orbit about the Sun/spacecraft axis.

The Solar Anomalous and Magnetospheric Particle Explorer (SAMPEX) satellite was launched in July 1992 into a low earth orbit at an altitude of 520 by 670 km and 82 degrees inclination. The satellite far exceeded its expected three-year lifetime. It has primarily operated in a three-axis stabilized mode but has also been spun for limited periods. The satellite carries four instruments designed to measure the radiation environment of the Earth's magnetosphere.

SAMPEX was an international collaboration between NASA and Germany.[2] It was part of the Small Explorer program, started in 1989.[2]
The SAMPEX science mission ended on June 30, 2004.[3]

The Crown of the Creation Syndrome

Innumerable suns exist; innumerable earths revolve around these suns in a manner similar to the way the seven planets revolve around our sun. Living beings inhabit these worlds. Giordano Bruno, 1584

The paper was very much appreciated by many of the author's colleagues, even very prominent ones, although several others criticized it as unscientific, belonging more appropriately to the category of science fiction. The reason for this discrepancy, Gato-Rivera says, is due solely to prejudice (similar to the prejudices that greeted the biological evolution described by Darwin and his colleagues). In fact, Gato-Rivera even coined the term "the Crown of the Creation Syndrome" in her paper to explain this kind of prejudice, which she discusses in some detail.

 Beams of neutrinos have been proposed as a vehicle for communications under unusual circumstances, such as direct point-to-point global communication, communication with submarines, secure communications and interstellar communication. We report on the performance of a low-rate communications link established using the NuMI beam line and the MINERvA detector at Fermilab. The link achieved a decoded data rate of 0.1 bits/sec with a bit error rate of 1% over a distance of 1.035 km, including 240 m of earth.
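To put the reported 0.1 bits/sec in perspective, here is the arithmetic for transmitting a short word over such a link. The Fermilab demonstration transmitted the word "neutrino"; the 8-bit-per-character framing below is an illustrative assumption, not the actual encoding used:

```python
# Back-of-the-envelope throughput for the reported neutrino link.
message = "neutrino"
data_rate = 0.1                  # bits per second, from the paper
bits = 8 * len(message)          # 64 bits, assuming 8 bits per character
seconds = bits / data_rate
print(f"{bits} bits -> {seconds / 60:.1f} minutes")  # ~10.7 minutes
```

Roughly ten minutes for one word, which is why the authors frame this as a proof of principle for unusual circumstances rather than a practical channel.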

Kapusta points out that the condensation temperature would be well below the cosmic background temperature, so it would be quite a feat to make this superfluid. However, Kapusta also notes that a sufficiently advanced civilization might use pulses of neutrino superfluid for long-distance communications. See: The right spin for a neutrino superfluid


Wednesday, August 22, 2012

Fractals and Antennas and The Economy

The intuitive framework has to recognize that you have already worked the angles and that such intuition is gathered from all that has been worked. This contradicts what you are saying. I am not saying it is right, just that I have seen this perspective develop as scientists push through the wall that has separated them from moving on. This then details a whole new set of parameters in which the thinking mind can move forward with proposals.

I never quite could get the economy either, until I understood the idea of fractals as a gesture of the underlying pattern of the whole economy in expression. Of course, that is my point of view. I might have called it the algorithm before.

The idea here is that all things are expressions of the underlying pattern, and you might call the end result the psychology or sociology of thinking and life.

It seems that the accumulated reference of mind, at this place in its evolution, is to see that all the statistical information is already parametrized by the judgements you give it personally.

Ultimately this is the setting in which your conclusions guide your perspective; yet, when we look back, one can choose to "guide their brain."

If you did not pick it up, Benoit was able to reduce the economy too, using an inductive and deductive facility with regard to what is self-evident. But I would point out that what you might have interpreted as illusory in the graph he sees on the board was instrumental to his penetrating the pattern in the economy.

Just raising the name of Nassim Nicholas Taleb, and the idea of the Black Swan in relation to Benoit's basis for the economy, raises deeper questions and does garner a look from me. I don't know what to expect; it is opening up the door to understanding more about such erudition with regard to the economy.

Taleb was collaborating with Benoit Mandelbrot on a general theory of risk management. See: Collaborations

A simple assumption about heads and tails leads to bell curves and such?
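The heads-and-tails remark is the central limit theorem in miniature: sums of many independent coin flips pile up into a bell curve. A minimal simulation (trial counts are illustrative) shows it, and Taleb's point is precisely that real markets need not behave this way:

```python
import random
from collections import Counter

random.seed(1)

# Count heads in 100 fair flips, repeated many times: the totals
# cluster symmetrically around 50, approaching a Gaussian shape.
n_flips, n_trials = 100, 20000
counts = Counter(sum(random.randint(0, 1) for _ in range(n_flips))
                 for _ in range(n_trials))

mean = sum(k * v for k, v in counts.items()) / n_trials
print(f"mean heads: {mean:.1f}")  # close to 50
# Crude text histogram of the central region:
for heads in range(45, 56):
    print(f"{heads:3d} {'#' * (counts[heads] // 50)}")
```

The bell curve only follows from the independence and finite-variance assumptions; the Black Swan argument is that market returns violate them.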

Taleb, N. N. (2008) Edge article: Real Life is Not a Casino

So you are looking at both sides of the coin.

More on the Black Swan here.

While these writings are disparate pieces, do they indeed come together under this post-book-review? As a scientist and mathematics person, are you not intrigued by "the pattern?" I was shocked... yet it made sense.

Now Nassim adds dimension to the subject. "He calls for cancellation of the Nobel Memorial Prize in Economics, saying that the damage from economic theories can be devastating".

Game theory, if you know how it works, is used in all types of negotiation.

See Also:

Thursday, August 16, 2012

Sarah Parcak: Archeology from space and more

 Sarah Parcak: Archeology from space

 Sarah Parcak is an archaeologist and Egyptologist, and specializes in making the invisible past visible using 21st-century satellite technology. She co-directs the Survey and Excavation Projects in the Fayoum, Sinai, and Egypt's East Delta with her husband, Dr. Greg Mumford. Parcak is the author of Satellite Remote Sensing for Archaeology, the first methods book on satellite archaeology, and her work has seeded several TV documentaries. She founded and directs the Laboratory for Global Observation at the University of Alabama at Birmingham.

While most Google Earth hobbyists are satisfied with a bit of snapping and geotagging, some have far loftier ambitions. Satellite archaeologist Angela Micol thinks she's discovered the locations of some of Egypt's lost pyramids, buried for centuries under the earth, including a three-in-a-line arrangement similar to those on the Giza Plateau. Egyptologists have already confirmed that the secret locations are undiscovered, so now it's down to scientists in the field to determine if it's worth calling the diggers in.

Tuesday, August 14, 2012

Worldwide LHC Computing

Grid Cafe

Individual computers also keep becoming more powerful, which means that computer grids are increasingly able to solve ever more complex problems. All this computing power helps our scientists find solutions to the big questions, like climate change and sustainable power.

The mission of the WLCG project is to provide global computing resources to store, distribute and analyse the ~25 Petabytes (25 million Gigabytes) of data annually generated by the Large Hadron Collider (LHC) at CERN on the Franco-Swiss border.
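A quick sketch of what ~25 Petabytes per year means as a sustained rate (real data-taking is burstier than this calendar average):

```python
# Average the quoted annual LHC data volume over a year.
petabytes_per_year = 25          # from the WLCG figure above
bytes_per_year = petabytes_per_year * 10**15
seconds_per_year = 365 * 24 * 3600
rate_gb_s = bytes_per_year / seconds_per_year / 10**9
print(f"average rate: {rate_gb_s:.2f} GB/s")  # ~0.79 GB/s
```

Nearly a gigabyte every second, around the clock, which is why the storage and distribution load is spread across a global grid rather than kept at CERN alone.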

Current WLCG sites

Architecture of Trigger System

Block diagram of front-end electronics and its interface to Trigger, DAQ and Detector Control System.
Detailed descriptions of the front-end architecture can be found in the following LHCb notes:
EDMS715154, Requirements to the L1 front-end electronics.
LHCb-2001-014, Requirements to the L0 front-end electronics.
EDMS 692583, Test, time alignment, calibration and monitoring in the LHCb front-end electronics.

The HLT (High Level Trigger) has access to all data. At the 1 MHz output rate of Level-0, the remaining analogue data is digitized and all data is stored for the time needed to process the HLT algorithm. This algorithm is implemented on an online trigger farm composed of up to 2000 PCs.

The HLT algorithm is divided into two sequential phases, called HLT1 and HLT2. HLT1 applies a progressive, partial reconstruction seeded by the L0 candidates. Different reconstruction sequences (called alleys), with different algorithms and selection cuts, are applied according to the L0 candidate type. The HLT runs very complex physics tests to look for specific signatures, for instance matching tracks to hits in the muon chambers, or spotting photons through their high energy but lack of charge. Overall, from every one hundred thousand events per second it selects just a few dozen, and the remaining tens of thousands are thrown out. We are left with only the collision events that might teach us something new about physics.

With this telescope, Jill's vision, and the power of open-source initiatives, we were able to globalize the search for extraterrestrial intelligence. Because we don't know what a new signal will look like, it's hard to create an algorithm to find it, and our own eyes actually work better than computers. See: The search for cosmic company goes on

  • Einstein@Home

  • LIGO:

  • See:

Friday, August 10, 2012

Perseid Meteor Shower

Visit for more.

The Perseid meteor shower is underway. There's more to see than meteors, however, when the shower peaks on August 11th through 13th. The brightest planets in the solar system are lining up in the middle of the display.

Thursday, August 09, 2012

Mechanical Converted Sounds of Operation

MSL Curiosity's Alpha Particle X-ray Spectrometer, with a ruler
• Alpha-particle X-ray spectrometer (APXS): This device can irradiate samples with alpha particles and map the spectra of X-rays that are re-emitted for determining the elemental composition of samples.

Wednesday, August 08, 2012

Sphere and Sound Waves

Don demonstrates water oscillations on a speaker in microgravity, and ZZ Top rocks the boat 250 miles above Earth. See: Science off the Sphere: Space Soundwaves
So of course I might wonder about cymatics in space. It's more the idea that you could further experiment with the environment that life on the space station may provide in opportunity. That's all. :)

There is a reason why I am presenting this blog entry.

It has to do with a comparison that came to mind between our Earth and the relationship we might see to a sphere of water. Most will know from my blog the relevant topic of isostatic adjustment in terms of planet design and formation. It is also about gravity and elemental considerations in terms of the shape of the planet.

Now sure, we can expect certain things from the space environment in terms of molecular arrangement, but of course my views go much deeper in terms of the makeup of that space, given the constituents of early universe formations. So, given these two states for examination, I had an insight into how one might arrange modularization by using the space environment to capitalize.

So there is something forming in my mind here about the inherent nature of the matter constituents: deeper than the design itself, are such arrangements predestined to become perfectly arranged according to the type of element associated with them?

I want to be in control of that, given a cloud of all constituents, so that I may choose how to arrange the mattered state of existence. A planet maker perhaps? :) Design the gravity field. There are reasons for this.

Image: NASA/JPL-
Planets are round because their gravitational field acts as though it originates from the center of the body and pulls everything toward it. With its large body and internal heating from radioactive elements, a planet behaves like a fluid, and over long periods of time succumbs to the gravitational pull from its center of gravity. The only way to get all the mass as close to the planet's center of gravity as possible is to form a sphere. The technical name for this process is "isostatic adjustment."

With much smaller bodies, such as the 20-kilometer asteroids we have seen in recent spacecraft images, the gravitational pull is too weak to overcome the asteroid's mechanical strength. As a result, these bodies do not form spheres. Rather, they maintain irregular, fragmentary shapes.
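A rough way to see why 20-kilometer asteroids stay lumpy while planets relax into spheres is to compare the central pressure of a uniform self-gravitating body, P_c = (2/3)πGρ²R², against the strength of rock. The density and strength values below are illustrative assumptions, not figures from the article:

```python
import math

# Estimate the radius above which self-gravity crushes a rocky body
# into a sphere: solve (2/3)*pi*G*rho^2*R^2 = Y for R.
G = 6.674e-11        # m^3 kg^-1 s^-2, gravitational constant
rho = 3000.0         # kg/m^3, assumed generic rock density
Y = 1e7              # Pa, assumed ~10 MPa crushing strength of rock

R = math.sqrt(3 * Y / (2 * math.pi * G * rho**2))
print(f"transition radius ~ {R / 1000:.0f} km")  # of order 100 km
```

An order-100-km answer is consistent with the text: 20-km asteroids sit well below the threshold and keep their fragmentary shapes, while planet-sized bodies cannot resist isostatic adjustment.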

See Also:

Tuesday, August 07, 2012

Antennae Starwave Formation

Supernova explosions are enriching the intergalactic gas with elements like oxygen, iron, and silicon that will be incorporated into new generations of stars and planets. X-ray: NASA/CXC/SAO/J.DePasquale; IR: NASA/JPL-Caltech; Optical: NASA/STScI

A beautiful new image of two colliding galaxies has been released by NASA's Great Observatories. The Antennae galaxies, located about 62 million light years from Earth, are shown in this composite image from the Chandra X-ray Observatory (blue), the Hubble Space Telescope (gold and brown), and the Spitzer Space Telescope (red). The Antennae galaxies take their name from the long antenna-like "arms," seen in wide-angle views of the system. These features were produced by tidal forces generated in the collision. See: Antennae: A Galactic Spectacle

Info From PI-CMS Workshop on LHC and More

A slice of the CMS detector.

Broadly speaking, the aim of the talk is to give the theorists in the audience an introduction to state-of-the-art reconstruction (e.g. particle flow, techniques for dealing with high pile-up, the status of tau reconstruction) and their implications for searches. A discussion of triggering (whether focused on hadronic or more general) would also be very useful. Beyond these vague suggestions, you can define the scope of the talk however you think will do best to motivate, focus, and inform discussions about possible future analyses. The theorists in the audience will have a mix of BSM and SM expertise, and somewhat more appetite than average for experimental details. See:
Jet Reconstruction and Triggering

A block diagram of the CMS L1 trigger

In particle physics, a trigger is a system that uses simple criteria to rapidly decide which events in a particle detector to keep when only a small fraction of the total can be recorded. Trigger systems are necessary due to real-world limitations in data storage capacity and rates. Since experiments are typically searching for "interesting" events (such as decays of rare particles) that occur at a relatively low rate, trigger systems are used to identify the events that should be recorded for later analysis. Current accelerators have event rates greater than 1 MHz and trigger rates that can be below 10 Hz. The ratio of the trigger rate to the event rate is referred to as the selectivity of the trigger. For example, the Large Hadron Collider has an event rate of 1 GHz (10⁹ Hz), and the Higgs boson is expected to be produced there at a rate of at least 0.01 Hz. Therefore the minimum selectivity required is 10⁻¹¹. See: Taking A Closer Look
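The selectivity figure quoted above is just the ratio of the two rates:

```python
# Minimum trigger selectivity = interesting-event rate / total event rate.
event_rate = 1e9       # Hz, LHC event rate from the text
higgs_rate = 0.01      # Hz, expected Higgs production rate from the text
selectivity = higgs_rate / event_rate
print(f"minimum selectivity: {selectivity:.0e}")  # 1e-11
```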

Trigger system

To have a good chance of producing a rare particle, such as a Higgs boson, a very large number of collisions is required. Most collision events in the detector are "soft" and do not produce interesting effects. The amount of raw data from each crossing is approximately 1 MB, which at the 40 MHz crossing rate would result in 40 TB of data a second, an amount that the experiment cannot hope to store or even process properly. The trigger system reduces the rate of interesting events down to a manageable 100 per second.

To accomplish this, a series of "trigger" stages are employed. All the data from each crossing is held in buffers within the detector while a small amount of key information is used to perform a fast, approximate calculation to identify features of interest such as high-energy jets, muons or missing energy. This "Level 1" calculation is completed in around 1 µs, and the event rate is reduced by a factor of about a thousand, down to 50 kHz. All these calculations are done on fast, custom hardware using reprogrammable FPGAs.

If an event is passed by the Level 1 trigger, all the data still buffered in the detector is sent over fibre-optic links to the "High Level" trigger, which is software (mainly written in C++) running on ordinary computer servers. The lower event rate in the High Level trigger allows time for much more detailed analysis of the event than in the Level 1 trigger. The High Level trigger reduces the event rate by a further factor of about a thousand, down to around 100 events per second. These are then stored on tape for future analysis.
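Putting the quoted numbers from the two paragraphs above together (the "factor of about a thousand" figures are rounded; the nominal ratios come out somewhat smaller):

```python
# Rate and data-volume arithmetic for the CMS trigger chain.
crossing_rate = 40e6          # Hz, bunch crossings
event_size = 1e6              # bytes, ~1 MB per crossing
raw_rate = crossing_rate * event_size
print(f"raw: {raw_rate / 1e12:.0f} TB/s")                # 40 TB/s

l1_out = 50e3                 # Hz after the Level 1 trigger
hlt_out = 100.0               # Hz after the High Level trigger
print(f"L1 reduction: x{crossing_rate / l1_out:.0f}")    # x800
print(f"HLT reduction: x{l1_out / hlt_out:.0f}")         # x500
print(f"stored: {hlt_out * event_size / 1e6:.0f} MB/s")  # 100 MB/s
```

From 40 terabytes a second of raw detector data down to about 100 megabytes a second written to tape: a combined rejection factor of roughly 400,000.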

See Also: