Thursday, November 20, 2014

Naturalness 2014 - Weizmann Institute of Science

Information about the event was blogged by Professor Matt Strassler in his post At the Naturalness 2014 Conference. See also his explanation, Naturalness and the Standard Model.

Information about the event itself, from the conference description:

The discovery of a Higgs boson, with a mass around 125 GeV, at the LHC is a great victory for the Standard Model (SM). With its minimal scalar sector of electroweak symmetry breaking, the SM at short distances, well below the proton radius, is a complete weakly coupled theory. Even though the SM cannot explain several experimental observations such as neutrino masses, the baryon asymmetry of the universe and the origin of dark matter, one cannot deduce an energy scale at which the SM would be forced to be extended (with the exceptions of the Planck scale and the Landau pole of the hypercharge force). See: Naturalness 2014
***

 
In physics, naturalness is the property that the free parameters or physical constants appearing in a physical theory should take relative values "of order 1". That is, a natural theory would have parameters with values like 2.34 rather than 234,000 or 0.000234. This is in contrast to current theories such as the Standard Model, where a number of parameters vary by many orders of magnitude and require extensive "fine-tuning" of their values in order for the theory to predict a universe like the one we live in.

The requirement that satisfactory theories should be "natural" in this sense is a current of thought initiated around the 1960s in particle physics. It is an aesthetic criterion, not a physical one, that arises from the seeming non-naturalness of the standard model and the broader topics of the hierarchy problem, fine-tuning, and the anthropic principle.

It is not always compatible with Occam's razor, since many instances of "natural" theories have more parameters than "fine-tuned" theories such as the Standard Model.
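To make the fine-tuning complaint concrete, here is a rough back-of-the-envelope illustration (a sketch using only round numbers already mentioned in this post, a Higgs mass near 125 GeV and a Planck scale near 10^19 GeV, not a precise calculation):

\[
\frac{m_H}{M_{\mathrm{Pl}}} \;\sim\; \frac{1.25 \times 10^{2}\ \mathrm{GeV}}{1.2 \times 10^{19}\ \mathrm{GeV}} \;\sim\; 10^{-17},
\qquad
\left(\frac{m_H}{M_{\mathrm{Pl}}}\right)^{2} \;\sim\; 10^{-34}.
\]

On the usual reading of the hierarchy problem, quantum corrections to the Higgs mass squared are sensitive to the highest scales in the theory, so the observed value appears to require cancellations of roughly one part in 10^34; a "natural" theory, in the sense described above, would instead have such dimensionless ratios come out of order one.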
***

Now that naturalism has become an accepted component of philosophy, there has recently been interest in Kuhn's work in the light of developments in the relevant sciences, many of which provide corroboration for Kuhn's claim that science is driven by relations of perceived similarity and analogy. It may yet be that a characteristically Kuhnian thesis will play a prominent part in our understanding of science. (http://plato.stanford.edu/entries/thomas-kuhn/)
***

A non-technical discussion of the naturalness criterion and its implications for new physics searches at the LHC. To be published in the book "LHC Perspectives", edited by G. Kane and A. Pierce. See: Naturally Speaking: The Naturalness Criterion and Physics at the LHC (PDF)

Tuesday, November 18, 2014

Nima Arkani-Hamed Public Lecture: Quantum Mechanics and Spacetime in the 21st Century



Dr. Nima Arkani-Hamed (Perimeter Institute and Institute for Advanced Study) delivers the second lecture of the 2014/15 Perimeter Institute Public Lecture Series, in Waterloo, Ontario, Canada. Held at Perimeter Institute and webcast live worldwide on Nov. 6, 2014, Arkani-Hamed's lecture explores the exciting concepts of quantum mechanics and spacetime, and how our evolving understanding of their importance in fundamental physics will shape the field in the 21st Century. Perimeter Institute Public Lectures are held in the first week of each month. More information on Perimeter Public Lectures: http://ow.ly/DCYPc

Saturday, November 01, 2014

Consciousness as a Derivative of Reductionism?

The idea of a smallest length falls in line with the question of whether a measure can serve as the reductive basis of consciousness. Taken to that limit, the idea of a fundamental reality works its way into how an approach to consciousness is sought in terms of some fundamental unit.
The science and history of the minimal length have now been covered in a recent book by Amit Hagar:

Amit is a philosopher but he certainly knows his math and physics. Indeed, I suspect the book would be quite hard to understand for a reader without at least some background knowledge in math and physics. Amit has made a considerable effort to address the topic of a fundamental length from as many perspectives as possible, and he covers a lot of scientific history and philosophical considerations that I had not previously been aware of. The book is also noteworthy for including a chapter on quantum gravity phenomenology. See: Backreaction, "Is there a smallest length?"

The basis of this examination is to deduce whether or not there are fundamental units with regard to consciousness. The topic began as a question posed on a different forum in order to develop insight into this very issue.

As an INTJ personality, I have little patience for those who might get in the way of what I am seeking to accomplish.

Materialism is a form of philosophical monism which holds that matter is the fundamental substance in nature, and that all phenomena, including mental phenomena and consciousness, are the result of material interactions.
Materialism is closely related to physicalism: the view that all that exists is ultimately physical. Philosophical physicalism has evolved from materialism with the discoveries of the physical sciences to incorporate far more sophisticated notions of physicality than mere ordinary matter, such as spacetime, physical energies and forces, dark matter, and so on. Thus some prefer the term "physicalism" over "materialism", while others use the terms as if they were synonymous.

I sought to explain that there is a limit to which the word "measure" can be applied, and working toward a historical explanation of that limit demonstrates the care that went into this effort.
"I regard consciousness as fundamental. I regard matter as a derivative of consciousness. We cannot get behind consciousness. Everything we talk about, everything that we regard as existing, postulates consciousness. Max Planck

What was, and is, insightful to this progress was to identify a means by which consciousness "as experience" might be defined as "being" the reducible element. Cutting quickly to the heart of the issue: could measure be explained as a function? In this case, experience becomes that factor.

 I suggest that a theory of consciousness should take experience as fundamental. We know that a theory of consciousness requires the addition of something fundamental to our ontology, as everything in physical theory is compatible with the absence of consciousness. We might add some entirely new nonphysical feature, from which experience can be derived, but it is hard to see what such a feature would be like. More likely, we will take experience itself as a fundamental feature of the world, alongside mass, charge, and space-time. If we take experience as fundamental, then we can go about the business of constructing a theory of experience.
Nonreductive explanation, "Facing Up to the Problem of Consciousness"

To complicate matters, then: if experience is fundamental, does this refer to some fundamental unit of expression? Would experience have to be made up of something much more intricate, so that even where the measure of a given length no longer exists, the element still seeks to be expressed as a fact of consciousness?