
Monday, December 18, 2006

Gottfried Wilhelm von Leibniz

This is a historical reference as well as one leading to a conclusion. I won't state that conclusion for you; I simply present the idea, the "written word," and then you decide what the message is. You might find it disjointed, but it really isn't as you move through it.


Internet Philosophy: Gottfried Wilhelm Leibniz (1646-1716) Metaphysics


There are reasons why this article is being put up, and again, developing a little history behind "the line-up Lee Smolin prepared" is an important step in discerning why he may have gone down a certain route for comparative relations in terms of "against symmetry."


Click on the link: Against symmetry (Paris, June 06)

I have no one telling me this, just that any argument has to have its "foundational logic of approach," and learning to interpret why someone did something is sometimes just as important as the science they currently pursue, or have adopted, in light of other models and methods. It does not necessarily make them right. It's just that they are delving into model apprehension and devising the reasons why the model they choose to use "is" the desired one, given their current philosophical development and understanding.

So they have to present their logic.

The Identity of Indiscernibles

The Identity of Indiscernibles (hereafter called the Principle) is usually formulated as follows: if, for every property F, object x has F if and only if object y has F, then x is identical to y. Or in the notation of symbolic logic:

∀F(Fx ↔ Fy) → x=y

This formulation of the Principle is equivalent to the Dissimilarity of the Diverse as McTaggart called it, namely: if x and y are distinct then there is at least one property that x has and y does not, or vice versa.

The converse of the Principle, x=y → ∀F(Fx ↔ Fy), is called the Indiscernibility of Identicals. Sometimes the conjunction of both principles, rather than the Principle by itself, is known as Leibniz's Law.
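
As an aside, both halves of the Law can be written down and checked mechanically. Below is a minimal sketch in Lean 4 (my own illustration, not part of the quoted text): the Identity of Indiscernibles, in this second-order form, is proved by instantiating F with the property of being identical to x, and the converse follows by substitution.

    -- A minimal, illustrative sketch of Leibniz's Law in Lean 4.

    -- Identity of Indiscernibles: if x and y share every property, x = y.
    -- Instantiate F with the property of being identical to x.
    theorem identity_of_indiscernibles {α : Type} (x y : α)
        (h : ∀ F : α → Prop, F x ↔ F y) : x = y :=
      ((h (fun z => z = x)).mp rfl).symm

    -- Indiscernibility of Identicals: identical objects share every property.
    theorem indiscernibility_of_identicals {α : Type} (x y : α)
        (h : x = y) (F : α → Prop) : F x ↔ F y := by
      subst h; exact Iff.rfl

Together, the two theorems give the conjunction sometimes called Leibniz's Law.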


It is almost as if, for the computerized world to be developed further, "this logic" had to be based on some philosophical approach? It had to be derived from some developmental model, beyond the scope of "the approach to quantum gravity," that had its basis designed in the very area of research a university could be exploiting?


In 1671 Gottfried Wilhelm von Leibniz (1646-1716) invented a calculating machine which was a major advance in mechanical calculating. The Leibniz calculator incorporated a new mechanical feature, the stepped drum — a cylinder bearing nine teeth of different lengths which increase in equal amounts around the drum. Although the Leibniz calculator was not developed for commercial production, the stepped drum principle survived for 300 years and was used in many later calculating systems.
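
To make the stepped-drum idea concrete, here is a small Python sketch (my own illustration, not a faithful mechanical model): each decimal position has a drum whose engaged teeth add that digit into a result register, with carries tripped on to the next wheel.

    # Illustrative sketch of stepped-drum addition; digits are stored
    # least-significant first, and all names here are my own invention.
    def stepped_reckoner_add(register, setting):
        """Add the digits of `setting` into `register`, with carries."""
        result = list(register)
        carry = 0
        for i in range(len(result)):
            teeth = setting[i] if i < len(setting) else 0
            total = result[i] + teeth + carry   # drum engages `teeth` steps
            result[i] = total % 10
            carry = total // 10                 # carry lever trips next wheel
        return result

    # e.g. 047 + 85 = 132
    print(stepped_reckoner_add([7, 4, 0], [5, 8]))   # [2, 3, 1]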


This is not to say the developmental program disavows current research in all the areas to be considered. It's just that its approach is based on "some method" that is not easily discernible, even to the vast array of scientists currently working in so many research fields.

Why Quantum Computers?

On the atomic scale matter obeys the rules of quantum mechanics, which are quite different from the classical rules that determine the properties of conventional logic gates. So if computers are to become smaller in the future, new, quantum technology must replace or supplement what we have now. The point is, however, that quantum technology can offer much more than cramming more and more bits to silicon and multiplying the clock-speed of microprocessors. It can support an entirely new kind of computation with qualitatively new algorithms based on quantum principles!
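
To make the contrast concrete, here is a tiny Python/NumPy sketch (my own illustration, not from the quoted source): a single quantum gate, the Hadamard, puts a qubit into an equal superposition of 0 and 1, something no classical one-bit logic gate can do.

    import numpy as np

    ket0 = np.array([1.0, 0.0])                   # the classical bit 0 as a state vector
    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

    state = H @ ket0                              # apply the gate
    probabilities = np.abs(state) ** 2            # Born rule
    print(probabilities)                          # [0.5 0.5], an equal superposition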


Increasing complexity makes it very hard to describe complex systems. Imagine, if you were going from the top down, what constituent descriptors of reality we would have to manufacture if we wanted to speak about all those forms, and the complexity that makes up these forms?

Moore's Law

Moore's law is the empirical observation that the complexity of integrated circuits, with respect to minimum component cost, doubles every 24 months[1].
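
Put as arithmetic, the observation says component count grows as N(t) = N₀ · 2^(t/24), with t in months. A small Python sketch (the starting figure is just an example of mine):

    def moores_law(n0, months, doubling_period=24.0):
        """Projected component count: doubles every `doubling_period` months."""
        return n0 * 2 ** (months / doubling_period)

    # an example projection: a 2,300-component chip, ten years on
    print(round(moores_law(2_300, 10 * 12)))   # 73600, i.e. 32x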

Tuesday, June 06, 2006

Supersymmetry <-> Simplistically <-> Entropically Designed?



So of course I am troubled by my inexperience, as well as by the interest in what could have been produced in the "new computers" of the future? So in some weird sense, how would you wrap up the dynamics of what led to "Moore's law" and find that this consideration is now in trouble? While having wrapped the "potential chaoticness" in a systemic feature here as deterministic? Is this appropriate?

In the presence of a gravitational field (or, in general, of any potential field) the molecules of a gas are acted upon by gravitational forces. As a result the concentration of gas molecules is not the same at various points of space, and is described by the Boltzmann distribution law: n = n₀ exp(−U/kT), where n is the concentration at a point where a molecule has potential energy U (here U = mgh), n₀ the concentration where U = 0, k Boltzmann's constant, and T the temperature.


What happens exponentially in recognizing the avenues first debated between what was a consequence of "two paths": one that would more than likely be "a bazaar," while some would have considered the other "the cathedral"? Leftists should not be punished, Lubos :)

So what is Chaos then?

The roots of chaos theory date back to about 1900, in the studies of Henri Poincaré on the problem of the motion of three objects in mutual gravitational attraction, the so-called three-body problem. Poincaré found that there can be orbits which are nonperiodic, and yet not forever increasing nor approaching a fixed point. Later studies, also on the topic of nonlinear differential equations, were carried out by G.D. Birkhoff, A.N. Kolmogorov, M.L. Cartwright, J.E. Littlewood, and Stephen Smale. Except for Smale, who was perhaps the first pure mathematician to study nonlinear dynamics, these studies were all directly inspired by physics: the three-body problem in the case of Birkhoff, turbulence and astronomical problems in the case of Kolmogorov, and radio engineering in the case of Cartwright and Littlewood. Although chaotic planetary motion had not been observed, experimentalists had encountered turbulence in fluid motion and nonperiodic oscillation in radio circuits without the benefit of a theory to explain what they were seeing.

13:30 Lecture
Edward Norton Lorenz
Laureate in Basic Sciences
“How Good Can Weather Forecasting Become? – The Star of a Theory”
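
The forecasting limit Lorenz describes comes from sensitivity to initial conditions, and it is easy to see in miniature. A short Python sketch (my own illustration): two trajectories of the logistic map that start 10⁻¹⁰ apart diverge to order-one separation within a few dozen steps.

    def logistic(x, r=4.0):
        """One step of the logistic map, which is chaotic at r = 4."""
        return r * x * (1 - x)

    x, y = 0.2, 0.2 + 1e-10        # two nearly identical initial conditions
    for step in range(1, 61):
        x, y = logistic(x), logistic(y)
        if step % 20 == 0:
            print(f"step {step}: separation = {abs(x - y):.3e}")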





So this talk is then taken to "another level," and the distinction of Web 2.0 raised its head; and of course, if you read the exponential growth highlighted in communities' dissemination of all information, how could it be only Web 1.0 if held to the Netscape design?



I mean, definitely, if we were to consider "the Pascalian triangle" and the emergence of the numbered systems, who is to say the Riemann Hypothesis would not have emerged also? The "marble drop" as some inclusive designation of the development of curves in society, once raised from "an idea" drawn from some place?
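
The "marble drop" can be simulated directly. A small Python sketch (my own illustration): on a Galton board, each marble takes random left/right bounces past rows of pins; the bin counts follow the rows of the Pascalian triangle and, with many marbles, trace out the bell curve.

    import random
    from collections import Counter

    def drop_marble(rows=10):
        """Bin index = number of rightward bounces over `rows` pins."""
        return sum(random.randint(0, 1) for _ in range(rows))

    bins = Counter(drop_marble() for _ in range(10_000))
    for k in sorted(bins):
        print(f"bin {k:2d}: {'#' * (bins[k] // 50)}")   # a rough bell curve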

Sunday, May 28, 2006

Moore's Law Endangered?

Moore's Law (Wikipedia, 28 May 2006)

Moore's law is the empirical observation that the complexity of integrated circuits, with respect to minimum component cost, doubles every 24 months[1].


Clifford, in writing the brief article of interest, relays another article here for consideration.

Spotting the quantum tracks of gravity waves, by Zeeya Merali

Their calculations show that as the gravitational force from a passing wave slightly changes the momentum of the entangled particles, it should knock them out of their pristine spin state. In principle, that effect could be detected, but it is so small that no one has found a way to pick it up, explains Yeo. He and his team suggest that the effect could be amplified using a process called "entanglement swapping", which allows pairs of particles that have never been in contact to become entangled. "Spin and momentum become entangled to a higher degree so that changing one produces an even larger change in the other," says quantum physicist Chris Adami at the Jet Propulsion Laboratory in Pasadena, California.
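
A toy NumPy sketch may help fix the picture (my own illustration, far simpler than the proposal quoted above): the "pristine spin state" can be thought of as a Bell singlet, whose two spins are perfectly anticorrelated; any disturbance that leaks amplitude into another component degrades that correlation, the kind of small signature the authors want to amplify.

    import numpy as np

    # basis order: |00>, |01>, |10>, |11>
    singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

    def anticorrelation(state):
        """Probability that the two spins disagree in the z basis."""
        probs = np.abs(state) ** 2
        return probs[1] + probs[2]          # the |01> and |10> outcomes

    print(anticorrelation(singlet))         # ~1.0: perfect anticorrelation

    # a small, hypothetical disturbance leaking amplitude into |00>
    disturbed = singlet + 0.2 * np.array([1.0, 0.0, 0.0, 0.0])
    disturbed /= np.linalg.norm(disturbed)
    print(anticorrelation(disturbed))       # ~0.96: correlation degraded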


While some time may now have passed, it is worth mentioning again that "spintronics" has this role to play; yet, in Gravity Probe B, would the spherical valuations only now make sense on a large cosmological plate?

So by analogy, using Gravity Probe B, do we gain perspective on the relevance of change within that gravitational radiation?

A black hole is an object so massive that even light cannot escape from it. This requires the idea of a gravitational mass for a photon, which then allows the calculation of an escape energy for an object of that mass. When the escape energy is equal to the photon energy, the implication is that the object is a "black hole".
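
One common way to run the calculation the quote describes (a Newtonian heuristic, not a result of general relativity) is to treat the photon as a particle of mass m, set its kinetic energy at speed c equal to the escape energy from a mass M at radius r, and solve for the critical radius:

    % Newtonian escape-energy heuristic for the critical radius
    \frac{1}{2} m c^{2} = \frac{G M m}{r}
    \qquad\Longrightarrow\qquad
    r = \frac{2 G M}{c^{2}}

This heuristic value happens to coincide with the Schwarzschild radius of general relativity; the agreement is a coincidence of the Newtonian argument, not a derivation.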


Yet, it is of some concern that when we travel down to such microstates, are we able in fact to keep a pure and clean picture of what existed once, and had gone through the changes in "spin orientation and momentum"?

If the boundaries of the black hole are indeed collapsing to supersymmetrical proportions, then what use is photon information if it cannot describe for us something that is going on inside?

#18

The distinction is important, since the term gravity waves is primarily used in fluid dynamics to describe fluid oscillations that have gravity as their restoring force.

I noticed the link did not work, and I was looking for confirmation as to your statement. Not that you need it :)

So just to confirm the source, I reiterate it here again. If anyone is an expert, would they like to clean up the reference (does it need to be)?

(Gravitational waves are sometimes called gravity waves, but this term should be reserved for a completely different kind of wave encountered in hydrodynamics.)


Also, "the effect" while in the throes of gravity waves just to clarify the thinking(ocean waves and such), effects of Hulse and Taylor different, while the entanglement issue speaks to energy release is defined by photons passage of time as is?

What is the fastest way for it to get here without being influenced? The Lagrangian perspective [Edwin F. Taylor's least action principle], "tunnel transport," and the effects of lensing?

Of course, thinking about the nature of the types of high-energy photons (gamma) and what they can traverse may be confusing, yet distinctive?

One of the physical device limitations described by Dr. Packan is that transistor gates, as further miniaturization is pursued, will become so thin that quantum mechanical “tunneling” effects will arise. These quantum effects will create leakage current through the gate when the switch is “off” that is a significant fraction of the channel current when the device is “on”. This could reduce the reliability of the transistors, resulting in increased cost and decreased availability of more powerful chips.
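
The scaling problem Dr. Packan points to can be caricatured in a few lines of Python (figures entirely illustrative, not his): tunneling probability through a barrier falls off exponentially with thickness, so each fraction of a nanometer shaved off the gate multiplies the "off"-state leakage.

    import math

    def relative_leakage(thickness_nm, decay_length_nm=0.1):
        """Toy model: tunneling transmission ~ exp(-t / lambda)."""
        return math.exp(-thickness_nm / decay_length_nm)

    baseline = relative_leakage(2.0)
    for t in (2.0, 1.5, 1.0):
        print(f"gate {t:.1f} nm: leakage x {relative_leakage(t) / baseline:,.0f}")
    # thinning the gate from 2.0 nm to 1.0 nm multiplies leakage ~22,000-fold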