Energy Evolution Program

Sunday, February 23, 2014

Revisiting NASA’s "Is Warp Drive Possible?"

Revisiting:  NASA’s Wormhole Warp through curved space, back and forth through time – all nonexistent possibilities

Is Warp Drive Possible?
NASA researchers have been testing the practicality of a faster-than-light warp drive. A re-imagining of the Alcubierre Drive may eventually result in an engine that can transport a spacecraft to the nearest star in a matter of weeks – and all without violating Einstein's theory of relativity.
Star Trek's warp drive may not be restricted to the world of science fiction after all. A few months ago, physicist Harold White shocked the aeronautics world when he announced that he and his team at NASA had begun work on the development of a warp drive. His proposed design, an ingenious re-imagining of an Alcubierre Drive, may eventually result in an engine that can transport a spacecraft to the nearest star in a matter of weeks. Read more: http://www.33rdsquare.com/2013/06/is-warp-drive-possible.html


The Alcubierre drive 

http://en.wikipedia.org/wiki/Alcubierre_drive The Alcubierre drive or Alcubierre metric (referring to metric tensor) is a speculative idea based on a solution of Einstein's field equations in general relativity as proposed by theoretical physicist Miguel Alcubierre, by which a spacecraft could achieve faster-than-light travel if a configurable energy-density field lower than that of vacuum (i.e. negative mass) could be created. Rather than exceeding the speed of light within its local frame of reference, a spacecraft would traverse distances by contracting space in front of it and expanding space behind it, resulting in effective faster-than-light travel.
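
For reference, the line element Alcubierre proposed makes the "contract in front, expand behind" description concrete. The form below (in units where c = 1) is the one given in his 1994 paper; x_s(t) is the position of the bubble centre, v_s its speed, R roughly the bubble radius, and σ a parameter controlling how sharp the bubble wall is:

ds^2 = -dt^2 + \left[\,dx - v_s(t)\, f(r_s)\, dt\,\right]^2 + dy^2 + dz^2,
\qquad v_s(t) = \frac{dx_s(t)}{dt},
\qquad r_s(t) = \sqrt{\big(x - x_s(t)\big)^2 + y^2 + z^2},

f(r_s) = \frac{\tanh\!\big(\sigma (r_s + R)\big) - \tanh\!\big(\sigma (r_s - R)\big)}{2\,\tanh(\sigma R)}.

The negative-energy requirement mentioned above comes from asking what stress-energy Einstein's equations would demand in order to source this geometry.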

“Two Loopholes to alter the properties of space and time”:
·        One is a wormhole
·        The other is a space warp

Non-programmed re-assessment of basic science fundamentals

Non-programmed re-assessment of basic science fundamentals, with critical, creative thought derived from self-discovery, is now necessary to update, upgrade and revise physics’ standard model. The stumbling blocks of “local frame of reference” across extreme scale variances, among all the intertwined factors of nature (space-time-mass-matter-energy-gravity), without a common denominator, and interjected with erroneously defined complex numbers and the square root of -1, will not go away by using a ‘buggy whip’ tool (from horse-and-buggy days) to drive a starship.

Not to insult anyone’s intelligence, let us look at a very simple concept – going from Chicago to New York at three different speeds – to review and identify the factors in play.

[Illustration: Chicago to New York at three speeds – walking, by airliner, and at the velocity of light]

In the walk and flight scenarios, distance only appears to have shortened, due to the far smaller time involved (2nd option), caused by the greater energy differential substituted for the time variable.

With this obvious clue (energy-time substitution) providing insight into the total interdependent relationships between space-time-mass-matter-energy-gravity, we challenge the reader to do some critical thinking about the factors involved in the third option: Chicago to New York at VC (velocity of light) speed.
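
For readers who want numbers to think with, here is a minimal sketch, assuming a straight-line Chicago–New York distance of about 1,150 km, a 5 km/h walk, a 900 km/h airliner, and a trip at 99.9% of the velocity of light. It compares the travel time seen from the ground, the classical kinetic energy per kilogram of traveler, and the time experienced on board; all figures are illustrative only.

import math

C = 299_792_458.0          # velocity of light, m/s
DISTANCE = 1_150_000.0     # assumed Chicago-New York distance, m

def trip(name, speed_m_s):
    """Coordinate travel time, classical kinetic energy per kilogram of
    traveler, and on-board (proper) time for the assumed distance."""
    t = DISTANCE / speed_m_s                    # time seen by a stay-at-home observer
    ke = 0.5 * 1.0 * speed_m_s ** 2             # classical kinetic energy, J per kg
    gamma = 1.0 / math.sqrt(1.0 - (speed_m_s / C) ** 2)
    proper_t = t / gamma                        # time experienced by the traveler
    print(f"{name:>12}: t = {t:10.3e} s, KE/kg = {ke:10.3e} J, on-board t = {proper_t:10.3e} s")

trip("walk",     5_000.0 / 3600.0)      # ~5 km/h
trip("airliner", 900_000.0 / 3600.0)    # ~900 km/h
trip("0.999 c",  0.999 * C)             # third option: just under the velocity of light

The distance itself never changes; only the time required and the energy budget do, which is the energy-time substitution noted above.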

Recognition of what you can or cannot SEE, by science’s own definition of light, and the fact that the VC energy differential – the quantity C – is the kinetic energy equivalent of the mass energy of matter, will go a long way toward clearing up wormholes, space and time curves and warps, black and white holes, and many other interferences and obstructions blocking our expanding window to reality.
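
One conventional way to unpack “the kinetic energy equivalent of the mass energy of matter” (a reading assumed here, not spelled out in the text) is to compare relativistic kinetic energy with rest-mass energy; the two become equal at roughly 86.6% of the velocity of light:

E_{\text{rest}} = mc^2, \qquad E_{\text{kin}} = (\gamma - 1)\,mc^2, \qquad \gamma = \frac{1}{\sqrt{1 - v^2/c^2}},

E_{\text{kin}} = E_{\text{rest}} \;\Rightarrow\; \gamma = 2 \;\Rightarrow\; v = \tfrac{\sqrt{3}}{2}\,c \approx 0.866\,c.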

Recall:  The Natural laws of space-time-mass-matter-energy-gravity are not absolute, but relative. That is, the size and shape of the curve of one law is dependent upon the value and position of the others – and/or – the value of one can be altered between any two reference points by altering the value or relationship of the other. In this example, we note time and energy’s relationship variance follows the same curve of natural law which is apparent in the operation of all the basic factors of nature, and again the radius of that curvature is measured by the quantity C.
The natural laws do not follow a straight line reaching to infinity, but a curve of finite radii. In a timeless universe, this curve, in any given plane, would be represented by a circle, but since the laws operate through time as well as space, the curve may be more readily understood if depicted as a "sine curve" or "wave". The "base" line of the wave (which is the center line of the curve) represents zero, and the portions above and below the line represent the positive and negative aspects of the law.  Thus we see that there are points and conditions in which the natural laws reach zero value with respect to a given reference point, and that beyond these points the laws become negative, reversing their effect with respect to the observer.
The "limitations of relativity" will always precede us at a distance equal to the radius of curvature of natural law. We need not fear that we will ever overtake or be hampered in any way by those limitations, since our 'reference point' (of a true coordinate system) will always go along with us. The value of the natural laws (gravitational field included) at any given point is controlled by the values of the other factors of nature at that point. 
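
Purely as an illustration of that description – a sine-like curve about a zero base line, positive on one side of a zero point and negative beyond it – the sketch below evaluates such a curve at a few displacements from a chosen reference point. The amplitude and the use of the quantity C as a scale factor are arbitrary stand-ins, not derived values.

import math

C = 299_792_458.0   # the quantity C, used here only as an arbitrary scale factor

def curve_value(displacement, scale=C):
    """Sine-like curve about a zero base line: positive on one side of a
    zero point, zero at the crossing, and negative beyond it, as described
    in the text."""
    return math.sin(displacement / scale)

for x in (0.25, 0.5, 1.0, math.pi, 1.5 * math.pi):
    v = curve_value(x * C)   # displacement expressed in multiples of the scale
    sign = "positive" if v > 1e-12 else ("negative" if v < -1e-12 else "zero")
    print(f"displacement = {x:5.2f} * scale  ->  value = {v:+.3f}  ({sign})")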

Radius of Curvature of all Natural Law:

Tuesday, February 4, 2014

Taking the “Flat World” Limited View Out of the Standard Model of Physics

Currently, the Standard Model of physics accurately predicts only about 4% of our universe. The other 96% is "missing" and composed of "dark matter" and "dark energy". Perhaps what is missing is the incredibly abundant energy that exists within SPACE itself. It is missing from the equations because of something in the standard model known as "renormalization", whereby the infinite energy density of the vacuum of space was effectively swept under the rug mathematically, and the equations then proceeded as if this energy were not important to include in a theory that is supposed, by definition, to include everything.

When the energy that exists in the vacuum of space itself is properly accounted for, the "missing" new type of matter that was invented out of thin air and added to the standard model to make it work is no longer necessary. When the incredible amount of energy present in the fabric of the vacuum itself is included, one can calculate that the proton, for example, has enough mass-energy inside its volume to create a tiny singularity at its center: a mini black hole.

See Nassim Haramein's paper "The Schwarzschild proton" for the details of this theory, which shows that only a very small fraction (~10^-39) of the vacuum fluctuations available within a proton volume need be cohered and converted to mass-energy in order for the proton to meet the Schwarzschild condition of a black hole. The Schwarzschild proton paper (PDF):
http://hiup.org/wp-content/uploads/2013/05/AIP_CP_SProton_Haramein.pdf
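
A back-of-the-envelope check of that Schwarzschild condition can be made directly from the Schwarzschild radius formula r_s = 2GM/c^2, rearranged for mass. The proton radius of ~1.32 fm used below is an assumption (roughly the value used in the paper); the final ratio shows where a figure of order 10^-39 comes from.

G = 6.674e-11          # gravitational constant, m^3 kg^-1 s^-2
C = 299_792_458.0      # velocity of light, m/s
R_PROTON = 1.32e-15    # assumed proton radius, m (~1.32 fm)
M_PROTON = 1.6726e-27  # measured proton mass, kg

# Mass whose Schwarzschild radius equals the assumed proton radius: M = r c^2 / (2G)
m_schwarzschild = R_PROTON * C**2 / (2.0 * G)

print(f"Schwarzschild mass for a proton-sized radius: {m_schwarzschild:.3e} kg")
print(f"Measured proton mass:                         {M_PROTON:.3e} kg")
print(f"Ratio (measured / Schwarzschild):             {M_PROTON / m_schwarzschild:.3e}")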


The disappearance act of the ROC – radius of curvature of all natural law (E.T. Whittaker, 1903-1904) – hugely assisted by Heaviside and Gibbs, and subsequently swept under the rug by the continued use of mathematical renormalization (clown?), helped create today’s standard-model farce of modern physics.

On A Testable Unification Of Electromagnetics, General Relativity, And Quantum Mechanics

T.E. Bearden
Association of Distinguished American Scientists
2311 Big Cove Road, Huntsville, Alabama 35801
Walter Rosenthal
4876 Bethany Lane, Santa Maria, California 93455
Copyright © 1991 by T.E. Bearden & W. Rosenthal, All Rights Reserved.

Abstract

Unrecognized for what it was, in 1903-1904 E.T. Whittaker (W) published a fundamental, engineerable theory of electrogravitation (EG) in two profound papers. The first (W-1903) demonstrated a hidden bidirectional EM wave structure in the scalar potential of vacuum, and showed how to produce a standing scalar EM potential wave -- the same wave discovered experimentally four years earlier by Nikola Tesla. W-1903 is a hidden variable theory that shows how to deterministically curve the local and/or distant space-time using EM. W-1904 shows that all force field EM can be replaced by interferometry of two scalar potentials, anticipating the Aharonov-Bohm effect by 55 years and extending it to the engineerable macroscopic world. W-1903 shows how to turn EM into G-potential, curve local and/or distant space-time, and directly engineer the virtual particle flux of vacuum. W-1904 shows how to turn G-potential and curvature of space-time back into force-field EM, even at a distance. The papers implement Sakharov's 1968 statement that gravitation is not a fundamental field of nature, but a conglomerate of other fields. Separately applied to electromagnetics (EM), quantum mechanics (QM), and general relativity (GR), an extended superset of each results. The three supersets are Whittaker-unified, so that a testable, engineerable, unified field theory is generated. EM, QM, and GR each contained a fundamental error that blocked unification, and these three errors are explained. The Schroedinger potential can also be structured and altered, indicating the direct engineering of physical quantum change. Recently Ignatovich has pointed out this hidden bidirectional EM wave structure in the Schroedinger potential, without referencing Whittaker's 1903 discovery of the basic effect. The potential for applying the new approach to explain the nature of mind and thought, and providing a laboratory-testable theory for them, is briefly noted and indicative major references cited. Some of the possible implications for physics and biology are pointed out.

Note that the problems/issues regarding scale invariance always appear at extremely large and small scales – at differentials approaching, equal to, and exceeding the equivalent of the QC, the velocity-of-light energy differential.

Scale Invariance – Thomas E. Phipps, Jr.: "... undertook both theoretical studies and various small-scale ... How legitimate is it to treat first-order invariance problems by second ..."


The Farce of Modern Physics

Plasma physicist Eric Lerner says that although the standard model makes valid predictions within broad limits of accuracy, it has no practical application beyond justifying the construction of ever-larger particle accelerators. http://www.davidpratt.info/farce.htm

Excerpts
·        Note the scale conditions where the flip goes from positive attraction to negative; note also that Heaviside preferred complex numbers and the square root of -1, rather than dealing with the QC as the radius of curvature of all natural law.
·        The weak nuclear force is a very curious type of ‘force’. Many orders of magnitude weaker than the electromagnetic force, it is responsible for radioactivity and hydrogen fusion, and supposedly converts neutrons into protons by tampering with quarks. The strong nuclear force between neutrons and protons is also very peculiar. Up to a distance of around 10^-15 m (1 fermi), it is very strongly repulsive, keeping nucleons apart. Then, for unknown reasons, it abruptly becomes very strongly attractive, before dropping off very rapidly. Current theory claims that this somehow results from the inter-quark gluon force ‘leaking’ out of the nucleon. Obviously if quarks don’t exist no force is required to hold them together. As for the force holding protons and neutrons together, some alternative theories argue that there are no neutrons in the atomic nucleus, only positive and negative charges held together by ordinary electrostatic forces.10
·        ‘the Maxwell theory fails many experimental tests and has only a limited range of validity ... The fanatical belief in the validity of Maxwell’s theory for all situations, like the fanatical belief in “special relativity”, which is purported to be confirmed by the Maxwell theory, continues to hamper the progress in physics.’1 Pioneering scientists and inventors Paulo and Alexandra Correa show that Maxwell’s fundamental errors included the wrong dimensional units for magnetic and electric fields and for current – ‘two epochal errors now reproduced for over a century, and which have done much to arrest the development of field theory’.2
·        Meanwhile, symmetry had been superseded by supersymmetry (or SUSY), which is rooted in the concept of spin. The basic idea is that matter particles (fermions) and force-carrier particles (bosons) are not really two different kinds of particles, but one. Each elementary fermion is assumed to have a boson superpartner with identical properties except for mass and spin. For each quark there is a squark, for each lepton a slepton, for each gluon a gluino, for each photon a photino, etc. In addition, for the bosonic Higgs field it is necessary to postulate a second set of Higgs fields with a second set of superpartners.
·        A major problem is that these new particles (known as ‘sparticles’) cannot have the same masses as the particles already known, otherwise they would have already been observed; they must be so heavy that they could not have been produced by current accelerators. This means that supersymmetry must be a spontaneously broken symmetry, and this is said to be a disaster for the supersymmetric project as it would require a vast array of new particles and forces on top of the new ones that come from supersymmetry itself. This completely destroys the ability of the theory to predict anything.
·        The reigning theory is that at the moment of the big bang the entire universe – including space itself – exploded into being out of nowhere in a random quantum fluctuation. Before it started to expand, it measured just 10^-33 cm across, and had infinite temperature and density. The main piece of evidence for this fairytale is that ‘space is expanding’. But no one has ever directly measured any expansion of space. The standard view is that space does not in fact expand within our own solar system or galaxy or within our local group of galaxies or even our own cluster of galaxies; instead, it only expands between clusters of galaxies – where, conveniently, there is no earthly chance of making any direct observations to confirm or refute it. Since space is surely infinite, how can it get any bigger?
·        As already mentioned, some physicists speak of a ‘quantum ether’. This refers to two things: 1) the zero-point field (ZPF), i.e. fluctuating electromagnetic radiation fields produced by random quantum fluctuations that, according to quantum theory, persist even at a temperature of absolute zero (-273°C); 2) innumerable pairs of short-lived ‘virtual’ particles (such as electrons and positrons), sometimes called the ‘Dirac sea’. Formally, every point of space should contain an infinite amount of zero-point energy. By assuming a minimum wavelength of electromagnetic vibrations, the energy density of the ‘quantum vacuum’ has been reduced to the still astronomical figure of 10^108 joules per cubic centimetre. [A rough version of this cutoff calculation is sketched after this list.]
·        Although various experimental results are widely interpreted as consistent with the existence of zero-point energy, further work is needed to test the theory and alternative explanations. Some scientists have theorized that mass, inertia, and gravity are all connected with the fluctuating electromagnetic energy of the ZPF. However, the ZPF itself is usually regarded as the product of matter-energy, which supposedly originated in the ‘big bang’, whereas modern ether theories generally hold that physical matter crystallizes out of or dissolves back into the preexisting ether. At present the only verified all-pervasive electromagnetic energy field is the cosmic microwave background radiation, which is commonly hailed as the afterglow of the big bang, but is also explicable as the temperature of space, or rather of the ether.3
·        Paul LaViolette has developed a theory known as ‘subquantum kinetics’, which replaces the 19th-century concept of a mechanical, inert ether with that of a continuously transmuting ether.4 Physical subatomic particles and energy quanta are pictured as wavelike or vortex-like concentration patterns in the ether. A particle’s gravitational and electromagnetic fields are said to result from the fluxes of different kinds of etheric particles, or etherons, across their boundaries, and the associated etheron concentration gradients.

·        LaViolette believes that an etheric subatomic particle might resemble the vorticular structures that theosophists Annie Besant and Charles Leadbeater observed during their clairvoyant examination of atoms from 1895 to 1933. They called these objects ‘ultimate physical atoms’ (UPAs), which they considered to be the basic unit of physical matter, existing on the seventh and highest (‘atomic’) subplane of our physical plane; they said that any effort to dissociate a UPA further caused it to disappear from our own plane of reality.5
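
The “astronomical figure” for the zero-point energy density quoted in the excerpt above can be reproduced in outline: summing one half-quantum of energy per electromagnetic mode up to a minimum wavelength gives an energy density of order ħ c k_max^4 / (8π²). The sketch below uses the Planck length as the assumed cutoff; different cutoff conventions shift the answer by a few orders of magnitude, which is why figures around 10^108 J/cm³ are commonly quoted.

import math

HBAR = 1.0546e-34        # reduced Planck constant, J s
C = 299_792_458.0        # velocity of light, m/s
L_MIN = 1.616e-35        # assumed minimum wavelength: the Planck length, m

k_max = 2.0 * math.pi / L_MIN                       # largest wavenumber kept
u = HBAR * C * k_max**4 / (8.0 * math.pi**2)        # zero-point energy density, J / m^3
print(f"zero-point energy density ~ {u:.2e} J/m^3 = {u * 1e-6:.2e} J/cm^3")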

MOVING BACK TO NORMAL (BUT NOT RENORMALIZATION)

CONSIDERATION OF THE LARGER FUNDAMENTAL PROPERTIES OF LIGHT AND ITS RELATIVES: SPACE TIME MASS MATTER ENERGY GRAVITY/(FIELDS)

Radius of Curvature of all Natural Law:

·        Light: 186,000 miles per second (energy differential)
·        Light2: The Radius of Curvature of all Natural Law – a sine wave, positive/negative, characterizing the  nature of every natural law
·        Light3: The kinetic energy equivalent of the mass energy of matter
·        Light4: The Big Blink – Inward: gravitational, 186,000 miles per second; Outward: radiation, 186,000 miles per second
·        Light5: Haramein/Rauscher’s derivation (reference below) that the protons in the nucleus of an atom, orbiting one another at nearly the speed of light in a vacuum, are essentially a black hole
 
The Resonance Project
The universe has a fundamental structure (geometry) and a fundamental dynamic (spin). Here is an excellent image showing how these two fundamental principles interact. If you make two phi spirals out from a central point, each in opposite directions, their intersecting points outline the nodal points of a star tetrahedron, a geometry that is the seed of the fabric of the vacuum: an infinite tetrahedral array.
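
As a small sketch of the construction described above (two phi-based spirals wound in opposite directions from a common centre), the snippet below generates points on two golden spirals, r = φ^(2θ/π), one traced counter-clockwise and one clockwise; whether their crossings outline the nodes of a star tetrahedron is the claim of the quoted text, not something this sketch verifies.

import math

PHI = (1.0 + math.sqrt(5.0)) / 2.0   # the golden ratio

def golden_spiral(theta, direction=+1):
    """Point on a golden (phi-based) spiral: r grows by a factor of phi
    every quarter turn; direction=-1 winds the spiral the opposite way."""
    r = PHI ** (2.0 * theta / math.pi)
    return (r * math.cos(direction * theta), r * math.sin(direction * theta))

# Trace both spirals over two full turns, sampled every quarter turn
for i in range(9):
    theta = i * math.pi / 2.0
    x1, y1 = golden_spiral(theta, +1)   # counter-clockwise
    x2, y2 = golden_spiral(theta, -1)   # clockwise (opposite direction)
    print(f"theta = {theta:6.3f}: ccw = ({x1:8.3f}, {y1:8.3f}), cw = ({x2:8.3f}, {y2:8.3f})")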

Eight star tetrahedrons put together create a 64-tetrahedron grid, forming two octaves of what Buckminster Fuller called the "vector equilibrium", otherwise known as the cuboctahedron, nested inside each other. Sixty-four is the fewest number of tetrahedrons you need to begin to see the infinite scalar fractal geometry that is the underlying geometric structure of the fabric of the entire universe.

The Resonance Project
  ~ Nassim Haramein ~ Fractal Enlightenment