Process Physics


PROCESS PHYSICS: CHAOS IS THE PRIMA MATERIA
Reality has the form of a self-referential order-disorder information system

 

Monday, January 18, 2010

Deriving the Properties of the Universe

The properties of the Universe can be derived by thinking about the origin of complexity, says a new theory

Physicists and cosmologists have long noted that the laws of physics seem remarkably well tuned to allow the existence of life, an idea known as the anthropic principle.

It is sometimes used to explain why the laws of physics are the way they are. Answer: because if they were different, we wouldn't be here to see them.

To many people, that looks like a cop-out. One problem is that this way of thinking is clearly biased towards a certain kind of carbon-based life that has evolved on a pale blue dot in an unremarkable corner of the cosmos. Surely there is a more objective way to explain the laws of physics.

Enter Raphael Bousso and Roni Harnik at the University of California, Berkeley and Stanford University respectively. They point out that the increase in entropy in any part of the Universe is a decent measure of the complexity that exists there. Perhaps the anthropic principle can be replaced with an entropic one?

Today, they outline their idea and it makes a fascinating read. By thinking about the way entropy increases, Bousso and Harnik derive the properties of an average Universe in which the complexity has risen to a level where observers would have evolved to witness it.

They make six predictions about such a Universe. They say "typical observers find themselves in a flat universe, at the onset of vacuum domination, surrounded by a recently produced bath of relativistic quanta. These quanta are neither very dilute nor condensed, and thus appear as a roughly thermal background."

Sound familiar? It so happens that we live in a (seemingly) flat universe, not so long after it has become largely a vacuum and we're bathed in photons that form a thermal background. That's the cosmic infrared background that is emitted by galactic dust heated by starlight (this is different from the cosmic microwave background which has a different origin).

That's a remarkably accurate set of predictions from a very general principle. The question, of course, is how far can you run with a theory like this.

It certainly has the feel of a powerful idea. But, just like the anthropic principle, it also has the scent of circular reasoning about it: the universe is the way it is because if it were different, the complexity necessary to observe it wouldn't be here to see it.

That may not be so hard to stomach, given the power of the new idea. Even a hardened physicist would have to accept that Bousso and Harnik have a remarkably elegant way of capturing the state of the universe.

Ref: arxiv.org/abs/1001.1155: The Entropic Landscape

 

 

 

Random Process

Science itself is a questioning process. In science, a process is any sequence of changes in a real object or body that is observable using the scientific method. Therefore, all sciences analyze and model processes. Processes are always properties of dynamic systems, characterized by such system attributes as variables and parameters. Every process model has distinguished input and output variables, which can be autonomous or controlled.

The recognition of a process is an arbitrary, subjective mental operation, because it depends on circumstances: the observer's goal, perception and conceptualization tools. There are numerous taxonomies of processes with different qualities: continuous and discrete, stable and unstable, convergent and divergent, cyclic and acyclic, linear and nonlinear. They are also grouped according to the domain in which they are analyzed.

Chaos science is never independent of the mythos of the Western tradition of chaos and order. Cartesian views have been resequenced, placing chaos above and below order, enlivening the mechanistic view. Changed assumptions are embedded in this resequencing that harkens back to ancient alchemical notions of primordial chaos, reprioritizing organic models. Microcosm and macrocosm are not causes but domains of existence.

A random or stochastic process is the counterpart of a deterministic process. It's a different view of 'reality' and of how a process might evolve through time. In a random process there is some indeterminacy in the future evolution, described by probability distributions. Even if the initial condition (or starting point) is known, there are many possible paths, some more probable than others.

A random process indexed by time is a sequence of random variables known as a time series. Another basic type of stochastic process is a random field, whose domain is a region of space. In other words, in a random function the arguments are drawn from a range of continuously changing values, and the inputs are random variables. Hidden fractal order spans all domains but was not mathematically describable until the computer age. There are certain clues that chaos underlies a process.
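As a concrete illustration, a simple symmetric random walk is the textbook discrete-time stochastic process: the starting point is known exactly, yet every run traces a different path, and only the distribution of paths is predictable. A minimal sketch in Python (function name is ours):

```python
import random

def random_walk(steps, seed=None):
    """One sample path of a simple symmetric random walk:
    X_t = X_{t-1} + e_t, where e_t is +1 or -1 with equal probability."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(steps):
        x += rng.choice((-1, 1))
        path.append(x)
    return path

# Two walks from the same initial condition (x0 = 0) diverge:
a = random_walk(10, seed=1)
b = random_walk(10, seed=2)
print(a)
print(b)
```

Different seeds stand in for the indeterminacy: the initial condition alone no longer fixes the future.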

Process Universe

New models arise to correct the errors and fill in the gaps of conventional science. Process-oriented models are replacing static models of reality. Process physics is one such proposal with many merits, since it describes an experiential 4-d universe that agrees with current experimental evidence. It does not require dark matter or dark energy, nor the multiple dimensions of string or M-Theory. Its main proponent is the Australian researcher Reginald Cahill.

Process physics claims to be a model of reality that is designed to replace general relativity and unify it with quantum theory. The limitations of formal information systems discovered by Gödel, Turing and Chaitin, are used to replace the geometric modeling of time constructed by Galileo, Newton and Einstein, and to account for the measurement process in quantum theory. Process Physics is distinguished by modelling time as process. The ongoing failure of physics to fully match all the aspects of the phenomena of time, apart from that of order, arises because physics has always used non-process models, as is the nature of formal or syntactical systems. Such systems do not require any notion of process - they are entirely structural and static. The new process physics overcomes these deficiencies by using a non-geometric process model for time, but process physics also argues for the importance of relational or semantic information in modelling reality. Semantic information is information that is generated and recognised within the system itself.

"In Process Physics we start from the premise that the limits to logic, which are implied by Gödel's incompleteness theorems, mean that any attempt to model reality via a formal system is doomed to failure. Instead of formal systems we use a process system, which uses the notions of self-referential information with self-referential noise and self-organised criticality to create a new type of information-theoretic system that is realising both the current formal physical modelling of reality but is also exhibiting features such as the direction of time, the present moment effect and quantum state entanglement (including EPR effects, nonlocality and contextuality), as well as the more familiar formalisms of Relativity and Quantum Mechanics. In particular a theory of Gravity has already emerged.

In short, rather than the static 4-dimensional modelling of present day (non-process) physics, Process Physics is providing a dynamic model where space and matter are seen to emerge from a fundamentally random but self-organising system. The key insight is that to adequately model reality we must move on from the traditional non-process syntactical information modelling to a process semantic information modelling; such information is `internally meaningful'.

The new theory of gravity which has emerged from Process Physics is in agreement with all experiments and observations. This theory has two gravitational constants: G, the Newtonian gravitational constant, and a second dimensionless constant which experiment has revealed to be the fine structure constant. This theory explains the so-called `dark matter' effect in spiral galaxies, the bore hole gravitational anomalies, the masses of the observed black holes at the centres of globular clusters, and the anomalies in Cavendish laboratory measurements of G." --Reginald Cahill

 

The Fine Structure & Dynamics of Self-Interacting Space, a Dimensionless Constant & Transient Realities

Revisioning Process-oriented Time, Space, Matter & Gravity

Process physics is a reformulation of key concepts to include quantized space. Process-oriented science is a radical information-theoretic approach to the modeling of fundamental physics. It aims toward a theory of everything by abandoning the space-time construct of Galileo, Newton and Einstein, and by arguing that only a process can model time. Process physics uses a process model of time rather than a geometrical model. It includes differences between past, present and future that are not derivable from the geometric model.

Time in process physics is modelled as an iterative process, where each iteration is like the next present moment. Due to the randomness present in the iterative equation, the future is not completely predictable. Also, it is not possible to perform the inverse operation, meaning you cannot go backwards to the previous moments. Process physics thus predicts a static past, a continually changing present moment, and an unpredictable future - all of which is consistent with how we experience the passage of time.
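This iterative picture can be sketched as a toy recursion: each step produces the next 'present moment' from the current one plus irreducible noise, and because the noise is discarded once applied, the map cannot be run backwards. This is only an illustrative toy of our own devising, not Cahill's actual iterator:

```python
import random

def present_moment(state, rng):
    """One iteration: the next 'now' is a deterministic map of the
    current state plus irreducible noise. A hypothetical toy map."""
    return 0.5 * state + rng.gauss(0.0, 1.0)

rng = random.Random(42)
state = 0.0
history = [state]
for _ in range(5):
    state = present_moment(state, rng)
    history.append(state)

# The past is fixed (the recorded history), the future is unpredictable,
# and the map is non-invertible: the noise term cannot be recovered
# from the state alone.
print(history)
```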

Randomness Paradigm

Randomness is more fundamental than objects. Expanding space isn't empty but full of topological defects, like snags in the fabric of space that share nonlocality -- linked by transient wormholes. Matter emerges into 3-d view while simultaneously concealing its virtual origin through self-organizing criticality (SOC). Process physics eliminates infinite regression while generating a complete theory of quantum measurements.

Quantum phenomena are caused by fractal topological defects embedded in and forming a growing three-dimensional fractal process-space, which is essentially a quantum foam. Other features of the emergent physics are: quantum field theory with emergent flavour and confined colour, limited causality and the Born quantum measurement metarule, inertia, time-dilation effects, gravity and the equivalence principle, a growing universe with a cosmological constant, black holes and event horizons, and the emergence of classicality.

Process Physics suggests a hierarchical model of reality featuring a Universe that exhibits behavior very reminiscent of living systems. How should we categorize space? Is it a `thing' or is it a `process'? From Greek philosophers to the present day this question has been examined again and again, but has remained unanswered. Recent discoveries have cast a new light on this essential core of existence, and the experimental evidence strongly indicates that modern day physics has made some wrong assumptions. Is reality a side-effect of randomness?

Absolute Motion

Space has internal structure. Space and quantum physics are emergent and unified, described by a Quantum Homotopic Field Theory (a homotopy being the embedding of one space in another). This structure is described as a network of nodal points, with connections of varying strength between the nodes.

Space has a foamy quantum structure. The speed of absolute motion is comparable to that determined from the Cosmic Background Radiation anisotropy, though the direction is not revealed. So absolute motion is meaningful and measurable, thus refuting Einstein's assumption. This discovery shows that a major re-assessment of the interpretation of the Special and General Relativity formalism is called for, a task already provided by Process Physics.

Self-Organizing Iterative Seed Process

Reality can be modelled as self-organizing semantic or relational information using a self-referentially limited neural network model (trees of self-referential strong connections), where the information-theoretic limitations are implemented via self-referential noise, a fundamental aspect of reality. This modelling was motivated by the discovery that such stochastic neural networks are foundational to known quantum field theories.

In Process Physics time is a distinct non-geometric process while space and quantum physics are emergent and unified. Quantum phenomena are caused by fractal topological defects embedded in and forming a growing three-dimensional fractal process-space, which is essentially a quantum foam.

Embedded topological defects act as sinks for aether flow, i.e. ordinary matter. Reasonably, another set of embedded topological defects stabilized out of the initial chaos with the complementary property that they act as sources of aether flow. By producing aether, those sources would effectively move away from everything else and could thereby drive the Hubble expansion.

Sets of strongly linked nodes persist and become further linked by the iterator to form a three-dimensional process space with embedded topological defects. In this way the stochastic neural-network creates stable strange attractors while determining their interaction properties. This information is all internal to the system; it is the semantic information within the network.
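In Cahill's papers the iterator has roughly the schematic form B → B − α(B + B⁻¹) + w, where B is an antisymmetric matrix of link strengths and w is the self-referential noise. A drastically reduced sketch: assuming a 2×2 antisymmetric B, the iteration collapses to its single off-diagonal entry b, since (B + B⁻¹)₀₁ = b − 1/b. This is a caricature for illustration, not the full model:

```python
import random

def iterate_b(b, alpha, rng):
    """Scalar toy of the Process Physics iterator
    B -> B - alpha*(B + B^{-1}) + noise, reduced to the single
    off-diagonal entry b of a 2x2 antisymmetric matrix, for which
    (B + B^{-1})_{01} = b - 1/b."""
    return b - alpha * (b - 1.0 / b) + rng.gauss(0.0, 0.01)

rng = random.Random(0)
b = 2.0
for _ in range(200):
    b = iterate_b(b, 0.1, rng)

# The noise keeps perturbing the link strength, but the dynamics
# self-organise it toward the attractor near |b| = 1: a crude analogue
# of stable structure persisting out of randomness.
print(round(b, 2))
```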

Other features are the emergence of quantum field theory with flavour and confined colour, limited causality and the Born quantum measurement metarule, inertia, time-dilation effects, gravity and the equivalence principle, a growing universe with a cosmological constant, black holes and event horizons, and the emergence of classicality. The unification of the quantum foam structure of space with the quantum nature of matter amounts to the discovery of quantum gravity.

Quantum Gravity

Gravity is apparently an aspect of space which has gone through various possible explanations, from the force concept by Newton, as expressed in his famous inverse square law, to Einstein's curved spacetime formalism. However experiment and observations suggest that both of these explanations are seriously flawed. Both are in strong disagreement with observation.

Both the Newtonian and General Relativity theories for gravity may be re-formulated as turbulent in-flow dynamics in which a substratum is effectively absorbed by matter, with the gravitational force determined by inhomogeneities of that flow. Gravity is an inhomogeneous flow of quantum foam into matter.

Quantum gravity, as manifested in the emergent Quantum Homotopic Field Theory of the process-space or quantum foam, is logically prior to the emergence of the general relativity phenomenology, and cannot be derived from it. Gravity is essentially an in-flow effect associated with the loss of information.

Matter & Mass

Matter is described as topological defects in three-dimensional space that become persistent by preserving the pattern of their links over many iterations. Matter is embedded in three-dimensional space but is essentially made of the same stuff as space. It moves by re-linking preferentially in the direction of travel and losing links more often in the opposite direction. The pattern therefore appears to move relative to the underlying fabric of space and to other matter. Once the movement has started it becomes self-sustaining, requiring no further energy to continue. Any change in its passage through space is resisted, which manifests itself as inertia.

The topological-defect nature of matter means it has more links than normal space. Matter would thus use up more links than the space surrounding it, meaning that space effectively sinks into matter. This is speculated to be the reason behind gravity: the space between masses would effectively shrink, bringing the masses closer together. The masses would not move as such, but the distance between them would get smaller.

This also explains why a free-falling body does not seem to experience a force while accelerating under gravity towards a more massive body. This, however, goes against general relativity, as the gravitational effect would be instantaneous rather than propagating at the speed of light. An experiment to measure the speed of gravity would go a long way toward establishing whether general relativity or process physics is closer to reality.

Dark Matter Effect

The galactic `dark matter' effect is regarded as one of the major problems in fundamental physics. Process physics explains it as a self-interaction dynamical effect of space itself, not as the effect of an unknown form of matter. The so-called `dark matter' networks revealed via weak gravitational lensing are attributed to quantum-foam vortex filaments, while the `frame-dragging' effect is attributed to vorticity in the in-flow.

This leads to the new phenomenon of gravitational attractors. By analysing borehole data, the theory's second gravitational constant is shown to be the fine structure constant, alpha = e^2/(h-bar*c) ~ 1/137. The spiral galaxy `dark matter' effect and the central `black hole' masses of the globular clusters M15 and G1 are then predicted.
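The quoted expression alpha = e^2/(h-bar*c) is in Gaussian units; in SI units it picks up a factor 1/(4*pi*epsilon_0). A quick numerical check with CODATA values:

```python
import math

# CODATA values (SI units)
e    = 1.602176634e-19      # elementary charge, C
hbar = 1.054571817e-34      # reduced Planck constant, J*s
c    = 2.99792458e8         # speed of light, m/s
eps0 = 8.8541878128e-12     # vacuum permittivity, F/m

# Fine structure constant in SI form:
alpha = e**2 / (4 * math.pi * eps0 * hbar * c)
print(alpha, 1 / alpha)     # ~0.0072974, ~137.036
```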

The Process Physics theory of space is non-local and we see many parallels between this and quantum theory, in addition to the fine structure constant manifesting in both, so supporting the argument that space is a quantum foam system, as implied by the deeper information-theoretic theory known as Process Physics. The spatial dynamics also provides an explanation for the `dark matter' effect and as well the non-locality of the dynamics provides a mechanism for generating the uniformity of the universe, so explaining the cosmological horizon problem.

Data show that both Newtonian gravity and General Relativity are seriously flawed, and that a new theory of gravity explains various so-called gravitational `anomalies', including the `dark matter' effect. The new `quantum-foam in-flow' theory of gravity has explained numerous such anomalies, in particular the `dark matter' effect, which is now seen to be a dynamical effect of space itself, whose strength is determined by the fine structure constant rather than by Newton's gravitational constant G. The emergent theory of gravity has two gravitational constants: G, Newton's constant, and a dimensionless constant which various experiments and astronomical observations have shown to be the fine structure constant, 1/137.

There is a detectable local preferred frame of reference. A new theory of gravity is necessary that results in an explanation of the `dark matter' effect. The fine structure constant is a 2nd gravitational constant. A new theory of gravity predicts, however, a second and much larger `frame-dragging' or vorticity induced spin precession. This spin precession component will also display the effects of novel gravitational waves which are predicted by the new theory of gravity, and which have already been seen in several experiments.

A theory of 3-space explains the phenomenon of gravity as arising from the time-dependence and inhomogeneity of the differential flow of this 3-space. Turbulence in the flow amounts to gravitational waves. The new dynamical `quantum foam' theory of 3-space is described at the classical level by a velocity field. This has been repeatedly detected and the dynamical equations are now established. These equations predict 3-space `gravitational wave' effects, and these have been observed, and the 1991 DeWitte data is analysed to reveal the fractal structure of these `gravitational waves'.

PROCESS PHYSICS

Process Physics

Professor Reg Cahill
Dr. Christopher Klinger
Dr Kirsty Kitto
Dr Lance McCarthy

http://www.scieng.flinders.edu.au/cpes/people/cahill_r/processphysics.html

A new paradigm for the modelling of reality, called Process Physics, is currently being developed along the lines quoted above. The theory also gives a parameter-free account of the supernovae Hubble expansion data without the need for dark energy, dark matter or an accelerating universe, revealing that the Friedmann equations are inadequate for describing the dynamics of the universe's expansion.

The experimental and theoretical research program to study and develop this theory of gravity was supported in 2005-2006 by an Australian Research Council Discovery Grant.


A Quantum Cosmology: No Dark Matter, Dark Energy nor Accelerating Universe

Abstract: We show that modelling the universe as a pre-geometric system with emergent quantum modes, and then constructing the classical limit, we obtain a new account of space and gravity that goes beyond Newtonian gravity even in the non-relativistic limit. This account does not require dark matter to explain the spiral galaxy rotation curves, and explains as well the observed systematics of black hole masses in spherical star systems, the bore hole g anomalies, gravitational lensing and so on. As well the dynamics has a Hubble expanding universe solution that gives an excellent parameter-free account of the supernovae and gamma-ray-burst red-shift data, without dark energy or dark matter.


Dynamical 3-Space: A Review
To be published in Physical Interpretations of Relativity Theory
Abstract: For some 100 years physics has modelled space and time via the spacetime concept, with space being merely an observer dependent perspective effect of that spacetime - space itself had no observer independent existence - it had no ontological status, and it certainly had no dynamical description. In recent years this has all changed. In 2002 it was discovered that a dynamical 3-space had been detected many times, including the Michelson-Morley 1887 light-speed anisotropy experiment. Here we review the dynamics of this 3-space, tracing its evolution from that of an emergent phenomena in the information-theoretic Process Physics to the phenomenological description in terms of a velocity field describing the relative internal motion of the structured 3-space.

Process physics is a new and radical approach to the modeling of fundamental physics, drawing on information theory. It aims to be a theory of everything by abandoning the space-time construct of Galileo, Newton and Einstein, and by arguing that time can only be modeled as a process. The abandonment of time as a geometrical construct is used to solve problems with conventional physics such as the incompatibility between general relativity and quantum mechanics. The model exhibits both gravitational and non-local quantum mechanical behaviour, uniting them in one theory. Space, matter, gravity and time seem to emerge from the model without any pre-existing notion of objects or laws built into the model.

Taking as its starting point the incompleteness results of Goedel and Chaitin, and the Heraclitian view of existence as flux, or a continuously changing process, process physics attempts to model fundamental reality as a Heraclitian Process System (HPS) rather than as the behaviour of a set of discrete objects.

Physics to date has largely followed the intuition of Democritus: that the universe is composed of fundamental particles, with a defined spatial position. This has led to very successful geometrical models (Galileo, Newton) of physical behaviour, culminating in Einstein's General Relativity.

However, it is common to these geometrical models that they cannot account for the difference between past and future events, or for the uniqueness of the present. To see this, we need only look at the equations of motion of any of those models. They are static functions; that is, you can draw them as lines on a graph with time as one of the axes, and no point on the line differs in kind from any other point - there is nothing to distinguish the 'now' from the other points on the line. This has led some to assert that the universe is completely deterministic - in 4-D spacetime, the future 'already' exists.

The formulation of quantum theory, while allowing non-determinism through quantum randomness, gives rise to a different problem. The motion of particles is treated as the evolution of a wave function, but the 'size' of the wave function can be significantly larger than that of the particles whose motion it is meant to model. This was interpreted to mean that the wave function gives a probability distribution for these particle events (for example, the triggering of a detector, e.g. a CCD in a two-slit experiment). But nowhere in the formalism is there a description of these events - it deals only with the wave functions.

These problematic events were brushed under the carpet with the Copenhagen interpretation which invoked an extra-systemic element (the observer) to account for them.

As if these problems weren't enough, incompleteness results (such as those of Goedel, Turing and Chaitin) have established that any formalism of sufficient richness to support self-referential statements will necessarily admit of 'random' truths which are not theorems of that formalism. If we wish to describe physical behaviour by means of a formalism (and this is the direction taken by two and a half millennia of physics) then these 'random' truths may be regarded as uncaused physical facts - something of an embarrassment for a hard science!

In contrast to these approaches, process physics attempts to

"model reality as self-organising relational information [... taking] account of the limitations of formalism or logic by using the new concept of self-referential noise."
The self-referential noise (SRN) concept seems to be based on the idea that a self organising system (cf. Prigogine) even if closed, may give rise to intrinsic randomness.

This counter-intuitive idea is justified by considering the Goedelian perspective that truth can not be represented by finite means inside a self-referential system, and Chaitin's demonstration that even arithmetic is subject to fundamentally irreducible randomness.

Taking the view that arithmetic, numbers themselves, even, are emergent phenomena - as numbers are conceptually founded on the flawed and posterior notion of objects - process physicists turn this into an asset and choose to encapsulate the phenomenon as SRN, seeing this as the basis for quantum indeterminacy and the very contingency of contingent truths: the 'measurements' of the Copenhagen interpretation.

Process physics, therefore, seeks to achieve universality by modeling fundamental reality using a fractal (i.e. scale-neutral) process-space taking the form of a directed graph, of which the natural measure is the connectivity of pairs of its nodes. The elements themselves (called monads, after Leibniz) are simply similar directed graphs. The fractal nature of these graphs is imparted through their generation by a non-linear iterative process. The SRN appears as a noise term in the iteration.

This represents an attempt to side-step the problem of fundamental constituents (Democritus' atoms, if you like) or, as they put it, to 'bootstrap' their description of the universe. The focus on the structure of the fractal graph (a directed graph whose every node is itself a directed graph) is intended to allow the requirement for objects (the 'nodes') to drop out of consideration. http://everything2.com/title/process%2520physics

 

Biosystems. 1997;42(2-3):177-90.

Origin of life and the underlying physics of the universe.

Department of Computer Science, Wayne State University, Detroit, MI 48202, USA. conrad@cs.wayne.edu

The thesis is put forward that the non-linear self-organizing dynamics of biological systems are inherent in any physical theory that satisfies the requirements of both quantum mechanics and general relativity. Biological life is viewed as an extension of these underlying dynamics rather than as an emergent property of systems that reached a requisite threshold of complexity at a definite point in time. The underlying dynamics are based on interactions between manifest material organizations and an unmanifest vacuum sea whose density structure is isomorphic to the metric structure of space-time. These interactions possess an intrinsic self-corrective character, due to the fact that quantum processes lead to changes in particle states that have a random aspect, while general relativity requires that the distribution of manifest and unmanifest particles be self-consistent. The model implies vacuum hysteretic effects that would bear on nanobiological phenomena and that might be detected through nanobiological techniques.

The localized quantum vacuum field

D Dragoman 2008 Phys. Scr. 77 035005 (7pp)


D Dragoman
Physics Department, University of Bucharest, PO Box MG-11, 077125 Bucharest, Romania
E-mail: danieladragoman@yahoo.com

Abstract. A model for the localized quantum vacuum is proposed in which the zero-point energy (ZPE) of the quantum electromagnetic field originates in energy- and momentum-conserving transitions of material systems from their ground state to an unstable state with negative energy. These transitions are accompanied by emissions and re-absorptions of real photons, which generate a localized quantum vacuum in the neighborhood of material systems. The model could help resolve the cosmological paradox associated with the ZPE of electromagnetic fields, while reclaiming quantum effects associated with quantum vacuum such as the Casimir effect and the Lamb shift. It also offers a new insight into the Zitterbewegung of material particles.

PACS numbers: 03.65.−w, 12.20.−m

Print publication: Issue 3 (March 2008)
Received 9 August 2007, accepted for publication 9 January 2008
Published 8 February 2008

Amplification and squeezing of quantum noise with a tunable Josephson metamaterial

Nature Physics, Volume 4, Issue 12, pp. 929-931 (2008).
It has recently become possible to encode the quantum state of superconducting qubits and the position of nanomechanical oscillators into the states of microwave fields. However, to make an ideal measurement of the state of a qubit, or to detect the position of a mechanical oscillator with quantum-limited sensitivity, requires an amplifier that adds no noise. If an amplifier adds less than half a quantum of noise, it can also squeeze the quantum noise of the electromagnetic vacuum. Highly squeezed states of the vacuum can be used to generate entanglement or to realize back-action-evading measurements of position. Here we introduce a general-purpose parametric device, which operates in a frequency band between 4 and 8 GHz. It adds less than half a noise quantum, it amplifies quantum noise above the added noise of commercial amplifiers and it squeezes quantum fluctuations by 10 dB.