Sunday, October 12, 2008
QM and the problem of Being
Here are some remarks on the practical implementation of the scientific paradigm. They sketch a semantic analysis of the reproducibility requirement and cast scientific results as rhetorical tools, with pointers for the deconstruction of some concrete examples: from the recent Gravity Probe B farce (arguably foreseen in [1], cf. [2] and [3], also available at [3A]: "The gap between the current error level and that which is required for a rigorous test of a deviation from GR is so large that any effect ultimately detected by this experiment will have to overcome considerable (and in our opinion, well justified) skepticism in the scientific community", [4]), to my old favorite, univalence superselection ([5], [6], [7]), all the way to a deconstructed notion of physical law ([8]).
The starting point is this sentence from Plato's Sophist, which famously inspired Martin Heidegger as well as others in his wake: "For manifestly you have long been aware of what you mean when you use this expression 'being'. We, however, who used to think we understood it, have now become perplexed". The inability or unwillingness of contemporary scientific thought to tackle the postulate of existence is arguably the core problem in quantum mechanics' "unfinished revolution". Without a deconstruction of the a priori semantic assumptions hidden in (our "scientific" image of) the world, the way ahead, beyond pirouettes in front of blatant experimental failure, will stay blocked. While such a deconstruction has been initiated by the relational approach to quantum mechanics (RQM), both its scope and its impact are still limited.
Within RQM's epistemic framework one can recover some of the far-reaching arguments of Philip Warren Anderson against the reductionist approach. The mantra that "big things are made up of small things" is often tacitly assumed in scientific models.
In an RQM perspective there are no things, either big or small, but only measurement outcomes/perceptions relative to a measurement operator. If this post didn't put you to sleep, there is more below.
Thursday, September 25, 2008
Very bad particles
Davis writes in his Nobel lecture that "the standard solar model has ended in a spectacular way: nothing was wrong with the experiments or the theory; something was wrong with the neutrinos, in the sense that they behave in ways beyond the standard model of particle physics".
Naughty neutrinos! As soon as they left CERN, they started to misbehave.
Monday, February 25, 2008
Deconstructing Decoherence
In 2002 Anthony Leggett, whose mainstream credentials can do without my seal of approval, memorably wrote ([1]):
"This argument, with the conclusion that observation of QIMDS [quantum interference between macroscopically distinct states] will be in practice totally impossible, must appear literally thousands of times in the literature of the last few decades on the quantum measurement problem, to the extent that it was at one time the almost universally accepted orthodoxy [apparently it still is].
...
Let us now try to assess the decoherence argument. Actually, the most economical tactic at this point would be to go directly to the results of the next section, namely that it is experimentally refuted! However, it is interesting to spend a moment enquiring why it was reasonable to anticipate this in advance of the actual experiments. In fact, the argument contains several major loopholes ...".
The brackets are mine. Since 1999 I have been shouting from the rooftops that decoherence theory is essentially flawed ([2], [3]). While it's pleasant to get some company, decoherence still rules the mainstream. It's interesting that Leggett graduated in Classics before he turned to physics. He argues ([4]) that his classical education shaped the way he later looked at physics, enabling him to identify and evaluate his assumptions in a way that most other physicists could not. That's indeed a precious and rare skill when it comes to the semantic problem, which in various forms haunts the current scientific debate. Quite concretely, it is only through a novel semantic framework such as RQM that the spurious non-locality of entanglement has been liquidated.
Tuesday, February 05, 2008
LSD pointers
In his recent book 'Anassimandro di Mileto' Rovelli writes: "it is essential to realise that language doesn't just mirror reality, but more often it creates reality". Having recognized the deconstruction of semantic models as a necessary development of the quantum relational approach, it is curious for me to find this statement in a text where RQM is nowhere mentioned. Yet apparently the seeds are germinating. Rovelli writes further that "to take part in the ritual amounts to acknowledging its legitimacy, and hence to adhere to the realm of meanings corresponding to the ritual". Again, an impressively bold statement when viewed in a relational setting, where intersubjective agreement, and hence reality, depends on a shared semantic model.
How semantic models, and hence the corresponding reality, can be modified is stated further: "100 micrograms of lysergic acid diethylamide are sufficient to let us perceive the world in a deeply different way. Neither a more truthful way nor a less truthful one. Just different". One may find similar statements in Kary Mullis' writings. However, the implementation of such alternative semantic models breaks intersubjective agreement. A different semantic model and the corresponding measurement/perceptual procedure may land you in an alternative reality, where others may be unable or unwilling to tread.
Thursday, January 31, 2008
RQM, markets and Fechner's friends
In RQM reality emerges as the locus of intersubjective agreement in an epistemic web of question-and-answer sessions (cf. here, mid p. 4). Such a framework appears quite adequate for financial markets, where transactions are determined by agreement on ask and bid prices. The identification of statistical and quantum probabilities, proposed among others by Wald in a cosmological setting, makes the analogy even closer.
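To make the analogy tangible, here is a toy sketch, entirely my own construction: a transaction, i.e. an agreed price, plays the role of an intersubjectively agreed measurement outcome, and where bid and ask never meet, no shared "reality" emerges.

```python
# Toy sketch: intersubjective agreement as order matching (illustrative only).
def match(bids, asks):
    """Pair the highest bid with the lowest ask while bid >= ask.

    Returns the list of transaction prices (midpoints, by convention).
    """
    bids = sorted(bids, reverse=True)   # best (highest) bid first
    asks = sorted(asks)                 # best (lowest) ask first
    trades = []
    for bid, ask in zip(bids, asks):
        if bid < ask:                   # no agreement: no transaction
            break
        trades.append((bid + ask) / 2)  # the agreed price is the "fact"
    return trades

print(match([101, 100, 98], [99, 100, 103]))  # two agreements: [100.0, 100.0]
```

Only the agreed prices survive as shared facts; the unmatched quotes remain private, observer-relative values.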
Moreover, the issue of universality, which I touched upon in my previous post, recurs here. The analogies between man-made market crashes and critical phenomena in physics have been widely studied in the past few years. It is tempting to construe structural similarities as evidence of a common origin in the observer's perceptual paradigm, which gives shape to events as they emerge through observation from the turbulent quantum flow.
Historical footnote. Entanglement was officially born with the famous EPR paper, but it haunted Western philosophy long before that. When Geulincx and Malebranche tackled the mind/body problem, the epistemic condition on causality that they proposed as the basis of necessary connections is reminiscent of the current RQM view, where entanglement upholds physical laws by imposing constraints on distinct perceptions/measurement outcomes. Malebranche's seeds would later germinate in Merleau-Ponty's idea that "existence is projective and results from an intersubjective dialogue between the embodied subject and the external world", where RQM finds an inviting philosophical counterpart.
As for the crucial question of the status of the observer, Thom writes that "all modern science is based on the postulate of the stupidity of objects". However, here is a Schrödinger quote: "... the genius of Gustav Theodor Fechner did not shy at attributing a soul, to the earth as a celestial body, to the planetary system, etc. I do not fall in with those fantasies, yet I should not like to have to pass judgement as to who has come nearest to the deepest truth, Fechner or the bankrupts of rationalism." In a relational setting, where people or devices ignore Descartes' arbitrary animate/inanimate distinction and are identified as observers only relationally, Fechner's approach may score some points.
Monday, January 07, 2008
Measurement semantics
In 'Semantics and Linguistics' René Thom wrote that "a view of semantics hinting at global coherence leads to an ontology". Following this clue, one may investigate the ontological constraints corresponding to the semantic models that we adopt (or which define us). For a start, the scientific approach, which I subsume into the requirement of reproducibility, posits a community of observers, who may or may not reproduce each other's measurement outcomes. Such a requirement does not imply the existence of an objective reality, commonly labeled Nature. In order to implement a reproducibility-based approach, observers just need to be able to exchange information about their measurement outcomes (a.k.a. perceptions). Therefore, within a relational framework, where reality is epistemic rather than ontological, one can do science without Nature.
Indeed, in a relational setting "reality" corresponds to a web of measurements (information exchanges) arising from question-and-answer sessions. As long as the questions are formulated within a semantic framework (a set of observables or "pointers", which correspond to a measurement procedure), the answers, i.e. the measurement outcomes, depend on the framework too. So one can also adopt different semantic models, as people in fact routinely do, if they are appropriate for different phenomenologies (or Natures, as you may want to call them).
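The point that reproducibility needs only exchanged records, not a global Nature, can be caricatured in a few lines. The sketch below is entirely my own illustration: a "question" is reproduced when every observer who asked it reports the same answer, and nothing beyond the exchanged triples is ever consulted.

```python
# Toy model of "science without Nature": reproducibility is checked purely on
# exchanged records of outcomes; no underlying global state exists anywhere.
def reproducible(records):
    """records: list of (observer, question, answer) triples.

    A question counts as reproduced if all observers who asked it agree.
    """
    answers = {}
    for observer, question, answer in records:
        answers.setdefault(question, set()).add(answer)
    return {q: len(a) == 1 for q, a in answers.items()}

log = [("Alice", "spin_z", +1), ("Bob", "spin_z", +1),
       ("Carol", "spin_x", -1), ("Dave", "spin_x", +1)]
print(reproducible(log))  # {'spin_z': True, 'spin_x': False}
```

Note that the questions themselves ("spin_z", "spin_x") are supplied by the semantic framework; change the set of admissible questions and the locus of agreement, i.e. "reality", changes with it.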
Take, for example, particle physics. In quantum mechanics the basic semantic entities are the wave function and measurement operators. In particle physics, however, the basic semantic entities are particles, which are in turn associated on an "ad hoc" basis with clusters of measurement outcomes. It is commonly claimed that particles are "the building blocks" of the universe, rather than those of a semantic model describing a certain phenomenology. While the Standard Model has some genuine predictive power when applied to certain sets of measurements, one can question the claim that such measurements are in any way fundamental. Within the Wilson renormalization group approach, the relevant RG transformations actually encode the measurement (i.e. semantic) blueprints underlying the model, from which the various particles arise. Unless your salary depends on it, there is hardly any reason to assume that measuring them is more fundamental than observing phonons, salmon or motorcycles. I may also snipe that the Standard Model allows for casual patch-ups in the face of predictive failure, as documented in the case of solar neutrinos. Therefore, outside the labs where the SM was created, its falsifiability, and hence its scientific value, are questionable.
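The idea of an RG transformation as a measurement blueprint can be illustrated with the simplest textbook example, a one-dimensional block-spin coarse graining by majority rule. This is a caricature of my own choosing, not the Wilsonian machinery itself: the block size b stands in for the "measurement blueprint", and different choices of b carve out different effective degrees of freedom.

```python
# Minimal block-spin sketch (illustrative caricature): coarse-grain a 1-D spin
# configuration by majority rule. The block size b plays the role of the
# measurement blueprint that decides what the effective "objects" are.
def block_spin(spins, b=3):
    """Replace each block of b spins (+1/-1) by the sign of its sum.

    b should be odd so that a block can never tie.
    """
    usable = len(spins) - len(spins) % b          # drop any incomplete tail
    blocks = [spins[i:i + b] for i in range(0, usable, b)]
    return [1 if sum(block) > 0 else -1 for block in blocks]

config = [1, 1, -1, -1, -1, -1, 1, -1, 1]
print(block_spin(config))  # [1, -1, 1]
```

Iterating the map gives ever coarser descriptions of the same configuration; at no point does the chain of effective variables single out one level as "the" fundamental one.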
The above remarks are partly inspired by the semantic problems presented by electrino bubbles in liquid helium and by the apparent dependence of de Broglie's wavelength on a semantic construct (the "object") corresponding to the relevant measurement procedure. I also surmise that morphogenesis arises from an observer-dependent measurement blueprint, encoded by the RG transformations, and that convergence mirrors structural stability, à la Thom. Universality may be a reflection of the observer's perceptual archetypes, corresponding to his/her measurement procedures.
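The de Broglie point is easy to make quantitative. Since λ = h/(mv), the wavelength depends on which mass, hence which "object", the measurement procedure singles out; the numbers below (an electron versus a C60 fullerene treated as a single object, at an arbitrary common speed) are just an illustration of that algebraic fact.

```python
# Sketch: the de Broglie wavelength lambda = h/(m*v) depends on the semantic
# carving of the "object", i.e. on which mass the procedure attributes to it.
H = 6.62607015e-34                    # Planck constant, J*s

def de_broglie(mass_kg, speed_m_s):
    return H / (mass_kg * speed_m_s)

m_e = 9.1093837015e-31                # one electron, kg
m_c60 = 60 * 12 * 1.66053906660e-27   # a C60 molecule as one object, kg
v = 200.0                             # same speed for both "objects", m/s

# Same speed, different semantic carving, wildly different wavelength;
# h and v cancel, so the ratio is exactly m_c60 / m_e (about a million).
print(de_broglie(m_e, v) / de_broglie(m_c60, v))
```

In interference experiments with large molecules it is the whole molecule's mass that sets the observed fringe spacing, which is precisely the dependence on the "object" alluded to above.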
To this point I add two quotes, which I regard as seminal. The first one is by Shirkov ([1]), who points out that "to specify a quantum system, it is necessary to fix its “macroscopic surrounding”, i.e., to give the properties of macroscopic devices used in the measurement process. Just these devices are described by additional parameters, like [the scale parameter] μ". The second is by Jona-Lasinio, who in [2] observes that "a critical point can be characterized by deviations from the central limit theorem" and argues further that "to any type of RG transformation one can associate a multiplicative structure, a cocycle, and the characterising feature of the Green’s function RG is that it is defined directly in terms of this structure. In the probabilistic setting the multiplicative structure is related to the properties of conditional expectations". A full theory of morphogenesis remains to be developed on these foundations.
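Jona-Lasinio's first remark can be given a minimal numerical gloss, mine and not his: sums of independent spins obey the central limit theorem, with fluctuations growing like √n, while fully correlated spins, a crude caricature of criticality, require the anomalous normalization n instead.

```python
# Toy gloss on "deviations from the central limit theorem": independent spins
# need the 1/sqrt(n) normalization, perfectly correlated spins need 1/n.
import random
from statistics import pvariance

random.seed(0)
n, trials = 400, 2000

indep = [sum(random.choice((-1, 1)) for _ in range(n)) for _ in range(trials)]
correl = [n * random.choice((-1, 1)) for _ in range(trials)]  # all spins equal

print(pvariance(x / n ** 0.5 for x in indep))  # ~1: the CLT scaling works
print(pvariance(x / n for x in correl))        # ~1 only with the 1/n scaling
```

At a genuine critical point the correlations are of course subtler than this all-or-nothing caricature, and the anomalous exponent interpolates between the two normalizations; that interpolation is where the RG cocycle structure enters.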