In RQM, reality emerges as the locus of intersubjective agreement in an epistemic web of question-and-answer sessions (cf. here, mid p. 4). Such a framework seems well suited to financial markets, where transactions are determined by agreement on bid and ask prices. The identification of statistical and quantum probabilities, proposed among others by Wald in a cosmological setting, makes the analogy even closer.
Moreover, the issue of universality, which I touched upon in my previous post, recurs here. The analogies between man-made market crashes and critical phenomena in physics have been widely studied in the past few years. It is tempting to construe structural similarities as evidence of a common origin in the observer's perceptual paradigm, which gives shape to events as they emerge through observation from the turbulent quantum flow.
Historical footnote. Entanglement was officially born with the famous EPR paper, but it haunted Western philosophy long before that. When Geulincx and Malebranche tackled the mind/body problem, the epistemic condition on causality that they proposed as the basis of necessary connections is reminiscent of the current RQM view, where entanglement upholds physical laws by imposing constraints on distinct perceptions/measurement outcomes. Malebranche's seeds would later germinate in Merleau-Ponty's idea that "existence is projective and results from an intersubjective dialogue between the embodied subject and the external world", where RQM finds an inviting philosophical counterpart.
As for the crucial question of the status of the observer, Thom writes that "all modern science is based on the postulate of the stupidity of objects". Schrödinger, however, offers a counterpoint: "... the genius of Gustav Theodor Fechner did not shy at attributing a soul, to the earth as a celestial body, to the planetary system, etc. I do not fall in with those fantasies, yet I should not like to have to pass judgement as to who has come nearest to the deepest truth, Fechner or the bankrupts of rationalism." In a relational setting, where people and devices ignore Descartes' arbitrary animate/inanimate distinction and are identified as observers only relationally, Fechner's approach may score some points.
Thursday, January 31, 2008
Monday, January 07, 2008
Measurement semantics
In 'Semantics and Linguistics' René Thom wrote that "a view of semantics hinting at global coherence leads to an ontology". Following this clue, one may investigate the ontological constraints corresponding to the semantic models that we adopt (or which define us). To start with, the scientific approach, which I subsume under the requirement of reproducibility, posits a community of observers who may or may not reproduce each other's measurement outcomes. Such a requirement does not imply the existence of an objective reality, commonly labeled Nature. To implement a reproducibility-based approach, observers just need to be able to exchange information about their measurement outcomes (a.k.a. perceptions). Therefore, within a relational framework, where reality is epistemic rather than ontological, one can do science without Nature.
Indeed, in a relational setting "reality" corresponds to a web of measurements (information exchanges) arising from question-and-answer sessions. As long as the questions are formulated within a semantic framework (a set of observables or "pointers", which correspond to a measurement procedure), the answers, i.e. the measurement outcomes, depend on the framework too. So one can also adopt different semantic models, as people in fact routinely do, when they are appropriate for different phenomenologies (or Natures, as you may want to call them).
Take, for example, particle physics. In quantum mechanics the basic semantic entities are the wave function and measurement operators. In particle physics, however, the basic semantic entities are particles, which are in turn associated on an ad hoc basis with clusters of measurement outcomes. It is commonly claimed that particles are "the building blocks" of the universe, rather than those of a semantic model describing a certain phenomenology. While the Standard Model has some genuine predictive power when applied to certain sets of measurements, one can question the claim that such measurements are in any way fundamental. Within the Wilson renormalization group approach, the relevant RG transformations actually encode the measurement (i.e. semantic) blueprints underlying the model, from which the various particles arise. Unless your salary depends on it, there is hardly any reason to assume that measuring them is more fundamental than observing phonons, salmon or motorcycles. I may also snipe that the Standard Model allows for casual patch-ups in the face of predictive failure, as documented in the case of solar neutrinos. Therefore, outside the labs where the SM was created, its falsifiability, and hence its scientific value, are questionable.
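To make the notion of an RG transformation concrete (this is a standard textbook illustration, not part of the argument above): coarse-graining a system, i.e. choosing a lower-resolution measurement blueprint, acts as a map on its couplings. For the 1D Ising chain, summing out every other spin gives the exact decimation rule tanh K' = tanh² K; a minimal sketch:

```python
import math

def decimate(K: float) -> float:
    """One exact decimation step for the 1D Ising chain:
    summing out every other spin renormalizes the coupling
    K -> K' via tanh(K') = tanh(K)**2."""
    return math.atanh(math.tanh(K) ** 2)

# Iterate the RG map from a strong initial coupling.
K = 1.5
trajectory = []
for _ in range(8):
    K = decimate(K)
    trajectory.append(K)

print(trajectory)
# The coupling flows monotonically toward K = 0: the 1D chain has
# no finite-temperature critical point, i.e. no nontrivial fixed point.
```

The point of the toy model is only that "which particles (or effective degrees of freedom) one sees" is determined by where the chosen coarse-graining map sends the couplings, consistent with reading the RG transformation as a measurement blueprint.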
The above remarks are partly inspired by the semantic problems presented by electrino bubbles in liquid helium and by the apparent dependence of de Broglie's wavelength on a semantic construct (the "object") corresponding to the relevant measurement procedure. I also surmise that morphogenesis arises from an observer-dependent measurement blueprint, encoded by the RG transformations, and that convergence mirrors structural stability, à la Thom. Universality may be a reflection of the observer's perceptual archetypes, corresponding to his/her measurement procedures.
To this point I add two quotes, which I regard as seminal. The first is by Shirkov ([1]), who points out that "to specify a quantum system, it is necessary to fix its “macroscopic surrounding”, i.e., to give the properties of macroscopic devices used in the measurement process. Just these devices are described by additional parameters, like [the scale parameter] μ". The second is by Jona-Lasinio, who in [2] observes that "a critical point can be characterized by deviations from the central limit theorem" and argues further that "to any type of RG transformation one can associate a multiplicative structure, a cocycle, and the characterising feature of the Green’s function RG is that it is defined directly in terms of this structure. In the probabilistic setting the multiplicative structure is related to the properties of conditional expectations". A full theory of morphogenesis remains to be developed on these foundations.
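Jona-Lasinio's remark takes the central limit theorem as the baseline that criticality violates. A minimal numerical sketch of what "deviation from the CLT" means (using heavy tails as a stand-in for critical correlations, which are harder to simulate directly): block sums of Gaussian variables stabilize under the usual n^(1/2) rescaling, while Cauchy variables do not, and instead require the anomalous rescaling n^1.

```python
import numpy as np

rng = np.random.default_rng(0)

def rescaled_block_sums(draw, n_blocks, block, exponent):
    """Partition samples into blocks; rescale each block sum by block**exponent."""
    x = draw(n_blocks * block).reshape(n_blocks, block)
    return x.sum(axis=1) / block**exponent

n_blocks, block = 2000, 4096

# Gaussian case: CLT holds, n^(1/2)-rescaled block sums have O(1) spread.
gauss = rescaled_block_sums(lambda n: rng.normal(size=n), n_blocks, block, 0.5)

# Cauchy case: infinite variance, so the CLT fails; the n^(1/2) rescaling
# blows up, and the stable scaling is n^1 instead.
cauchy_half = rescaled_block_sums(lambda n: rng.standard_cauchy(n), n_blocks, block, 0.5)
cauchy_one = rescaled_block_sums(lambda n: rng.standard_cauchy(n), n_blocks, block, 1.0)

print(np.std(gauss))                   # stays O(1): Gaussian fixed point
print(np.median(np.abs(cauchy_half)))  # grows with block size: CLT broken
print(np.median(np.abs(cauchy_one)))   # O(1) again under the anomalous exponent
```

In RG language, the sum-and-rescale map has the Gaussian as its trivial fixed point; a critical system sits at a different fixed point with a nontrivial rescaling exponent, which is exactly the kind of CLT deviation the quote refers to.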