Thursday, September 29, 2011

Superluminal

Honestly, I didn't think Big Science still had it in it to pull off something this good.

Sunday, January 18, 2009

Canetti and De Finetti

As an introduction to the main theme of this blog, here is a quote from the opening of Elias Canetti's "Crowds and Power" ("Massen und Macht" in the original German). It goes:

"Man fears nothing more than the touch of the Unknown. [...] . It is only in the crowd that man can be liberatated from his fear of being touched."

Canetti's insight applies to any event falling outside the semantic model that shapes a community's shared reality. As Bruno De Finetti put it, "the professionalization and departmentalization of the several branches of science have become an obstacle to the necessary continuous renewal of science itself". Indeed a 'specialist' cannot question the adequacy of the prevalent semantic models without implicitly questioning the pecking order of the academic community, thereby jeopardizing his/her own status within it.

None of this is really new. Pharnakes quipped a long time ago: "... we are faced again with that stock manoeuvre of the Academy: on each occasion that they engage in discourse with others, they will not offer any accounting of their own assertions but must keep their interlocutors on the defensive, lest they become the prosecutors" (Plutarch, De facie in orbe lunae, 6). Or, as Upton Sinclair said: "it is difficult to get a man to understand something, when his salary depends upon his not understanding it."

Sunday, October 12, 2008

QM and the problem of Being

Here are some remarks on the practical implementation of the scientific paradigm. They sketch a semantic analysis of the reproducibility requirement and cast scientific results as rhetorical tools, with pointers for the deconstruction of some concrete examples: from the recent Gravity Probe B farce (arguably foreseen in [1], cf. [2], [3], also available at [3A]: "The gap between the current error level and that which is required for a rigorous test of a deviation from GR is so large that any effect ultimately detected by this experiment will have to overcome considerable (and in our opinion, well justified) skepticism in the scientific community", [4]) to my old favorite, univalence superselection ([5], [6], [7]), all the way to a deconstructed notion of physical law ([8]).

The starting point is this sentence from Plato's Sophist, which famously inspired Martin Heidegger as well as others in his wake: "For manifestly you have long been aware of what you mean when you use this expression 'being'. We, however, who used to think we understood it, have now become perplexed". The inability/unwillingness of contemporary scientific thought to tackle the postulate of existence is arguably the core problem in quantum mechanics' "unfinished revolution". Without a deconstruction of the "a priori" semantic assumptions hidden in (our "scientific" image of) the world, the way ahead, beyond pirouettes in front of blatant experimental failure, will stay blocked.

While such a deconstruction has been initiated by the relational approach to quantum mechanics, both its scope and its impact are still limited. Within RQM's epistemic framework one can recover some of the far-reaching arguments by Philip Warren Anderson against the reductionist approach. The mantra that "big things are made up of small things" is often tacitly assumed in scientific models. In a RQM perspective there are no things, either big or small, but only measurement outcomes/perceptions relative to a measurement operator. If this post didn't put you to sleep, there is more below.
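To make "outcomes relative to a measurement operator" a bit more tangible, here is a minimal textbook sketch (plain Python, standard QM, nothing beyond the Born rule): the same qubit state answers different questions, i.e. different measurement bases, with different statistics.

```python
# A minimal sketch of "no things, only outcomes relative to an operator":
# one and the same qubit state yields different answer statistics depending
# on which question (measurement basis) is posed. Textbook material.

import math

plus = (1 / math.sqrt(2), 1 / math.sqrt(2))   # the state |+> = (|0> + |1>)/sqrt(2)

def probabilities(state, basis):
    """Born rule: outcome probabilities of `state` relative to an orthonormal
    measurement `basis` (amplitudes here are real, so no conjugation needed)."""
    return [abs(sum(s * b for s, b in zip(state, vec))) ** 2 for vec in basis]

z_basis = [(1, 0), (0, 1)]                        # "is the spin up along z?"
x_basis = [(1 / math.sqrt(2), 1 / math.sqrt(2)),
           (1 / math.sqrt(2), -1 / math.sqrt(2))] # "...and along x?"

print(probabilities(plus, z_basis))  # ~[0.5, 0.5] -- the z question gets a random answer
print(probabilities(plus, x_basis))  # ~[1.0, 0.0] -- the x question gets a certain one
```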

Thursday, September 25, 2008

Very bad particles

Davis writes in his Nobel lecture: "the standard solar model has ended in a spectacular way: nothing was wrong with the experiments or the theory; something was wrong with the neutrinos, in the sense that they behave in ways beyond the standard model of particle physics".
Naughty neutrinos! As soon as they left CERN, they started to misbehave.

Monday, February 25, 2008

Deconstructing Decoherence

In 2002 Anthony Leggett, whose mainstream credentials can do without my seal of approval, memorably wrote ([1]):

"This argument, with the conclusion that observation of QIMDS [quantum interference between macroscopically distinct states] will be in practice totally impossible, must appear literally thousands of times in the literature of the last few decades on the quantum measurement problem, to the extent that it was at one time the almost universally accepted orthodoxy [apparently it still is].
...
Let us now try to assess the decoherence argument. Actually, the most economical tactic at this point would be to go directly to the results of the next section, namely that it is experimentally refuted! However, it is interesting to spend a moment enquiring why it was reasonable to anticipate this in advance of the actual experiments. In fact, the argument contains several major loopholes ...".

The brackets are mine. Since 1999 I have been shouting from the rooftops that decoherence theory is essentially flawed ([2], [3]). While it's pleasant to get some company, decoherence still rules the mainstream. It's interesting that Leggett graduated in Classics before he turned to physics. He argues ([4]) that his classical education shaped the way he later looked at physics, enabling him to identify and evaluate his assumptions in a way that most other physicists could not. That's indeed a precious and rare skill when it comes to the semantic problem, which in various forms haunts the current scientific debate. Quite concretely, it is only through a novel semantic framework such as RQM that the spurious non-locality of entanglement has been liquidated.

Tuesday, February 05, 2008

LSD pointers

In his recent book 'Anassimandro di Mileto' (Anaximander of Miletus), Rovelli writes: "it is essential to realise that language doesn't just mirror reality, but more often it creates reality". Having recognized the deconstruction of semantic models as a necessary development of the quantum relational approach, I find it curious to encounter this statement in a text where RQM is nowhere mentioned. Yet apparently the seeds are germinating. Rovelli writes further that "to take part in the ritual amounts to acknowledging its legitimacy, and hence to adhere to the realm of meanings corresponding to the ritual". Again, an impressively bold statement when viewed in a relational setting, where intersubjective agreement, and hence reality, depend on a shared semantic model.

How semantic models, and hence the corresponding reality, can be modified is stated further on: "100 micrograms of lysergic acid diethylamide are sufficient to let us perceive the world in a deeply different way. Neither a more truthful way nor a less truthful one. Just different". One may find similar statements in Kary Mullis' writings. However, the implementation of such alternative semantic models breaks intersubjective agreement. A different semantic model and the corresponding measurement/perceptual procedure may land you in an alternative reality, where others may be unable or unwilling to tread.

Thursday, January 31, 2008

RQM, markets and Fechner's friends

In RQM reality emerges as the locus of intersubjective agreement in an epistemic web of question and answer sessions (cf. here, mid p. 4). Such a framework appears quite adequate for financial markets, where transactions are determined by agreement on ask and bid prices. The identification of statistical and quantum probabilities, proposed among others by Wald in a cosmological setting, makes the analogy even closer.
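As a toy illustration (all prices hypothetical), here is the agreement mechanism in miniature: a transaction, i.e. a shared market "fact", comes into being only where a bid and an ask overlap.

```python
# A toy order book: a price becomes "real" (a transaction) only where two
# observers' answers agree, i.e. where the best bid meets the best ask.
# All numbers are hypothetical, for illustration only.

def match(bids, asks):
    """Return transactions as the locus of agreement between bids and asks."""
    bids = sorted(bids, reverse=True)   # best (highest) bid first
    asks = sorted(asks)                 # best (lowest) ask first
    trades = []
    while bids and asks and bids[0] >= asks[0]:
        # Agreement reached: settle at the midpoint (one common convention).
        trades.append((bids.pop(0) + asks.pop(0)) / 2)
    return trades

print(match(bids=[101.0, 100.5, 99.0], asks=[100.0, 100.6, 102.0]))
# -> [100.5]: only where question and answer overlap does a price "exist";
# the remaining bids and asks correspond to no shared fact at all.
```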

Moreover, the issue of universality, which I touched upon in my previous post, recurs here. The analogies between man-made market crashes and critical phenomena in physics have been widely studied in the past few years. It is tempting to construe structural similarities as evidence of a common origin in the observer's perceptual paradigm, which gives shape to events as they emerge through observation from the turbulent quantum flow.

Historical footnote. Entanglement was officially born with the famous EPR paper, but it haunted Western philosophy long before that. When Geulincx and Malebranche tackled the mind/body problem, the epistemic condition on causality that they proposed as the basis of necessary connections is reminiscent of the current RQM view, where entanglement upholds physical laws by imposing constraints on distinct perceptions/measurement outcomes. Malebranche's seeds would later germinate in Merleau-Ponty's idea that "existence is projective and results from an intersubjective dialogue between the embodied subject and the external world", where RQM finds an inviting philosophical counterpart.

As for the crucial question about the status of the observer, Thom writes that "all modern science is based on the postulate of the stupidity of objects". However, here is a Schroedinger quote: "... the genius of Gustav Theodor Fechner did not shy at attributing a soul, to the earth as a celestial body, to the planetary system, etc. I do not fall in with those fantasies, yet I should not like to have to pass judgement as to who has come nearest to the deepest truth, Fechner or the bankrupts of rationalism." In a relational setting, where people or devices ignore Descartes' arbitrary animate/inanimate distinction and are identified as observers only relationally, Fechner's approach may score some points.

Monday, January 07, 2008

Measurement semantics

In 'Semantics and Linguistics' René Thom wrote that "a view of semantics hinting at global coherence leads to an ontology". Following this clue, one may investigate the ontological constraints corresponding to the semantic models that we adopt (or which define us). For a start, the scientific approach, which I subsume into the requirement of reproducibility, posits a community of observers, who may or may not reproduce each other's measurement outcomes. Such a requirement does not imply the existence of an objective reality, commonly labeled Nature. In order to implement a reproducibility-based approach, observers just need to be able to exchange information about their measurement outcomes (a.k.a. perceptions). Therefore, within a relational framework, where reality is epistemic rather than ontological, one can do science without Nature.

Indeed, in a relational setting "reality" corresponds to a web of measurements (information exchanges), arising from question and answer sessions. As long as the questions are formulated within a semantic framework (a set of observables or "pointers", which correspond to a measurement procedure), the answers, i.e. the measurement outcomes, depend on the framework too. So one can also adopt different semantic models, as people in fact routinely do, if they are appropriate for different phenomenologies (or Natures, as you may want to call them). A toy implementation of this "science without Nature" is sketched below.
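Here is a minimal sketch, with hypothetical observers, questions and outcomes: reproducibility is checked purely by exchanging outcome reports, with no appeal to an underlying object.

```python
# A toy "web of measurements": observers exchange outcome reports, and
# reproducibility is defined as agreement across a community, with no
# reference to an underlying Nature. All reports below are hypothetical.

from collections import defaultdict

reports = [
    ("Alice", "spin_z, run 17", "+1/2"),
    ("Bob",   "spin_z, run 17", "+1/2"),
    ("Carol", "spin_z, run 17", "-1/2"),
    ("Alice", "dowsing over the well", "water"),
]

def reproducibility(reports):
    """For each question, map every reported outcome to the community
    (set of observers) that agrees on it."""
    by_question = defaultdict(lambda: defaultdict(set))
    for observer, question, outcome in reports:
        by_question[question][outcome].add(observer)
    return {q: dict(outcomes) for q, outcomes in by_question.items()}

for question, outcomes in reproducibility(reports).items():
    print(question, "->", outcomes)
# The operative question is not "is it reproducible?" but "by whom?":
# {'+1/2': {'Alice', 'Bob'}, '-1/2': {'Carol'}} describes two loci of
# agreement, i.e. two (partial) realities, without ever invoking Nature.
```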

Take, for example, particle physics. In quantum mechanics the basic semantic entities are the wave function and measurement operators. In particle physics, however, the basic semantic entities are particles, which are in turn associated on an "ad hoc" basis to clusters of measurement outcomes. It is commonly claimed that particles are "the building blocks" of the universe, rather than those of a semantic model describing a certain phenomenology. While the Standard Model has some genuine predictive power when applied to certain sets of measurements, one can question the claim that such measurements are in any way fundamental. Within the Wilson renormalization group approach, the relevant RG transformations actually encode the measurement (i.e. semantic) blueprints underlying the model, from which the various particles arise. Unless your salary depends on it, there is hardly any reason to assume that measuring them is more fundamental than observing phonons, salmon or motorcycles. I may also snipe that the Standard Model allows for casual patch-ups in the face of predictive failure, as documented in the case of solar neutrinos. Therefore, outside the labs where the SM was created, its falsifiability, and hence its scientific value, are questionable.
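To see in miniature how an RG transformation encodes a coarse-graining (i.e. measurement) blueprint, here is the textbook decimation step for the 1D Ising chain; the example is mine, standard material rather than anything from the references above.

```python
# Exact decimation RG for the 1D Ising chain: summing out every other spin,
# i.e. choosing a coarser measurement blueprint, renormalizes the reduced
# coupling K = J/kT to K' = 0.5 * ln(cosh(2K)). What the coarse-grained
# observer "sees" is fixed entirely by this transformation.

import math

def decimate(K):
    """One decimation step: trace out every second spin of the chain."""
    return 0.5 * math.log(math.cosh(2.0 * K))

K = 1.5  # an arbitrary (hypothetical) starting coupling
for step in range(6):
    print(step, round(K, 4))
    K = decimate(K)
# K flows to 0: at any finite temperature the coarse-grained observer of a
# 1D chain finds a disordered system. The surviving degrees of freedom are
# dictated by the transformation, i.e. by the measurement blueprint.
```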

The above remarks are partly inspired by the semantic problems presented by electrino bubbles in liquid helium and by the apparent dependence of De Broglie's wavelength on a semantic construct (the "object") corresponding to the relevant measurement procedure. I also surmise that morphogenesis arises from an observer-dependent measurement blueprint, encoded by the RG transformations, and that convergence mirrors structural stability, à la Thom. Universality may be a reflection of the observer's perceptual archetypes, corresponding to his/her measurement procedures.
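The wavelength point fits in one back-of-the-envelope computation: λ = h/(mv) changes by a factor of 60 depending on whether you bracket a fullerene beam as C60 molecules or as carbon atoms. The numbers below are rough, borrowed from the well-known C60 interference experiments.

```python
# Back-of-the-envelope: the de Broglie wavelength lambda = h/(m v) depends
# on what you bracket as "the object". Approximate numbers, in the spirit
# of the much-cited C60 interference experiments (beam velocity ~220 m/s).

h = 6.626e-34            # Planck constant, J*s
m_atom = 12 * 1.66e-27   # mass of one carbon atom, kg
v = 220.0                # beam velocity, m/s

for label, m in [("C60 molecule as one object", 60 * m_atom),
                 ("a single carbon atom", m_atom)]:
    print(label, "->", h / (m * v), "m")
# ~2.5e-12 m vs ~1.5e-10 m: the same beam, two choices of "object",
# two different wavelengths -- the semantic construct enters the physics.
```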

To this point I add two quotes, which I regard as seminal. The first one is by Shirkov ([1]), who points out that "to specify a quantum system, it is necessary to fix its “macroscopic surrounding”, i.e., to give the properties of macroscopic devices used in the measurement process. Just these devices are described by additional parameters, like [the scale parameter] μ". The second is by Jona-Lasinio, who in [2] observes that "a critical point can be characterized by deviations from the central limit theorem" and argues further that "to any type of RG transformation one can associate a multiplicative structure, a cocycle, and the characterising feature of the Green’s function RG is that it is defined directly in terms of this structure. In the probabilistic setting the multiplicative structure is related to the properties of conditional expectations". A full theory of morphogenesis remains to be developed on these foundations.
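Jona-Lasinio's remark can be illustrated with a deliberately crude simulation (mine, not his): block sums of independent variables obey the central limit theorem, with fluctuations growing like sqrt(N), while strongly correlated variables violate that scaling.

```python
# Toy illustration of "a critical point can be characterized by deviations
# from the central limit theorem": independent block sums fluctuate like
# sqrt(N); a fully correlated block, standing in for a critical fluctuation,
# fluctuates like N. A hypothetical sketch, not a critical-point simulation.

import random
import statistics

def block_sum_spread(sample, N, trials=2000):
    """Empirical standard deviation of the sum of one block of size N."""
    return statistics.stdev(sum(sample(N)) for _ in range(trials))

def independent(N):
    return [random.choice((-1, 1)) for _ in range(N)]

def fully_correlated(N):
    s = random.choice((-1, 1))      # one shared, system-wide fluctuation
    return [s] * N

for N in (100, 400):
    print(N, block_sum_spread(independent, N), block_sum_spread(fully_correlated, N))
# independent: spread grows like sqrt(N) (about 10 -> 20), as the CLT demands;
# fully correlated: spread grows like N (about 100 -> 400), the gross CLT
# violation that, per Jona-Lasinio, is the signature of a critical point.
```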

Monday, December 24, 2007

Safe Science

Relational quantum mechanics is still a niche approach, but I bet it's the way of the future. It is conceptually fertile. It allows one to define reality as the locus of intersubjective agreement (cf. mid p. 4), so that different realities may correspond to different observer constituencies, e.g. homeopaths and non-homeopaths; dreams are the ultimate instance, as Schopenhauer did not quite state, of a one-man reality. In this setting reproducibility, the core value of the scientific method, is a tool to extend agreement on a phenomenon to a larger community.

The reproducibility of phenomena, and the corresponding reality, may vary in strength across communities, from extremely robust, Hiroshima-type, to exceedingly weak, as the results supporting theories regarded as pseudo-scientific usually are. Realities, i.e. loci of intersubjective agreement, may also expand and contract in time, as the history of science teaches us through such vivid examples as the rediscovered water-featuring Mpemba effect. This framework allows one to identify the conditions and the obstacles, such as unwitting or wilful use of different standards and semantic models, that may affect reproducibility and hence the scientific validity of a result. Different questions eliciting different answers, as well as semantic equivalence in the eye of the experimenter, are important factors in the practical implementation of reproducibility (see [1] for a vivid example). The core question is no longer "Is this result reproducible?", but "By whom is it reproducible?".

It is often loudly claimed that "extraordinary claims require extraordinary evidence". To me this is either meaningless crap or it boils down to building bias into the scientific method, where bias has no place. I suggest the following instead: "Extraordinary claims require REPRODUCIBLE evidence". Actually, all scientific claims require reproducible evidence. Or I won't believe them, that is. However, introducing double standards into the implementation of the scientific paradigm may be necessary to maintain the scientific status of "big science" projects, which rely on massive "a posteriori" data-filtering (see e.g. how they "modify things a bit" at CERN) and are so costly that their results can hardly be reproduced by uninvolved observers. The introduction of a separate status for "extraordinary" (i.e. non-mainstream) claims allows mainstream science to protect itself by introducing a separate set of rules. Such double standards allow the liquidation of outsiders' claims without impacting mainstream research, safeguarding its considerable network of economic interests and, perhaps most importantly, its status of ideological flagship.

Monday, June 05, 2006

LIS for Persian nukes

Iran's nuclear program hits the headlines these days, but there is no answer to the crucial question: how far are they from a working nuke?
Anyone who has Khan's blueprint (which is almost freely available, besides having been presented to the Iranians by its author) can easily build a nuke, given enough U-235; indeed, with enough reasonably pure U-235, building a nuke is quite easy anyway. So the question is: how long do they need to produce pure-enough U-235 in reasonable multiples of 20 kg?
If they are trying to get there the standard centrifuge-based way, they are at least several years off. They will have to build or obtain thousands of new centrifuges, then operate them for years. It's a complex, costly and very energy-intensive operation (a rough estimate is sketched below). Besides, the specifics of the uranium mineral from Iranian mines may require tricky modifications to the standard technology, which may slow down progress substantially, unless they manage to buy better mineral from abroad, which is unlikely. Anyway, even without the latter complication, centrifuge-based uranium enrichment is a large-scale project that cannot be kept secret and is highly vulnerable. There is plenty of time there.
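For scale, here is the standard separative-work bookkeeping. The value function is textbook material; the tails assay and the per-centrifuge output are my rough assumptions.

```python
# Separative work needed for one ~20 kg charge of 90% U-235, starting from
# natural uranium. The value function V(x) is the standard one; the tails
# assay (0.3%) and centrifuge output figures are rough assumptions.

import math

def V(x):
    """Standard separative-work value function."""
    return (2 * x - 1) * math.log(x / (1 - x))

def swu_per_kg_product(xp, xf, xw):
    """SWU per kg of product, for product/feed/tails assays xp, xf, xw."""
    F = (xp - xw) / (xf - xw)   # kg of feed per kg of product (mass balance)
    W = F - 1                   # kg of tails per kg of product
    return V(xp) + W * V(xw) - F * V(xf)

swu = 20 * swu_per_kg_product(xp=0.90, xf=0.00711, xw=0.003)
print(round(swu), "SWU")   # roughly 3,900 SWU for one 20 kg charge
# At ~1-2 SWU per year for an early-generation centrifuge (a rough
# assumption), that is a few thousand centrifuge-years -- hence
# "thousands of centrifuges, operated for years".
```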

Is that all?
No.
The Iranians claim they disbanded their laser isotope separation (LIS) project in 2003. LIS is quite high-tech stuff, but it can be implemented as a small-scale operation yielding relevant amounts of U-235 of excellent purity. If you want to hide, and you are smart enough to master the technology, that's the way to go.
So the key question, as I see it, is now the following. Was the Lashkar Abad LIS pilot facility really dismantled or was it just moved underground?

On the other hand, if, as widely claimed, LIS is really too difficult to be within Iran's technological reach, I surmise that we can sleep peacefully.
