Wednesday, January 18, 2017

Finding solace in the Multiverse

A string theorist writes:

In physics we’re not supposed to talk about how we feel. We are a hard-nosed, quantitative, and empirical science. But even the best of our dispassionate analysis begins only after we have decided which avenue to pursue. When a field is nascent, there tend to be a range of options to consider, all of which have some merit, and often we are just instinctively drawn to one. This choice is guided by an emotional reasoning that transcends logic. Which position you choose to align yourself with is, as Stanford University physicist Leonard Susskind says, “about more than scientific facts and philosophical principles. It is about what constitutes good taste in science. And like all arguments about taste, it involves people’s aesthetic sensibilities.” ...

Quarks are permanently bound together into protons, neutrons, and other composite particles. “They are, so to speak, hidden behind a … veil,” Susskind says, “but by now, although no single quark has ever been seen in isolation, there is no one who seriously questions the correctness of the quark theory. It is part of the bedrock foundation of modern physics.”
Notice the verbal trickery here. He does not say that no one questions single quark particles.

Susskind wants you to think: "No one has ever seen an isolated quark but everyone believes in them anyway."

But that is not right. Lots of physicists do not even believe in particles at all. A quark is a useful fiction that helps to understand the SU(3) theory, but that's all. There is no need to believe that quark particles are real, or that they have colors.


My own research is in string theory, and one of its features is that there exist many logically consistent versions of the universe other than our own. The same process that created our universe can also bring those other possibilities to life, creating an infinity of other universes where everything that can occur, does. ...

The multiverse explains how the constants in our equations acquire the values they do, without invoking either randomness or conscious design. If there are vast numbers of universes, embodying all possible laws of physics, we measure the values we do because that’s where our universe lies on the landscape. There’s no deeper explanation. That’s it. That’s the answer.

But as much as the multiverse frees us from the old dichotomy, it leaves a profound unease. The questions we have spent so long pondering might have no deeper answer than just this: that it is the way it is. That might be the best we can do, but it’s not the kind of answer we’re used to. It doesn’t pull back the covers and explain how something works. What’s more, it dashes the theorists’ dream, with the claim that no unique solution will ever be found because no unique solution exists.

There are some who don’t like that answer, others who don’t think it even qualifies to be called an answer, and some who accept it. ...

Tasneem Zehra Husain is a theoretical physicist and the author of Only The Longest Threads. She is the first Pakistani woman string theorist.
This is the crazy world view of a string theorist. Their god is the elusive mathematical equation that is going to unify all of physics, even if it explains nothing. Their belief is so strong that they will accept the possibility of essentially infinitely many equations driving infinitely many unobservable universes.

Gian Giudice, head of CERN’s theory group, speaks for most physicists when he says that one look at the sky sets us straight. We already know our scale. If the multiverse turns out to be real, he says, “the problem of me versus the vastness of the universe won’t change.” In fact, many find comfort in the cosmic perspective. Framed against the universe, all our troubles, all the drama of daily life, diminishes so dramatically that “anything that happens here is irrelevant,” says physicist and author Lawrence Krauss. “I find great solace in that.”
Great solace?

Even without the multiverse or anything else cosmological, our lives on the surface of the Earth are infinitesimal compared to just the interior of the Earth. That is already enough to overwhelm his psychological sense of well-being, and I don't believe that anyone really finds any solace in the multiverse.

Monday, January 16, 2017

Quantum gravity from philosophy

From a new paper on Kant and Quantum Gravity:
In quantum gravity space and time lose their status as fundamental parts of the physical reality. However, according to Kant, space and time are the a priori conditions of our experience. Does Kantian characterization of these notions give constraints to quantum gravity, or does quantum gravity make Kantian characterization of space and time an invalid approach? This paper provides answers to these questions with a philosophical approach to quantum gravity.
You are probably going to say that this is self-evidently ridiculous, because physics is about the observable world, not the prejudices of some silly German philosopher. Kant said a lot of stupid stuff about space and time before relativity, and there is no agreement over whether it can be reconciled with relativity.

But all quantum gravity is just philosophy. There is no data or experiment to guide the theory. Researchers are just looking for "a priori conditions of our experience", whatever that means.

Speaking of philosophy, Stanford philosophy professor Helen Longino said:
Sokal has this very sort of old-fashioned idea about science — that the sciences are not only aiming at discovering truths about the natural world but that their methods succeed in doing so.
Outside the hard sciences, I suspect that it is very common for academics to deny that science discovers truths.

Philosophers complain that they get no respect from the hard sciences. Of course they do not. Philosophers will never get respect as long as they deny the discovery of truths.

SciAm blogger John Horgan writes that the point of philosophy is not to discover truth or even to make progress:
What is philosophy? What is its purpose? Its point? The traditional answer is that philosophy seeks truth. But several prominent scientists, notably Stephen Hawking, have contended that philosophy has no point, because science, a far more competent truth-seeking method, has rendered it obsolete. ...

[David Chalmers] concedes that whereas scientists do converge on certain answers, “there has not been large collective convergence to the truth on the big questions of philosophy.” A survey of philosophers carried out by Chalmers and a colleague revealed divisions on big questions: What is the relationship between mind and body? How do we know about the external world? Does God exist? Do we have free will?

Philosophers’ attempts to answer such questions, Chalmers remarks, “typically lead not to agreement but to sophisticated disagreement.” That is, progress consists less in defending truth claims than in casting doubt on them. Chalmers calls this “negative progress.”
It appears that philosophers making negative progress are jealous of other fields that make positive progress.

Saturday, January 14, 2017

Top modern philosophy books

Want to know what passes as a great modern philosophy book? Here are the top ones with some relation to science (and a couple of others):
Most-cited Anglophone philosophy books published since WWII (according to Google Scholar)

These are rounded to the nearest 100. I tried to find all post-WWII philosophy books with at least 5,000 citations. ...

1. Thomas Kuhn, The Structure of Scientific Revolutions (1962) (89,500)

2. John Rawls, A Theory of Justice (1971) (65,000)

8. Karl Popper, Conjectures and Refutations: The Growth of Scientific Knowledge (1963) (15,700)

14. Jerry Fodor, The Modularity of Mind (1983) (12,800)

16. Daniel Dennett, Consciousness Explained (1991) (11,100)

20. Karl Popper, Objective Knowledge:  An Evolutionary Approach (1972) (10,100)

21. Paul Feyerabend, Against Method (1975) (9,900)
Karl Popper, The Open Society and Its Enemies (1945) (9,500)
John Searle, The Construction of Social Reality (1995) (8,400)
Derek Parfit, Reasons and Persons (1984) (8,200)
David Chalmers, The Conscious Mind (1996) (7,400)
Jerry Fodor, The Language of Thought (1975) (7,300)
Daniel Dennett, Darwin's Dangerous Idea (1995) (7,000)
Martha Nussbaum, Women and Human Development:  The Capabilities Approach (2001) (6,900)
Hubert Dreyfus, Stuart Dreyfus & T. Athanasiou, Mind over Machine (2000) (6,800)
John Searle, Intentionality (1983) (6,500)
Popper occasionally said sensible things, but those are disavowed by modern philosophers.

Where are the philosophers who have contributed to modern scientific understanding? These books have hurt the cause of science more than they have helped it.

Monday, January 9, 2017

Isosceles triangles and common sense

Geography professor and popular anthropology writer Jared Diamond writes:
In fact, common sense should be invoked more often in scientific discussions, where it is sometimes deficient and scorned. Scientists may string out a detailed argument that reaches an implausible conclusion contra ...

The proof purported to demonstrate that all triangles are isosceles, i.e. have two equal sides. Of course that conclusion is wrong: most triangles have unequal sides, and only a tiny fraction has two equal sides. ...

The proof tacitly assumed that that perpendicular bisector did intersect the triangle’s base, as is true for isosceles and nearly-isosceles triangles. ...

Conclusion: don’t get bogged down in following the details of a proof, if it leads to an implausible conclusion.
He apparently never understood the flaw, as his reasoning would imply that a nearly-isosceles triangle must be isosceles.
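
For the record, the standard diagnosis of the fallacy is that the bisector of the apex angle and the perpendicular bisector of the base meet outside the triangle whenever the triangle is scalene, while the bogus diagram draws their intersection inside. Here is a quick numerical check, with an arbitrarily chosen triangle of my own (not Diamond's example):

    import numpy as np

    # Hypothetical scalene triangle: base B-C along the x-axis, apex A above it.
    A = np.array([1.0, 3.0])
    B = np.array([0.0, 0.0])
    C = np.array([4.0, 0.0])

    # Direction of the bisector of the apex angle at A:
    u = (B - A) / np.linalg.norm(B - A)
    v = (C - A) / np.linalg.norm(C - A)
    bisector = u + v

    # The perpendicular bisector of BC is the vertical line x = 2.
    t = (2.0 - A[0]) / bisector[0]
    O = A + t * bisector
    print(O)   # roughly (2.0, -1.24): below the base, so outside the triangle

For an isosceles triangle the two lines coincide, which is why the diagram looks plausible in that case.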

Mathematics is all about understanding what is a valid proof, and what is not. Diamond did not get the point.

It figures, as his books are filled with illogical arguments that he takes to be conclusive.

Here is his second example:
This discovery became explained only two decades later by Albert Einstein’s theory of relativity, for which the Michelson-Morley experiment offered crucial support.

Another two decades later, though, another physicist carried out a complicated re-analysis of Michelson’s and Morley’s experiment. He concluded that their conclusion had been wrong. If so, that would have shaken the validity of Einstein’s formulation of relativity. Of course Einstein was asked his assessment of the re-analysis. His answer, in effect, was: “I don’t have to waste my time studying the details of that complex re-analysis to figure out what’s wrong with it. Its conclusion is obviously wrong.” That is, Einstein was relying on his common sense. Eventually, other physicists did waste their time on studying the re-analysis, and did discover where it had made a mistake.
This is a funny one, because the textbooks agree that Michelson-Morley was crucial for the development and demonstration of relativity, but Einstein himself regarded it as unimportant for his contribution. That is because Einstein's work was largely a reformulation of Lorentz's, and Einstein relied on Lorentz's analysis of Michelson-Morley.

The experiment was re-done many times. If Einstein did not care much about the first experiment, why would he bother with the subsequent ones? This is a story of indifference, not common sense.

Diamond ends up arguing that the Clovis people were the first settlers of the Americas. I have no idea about that.

Saturday, January 7, 2017

Predicting quantum supremacy for 2017

SciAm reports:
Quantum computing has long seemed like one of those technologies that are 20 years away, and always will be. But 2017 could be the year that the field sheds its research-only image.

Computing giants Google and Microsoft recently hired a host of leading lights, and have set challenging goals for this year. Their ambition reflects a broader transition taking place at start-ups and academic research labs alike: to move from pure science towards engineering.

“People are really building things,” says Christopher Monroe, a physicist at the University of Maryland in College Park who co-founded the start-up IonQ in 2015. “I’ve never seen anything like that. It’s no longer just research.”

Google started working on a form of quantum computing that harnesses superconductivity in 2014. It hopes this year, or shortly after, to perform a computation that is beyond even the most powerful ‘classical’ supercomputers — an elusive milestone known as quantum supremacy. Its rival, Microsoft, is betting on an intriguing but unproven concept, topological quantum computing, and hopes to perform a first demonstration of the technology.

The quantum-computing start-up scene is also heating up. ...

Academic labs are at a similar point.
I am predicting that none of these groups will achieve quantum supremacy in 2017, but the year will end with everyone predicting it for 2018. And that pattern will repeat for a few years.
Whereas classical computers encode information as bits that can be in one of two states, 0 or 1, the ‘qubits’ that comprise quantum computers can be in ‘superpositions’ of both at once. This, together with qubits’ ability to share a quantum state called entanglement, should enable the computers to essentially perform many calculations at once. And the number of such calculations should, in principle, double for each additional qubit, leading to an exponential speed-up.
Scott Aaronson likes to say that this explanation is wrong, because quantum computers do not necessarily get an exponential speedup.
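
Here is a minimal numpy sketch (my own illustration, not Aaronson's argument) of why the "many calculations at once" picture overstates things: the state of n qubits does involve 2^n amplitudes, but a measurement returns only a single n-bit string.

    import numpy as np

    n = 5
    ket0 = np.array([1.0, 0.0])
    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard gate

    # Put n qubits into an equal superposition of all 2**n basis states.
    state = np.array([1.0])
    for _ in range(n):
        state = np.kron(state, H @ ket0)

    print(len(state))                          # 32 amplitudes for n = 5
    probs = np.abs(state) ** 2                 # Born-rule probabilities
    outcome = np.random.choice(2 ** n, p=probs)
    print(format(int(outcome), "05b"))         # a single 5-bit result, not 32 answers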

These articles do not even cite any skeptics anymore. There is a consensus. They are so deeply committed that I expect them to keep refusing to admit that they have been proven wrong for about ten years after they have been proven wrong. And I predict that they will be proven wrong.

Friday, January 6, 2017

The Trouble with Quantum Mechanics

Steven Weinberg writes The Trouble with Quantum Mechanics in the NY Review of Books:
The development of quantum mechanics in the first decades of the twentieth century came as a shock to many physicists. Today, despite the great successes of quantum mechanics, arguments continue about its meaning, and its future. ...

Probability enters Newtonian physics only when our knowledge is imperfect, ...

Many physicists came to think that the reaction of Einstein and Feynman and others to the unfamiliar aspects of quantum mechanics had been overblown. This used to be my view. After all, Newton’s theories too had been unpalatable to many of his contemporaries. ...

It is a bad sign that those physicists today who are most comfortable with quantum mechanics do not agree with one another about what it all means. The dispute arises chiefly regarding the nature of measurement in quantum mechanics. ...

The introduction of probability into the principles of physics was disturbing to past physicists, but the trouble with quantum mechanics is not that it involves probabilities. We can live with that. The trouble is that in quantum mechanics the way that wave functions change with time is governed by an equation, the Schrödinger equation, that does not involve probabilities. It is just as deterministic as Newton’s equations of motion and gravitation. That is, given the wave function at any moment, the Schrödinger equation will tell you precisely what the wave function will be at any future time. There is not even the possibility of chaos, the extreme sensitivity to initial conditions that is possible in Newtonian mechanics. So if we regard the whole process of measurement as being governed by the equations of quantum mechanics, and these equations are perfectly deterministic, how do probabilities get into quantum mechanics? ...

What then must be done about the shortcomings of quantum mechanics? One reasonable response is contained in the legendary advice to inquiring students: “Shut up and calculate!” There is no argument about how to use quantum mechanics, only how to describe what it means, so perhaps the problem is merely one of words.
This is a very strange complaint. Obviously he understands perfectly well how probability, measurement, and chaos get into quantum mechanics, because there is wide agreement on how to do the calculations that predict experiments.

So his problem is purely philosophical. If this is a problem, then I think that most theories have problems if you take them too literally and ask too many philosophical questions.
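
To be clear about what everyone does agree on, the calculation looks roughly like this (a minimal two-level sketch with a Hamiltonian I made up; hbar = 1). The wave function evolves deterministically under the Schrödinger equation, and probabilities appear only when the Born rule is applied to a measurement.

    import numpy as np
    from scipy.linalg import expm

    H = np.array([[1.0, 0.5],
                  [0.5, -1.0]])           # any Hermitian matrix will do
    psi0 = np.array([1.0, 0.0], dtype=complex)

    t = 2.0
    psi_t = expm(-1j * H * t) @ psi0      # deterministic: psi(t) is fixed by psi(0)

    probs = np.abs(psi_t) ** 2            # Born rule: probabilities of the two outcomes
    print(probs, probs.sum())             # the probabilities sum to 1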

LuMo explains:
When it comes to well-defined laws governing the evolution of probabilities in time, it's just a plain stupidity to suggest that these laws are "troubling" in any sense. Quantum mechanics doesn't change anything about the meaning of probabilities, their relationship to imperfect knowledge, the existence of well-defined equations by which these probabilities evolve as well as discontinuous Bayesian/collapse changes by which the probabilities jump after an observation. The only thing that is new in quantum mechanics is the uncertainty principle (due to the nonzero commutators) which, among other things, forbids any perspective in which two generic observables are perfectly known at the same moment. The nonzero commutators make it unavoidable to talk about probabilities between 0% and 100% i.e. about imperfect knowledge. But what the "knowledge", "imperfect", "probability" etc. mean is exactly the same as it always was. ...

At any rate, I consider Weinberg to be a 100% anti-quantum zealot ... at this point. It's sad.
Weinberg's hangup about probabilities is especially strange. He says that probabilities enter classical mechanics "when our knowledge is imperfect", and enter quantum mechanics because "not everything can be simultaneously measured." Okay, I can accept that, but why is it a problem? Our knowledge is always imperfect in the classical case because of observation errors, and always imperfect in the quantum case for the additional reason that it is impossible to predict the measurement of variables that cannot be simultaneously measured. So yes, probabilities are appropriate in either case.

I can only infer that Weinberg has some conceptual misunderstanding of probability, but I don't see what it is.

He favorably describes the many-worlds interpretation, but does not endorse it.

Physics professor Frank Tipler does endorse the many worlds:
Most physicists, at least most physicists who apply quantum mechanics to cosmology, accept Everett’s argument. So obvious is Everett’s proof for the existence of these parallel universes, that Steve Hawking once told me that he considered the existence of these parallel universes “trivially true.” Everett’s insight is the greatest expansion of reality since Copernicus showed us that our star was just one of many. Yet few people have even heard of the parallel universes, or thought about the philosophical and ethical implications of their existence.
Quantum mechanics is a theory of physics on an atomic scale, so only crackpots apply quantum mechanics to cosmology. Maybe most of them believe in many-worlds, I don't know, but I really don't think that most physicists do.

Thursday, January 5, 2017

Blaming Einstein for the name Relativity

Jim Holt writes:
In physics, as Emmy Noether showed us with her beautiful theorem, invariance turns out to entail the conservation of energy and other bedrock conservation principles — "a fact," noted Richard Feynman, "that most physicists still find somewhat staggering."    

And in the mind of Albert Einstein, the idea of invariance led first to e = mc^2, and then to the geometrization of gravity.

So why aren't we hearing constantly about Einstein's theory of invariance? Well, "invariant theory" is what he later said he wished he had called it. And that's what it should have been called, since invariance is its very essence. The speed of light, the laws of physics are the same for all observers. They're objective, absolute — invariant. Simultaneity is relative, unreal.  

But no. Einstein had to go and talk about the "principle of relativity." So "relativity"—and not its opposite, "invariance"—is what his revolutionary theory ended up getting labeled.
No, Einstein did not conceive the geometrization of gravity. As a recent paper noted:
It is generally believed that Einstein identified gravitation with the non-Euclidean geometry of spacetime. However, contrary to common belief, as Lehmkuhl showed [7], Einstein himself did not believe that general relativity geometrized gravitation: "I do not agree with the idea that the general theory of relativity is geometrizing Physics or the gravitational field" [8].

[7] D. Lehmkuhl, Why Einstein did not believe that General Relativity geometrizes gravity. Studies in History and Philosophy of Physics, Volume 46, May 2014, pp. 316-326.
[8] A letter from Einstein to Lincoln Barnett from June 19, 1948; quoted in [7].
Einstein also did not originate the terms "relativity" or "principle of relativity". As best I can determine, Maxwell invented the term "relativity", and Poincare popularized the "principle of relativity", long before Einstein ever wrote anything on the subject.

In short, relativity was named by those who invented it, not Einstein.

Holt likes "invariance" because it suggests a group action, but Einstein missed that. The group is called the "Lorentz group", because that is what Poincare called it in a 1905 paper that was published before Einstein submitted his famous relativity paper. Poincare named it after Lorentz, who did pioneering work on it ten years earlier.

I don't see why "invariance" is a better name anyway. The theory's origin was with attempts to understand the motion of the Earth, with a key observation being that the motion we see is relative to the Earth's frame of reference. The theory shed new light on an ancient and easy-to-understand question. Getting to invariants is a little more obscure.

Tuesday, January 3, 2017

The future is uncertain

Professor Anthony Sudbery writes in Aeon mag, as an intro to his forthcoming book:
Aristotle formulated the openness of the future in the language of logic. Living in Athens at a time when invasion from the sea was always a possibility, he made his argument using the following sentence: ‘There will be a sea-battle tomorrow.’ One of the classical laws of logic is the ‘law of the excluded middle’ which states that every sentence is either true or false: either the sentence is true or its negation is true. But Aristotle argued that neither ‘There will be a sea-battle tomorrow’ nor ‘There will not be a sea-battle tomorrow’ is definitely true, for both possibilities lead to fatalism; if the first statement is true, for example, there would be nothing anybody could do to avert the sea-battle. Therefore, these statements belong to a third logical category, neither true nor false. In modern times, this conclusion has been realised in the development of many-valued logic. ...


Knowledge of the future, therefore, is limited in a fundamental way. It is not that there are true facts about the future, but the knowledge of them is not accessible to us; there are no facts out there, and there is simply no certain knowledge to be had. Nevertheless, there are facts about the future with partial degrees of truth. We can attain knowledge of the future, but that knowledge will always be uncertain.
His explanation is sensible enough, but it is funny how he has to dive into quantum mechanics to reach the same conclusions that Aristotle reached 2.5 millennia earlier.

Yes, Laplace mentioned a deterministic fantasy in 1814, but he also had a very probabilistic view of the world.

Monday, January 2, 2017

Awe for the Second Law of thermo

Harvard psychology professor (and popular Jewish atheist science writer) Steven Pinker writes:
Why the awe for the Second Law? The Second Law defines the ultimate purpose of life, mind, and human striving: to deploy energy and information to fight back the tide of entropy and carve out refuges of beneficial order. ...

The biggest breakthrough of the scientific revolution was to nullify the intuition that the universe is saturated with purpose: that everything happens for a reason. In this primitive understanding, when bad things happen — accidents, disease, famine — someone or something must have wanted them to happen.
So science eliminated all these diverse purposes for bad things, and unified them into one central purpose that explains them all?

This is like ancient Judaism discovering that river gods, weather gods, and all the other gods could be combined into one true God. Or early scientists discovering that all known energy sources could be traced to the one Sun.

Scott Aaronson endorses Pinker's essay and adds:
Again and again, people imagine that, if their local pocket of order isn’t working how they want, then they should smash it to pieces, since while admittedly that might make things even worse, there’s also at least 50% odds that they’ll magically improve.  In reasoning thus, people fail to appreciate just how exponentially more numerous are the paths downhill, into barbarism and chaos, than are the few paths further up.  So thrashing about randomly, with no knowledge or understanding, is statistically certain to make things worse: on this point thermodynamics, common sense, and human history are all in total agreement.
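
His counting claim is easy to illustrate with a toy model of my own (not his): if "order" means that few of a system's components are out of place, ordered configurations are exponentially rare, so an unguided change almost always lands in the disordered bulk.

    import math

    n = 100                        # components, each either "in place" or not
    for k in (0, 1, 5, 50):
        print(k, math.comb(n, k))  # number of configurations with k components out of place
    # 0 -> 1, 1 -> 100, 5 -> about 7.5e7, 50 -> about 1.0e29
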
He doesn't realize it, but he is making an argument for political conservatism. The policies of Barack Obama and Hillary Clinton have led us downhill into barbarism and chaos. We need a government that is willing to go back to what has worked before, and make America great again.

Wednesday, December 28, 2016

Flaws in Bohmian mechanics

LuMo rightly trashes Bohm's quantum mechanics:
In the first sentence, Bohmian mechanics is promoted as "one interpretation that manages to skip... all the mysterious ideas". This is, of course, rubbish. The thing that Bohmian mechanics skips is that the world is quantum mechanical, not classical. And this "skipping" is a fundamental and lethal flaw, not a virtue, of Bohmian mechanics because it's the quantum mechanical nature of our theories that is absolutely needed to get an agreement between the reality and the experiment. It's been needed for more than 90 years. It's a long enough period of time for people to notice.

Moreover, while Bohmian mechanics is a classical theory, it in no way "skips" bizarre features. In particular, Bohmian mechanics has to introduce straight non-local influences – which are really voodoo and have been known to be prohibited by the 1905 theory of relativity. Also, it contains new classical waves that spread and their number and dilution is constantly getting out of control. A "janitor" that would clean all this mess – the spreading omnipresent wave functions that are no longer needed for any predictions and won't be observed – would be badly needed.
In spite of this, the theory still has defenders, such as this Quora article:
Physicists today remain largely unaware of the fact that quantum mechanics is perfectly choreographed by the mathematics of the de Broglie-Bohm theory, otherwise known as Bohmian mechanics. Despite the fact that Bohm’s formalism is entirely deterministic, and less vague than the standard interpretation of quantum mechanics, so far it has only been widely recognized and embraced among philosophers of physics. ...

To dive right in, let us note that in addition to the Schrödinger equation, which is shared among all quantum mechanical interpretations, Bohmian mechanics[1] is completed by the specification of actual particle positions, which evolve (in configuration space) according to the guiding equation. This combination elegantly restores determinism into the dynamics of physical reality; accounting for all the phenomena governed by nonrelativistic quantum mechanics—from spectral lines and scattering theory to superconductivity, the quantum hall effect, quantum tunneling, nonlocality, and quantum computing.

On top of this, Bohm’s theory magnificently elucidates state evolution without elevating the role of the observer to something mystical.[2] This reveals that the stochastic property of the orthodox approach of quantum mechanics, which manifests in state vector reduction, is merely a reflection of the incompleteness of that approach.[3] ...

Bohm’s model has been praised as a cure to the conceptual difficulties that have plagued quantum mechanics because it elegantly does away with much of the subjectivity and vagueness found in the standard approach. Despite this, mainstream physicists haven’t embraced this interpretation, or examined it in depth. In fact, the large majority of them haven’t even heard of it. This is embarrassing, surprising and frustrating.[5] If Bohmian mechanics provides a cure to modern quantum mechanical philosophic complacency, then why have there been so few to study the richness of this elegant formalism? ...

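For reference, the "guiding equation" mentioned above is, in one dimension with hbar = m = 1, dQ/dt = Im(psi'(Q)/psi(Q)). Here is a minimal numerical sketch with a made-up wave function (a boosted Gaussian), just to show what the formalism computes:

    import numpy as np

    x = np.linspace(-10, 10, 2001)
    k = 2.0                                   # momentum of the boosted Gaussian
    psi = np.exp(-x**2 / 4) * np.exp(1j * k * x)

    dpsi = np.gradient(psi, x)                # numerical derivative of psi
    v = np.imag(dpsi / psi)                   # Bohmian velocity field on the grid

    Q = 0.0                                   # a hypothetical particle position
    print(np.interp(Q, x, v))                 # approximately 2.0, as expected
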
David Bohm was a Jewish Communist ex-American with peculiar beliefs and a cult following.

Some people claim that Bohmian mechanics is equivalent to quantum mechanics, at least for very simple systems. So you can believe in it if you wish. Where they lose me is when they claim that it is more intuitive, that it is a causal interpretation, and that it explains the nonlocality of nature.

These claims are all absurd. Conventional physics, including quantum mechanics, is local and causal. Believing in Bohm's theory is like believing in ghosts and magic, with electrons having weird nonlocal effects with no causal explanation.

Science has always been able to give causal explanations for events. Bohm abandons this for silly philosophical reasons, not because of any observations or weaknesses in theory.

Wednesday, December 21, 2016

No climate science evidence needed

Here is the current Rationally Speaking podcast:
Over the last two decades, the Evidence-Based Medicine (EBM) movement has transformed medical science, pushing doctors to rely less on intuition or "common wisdom" in choosing treatments, and more on evidence from studies. Sounds great -- but has EBM become a victim of its own success? This episode features John Ioannidis, Stanford professor of medicine, health and policy, and statistics, and author of the famous paper, "Why Most Published Research Findings are False." John and Julia discuss how EBM has been "hijacked," by whom, and what to do about it.
Ioannidis makes a lot of great points about the fallibility of research papers in medicine and social sciences, but he loses me with this:
There's some things in science that we're very, very close to 100% certain about them. It's like 99.999% — like climate change and the fact that humans are making a difference in that regard, or smoking is killing people. It will kill a billion people in the next century unless we do something.

It's 99.999%. I think that it makes a huge difference, compared to pseudoscience claims that are “100% correct” and there's no way that you can reach a different conclusion, in that we're always open to evidence, and open to understanding what that evidence means.

I don't think we need more evidence about smoking and about climate change. I think that we've had enough.
Really? No need for more evidence?

I get worried when our most extreme skeptics refuse to be skeptical about some things. Even assuming that humans are making a difference in the climate (which is probably correct), we need a lot more evidence before we can adopt reasonable policies. We need more smoking evidence also. This opinion is strange.

Monday, December 19, 2016

Purpose versus scientific worldview

Robert Wright writes in the NY Times philosophy column:
You could call these the “Three Great Myths About Evolution and Purpose.”

Myth number one: To say that there’s in some sense a “higher purpose” means there are “spooky forces” at work.

When I ask scientifically minded people if they think life on earth may have some larger purpose, they typically say no. If I ask them to explain their view, it often turns out that they think that answering yes would mean departing from a scientific worldview — embracing the possibility of supernatural beings or, at the very least, of immaterial factors that lie beyond scientific measurement. ...

Myth number two: To say that evolution has a purpose is to say that it is driven by something other than natural selection.

The correction of this misconception is in some ways just a corollary of the correction of the first misconception, but it’s worth spelling out: Evolution can have a purpose even if it is a wholly mechanical, material process — that is, even if its sole engine is natural selection. After all, clocks have purposes — to keep time, a purpose imparted by clockmakers — and they’re wholly mechanical. Of course, to suggest that evolution involves the unfolding of some purpose is to suggest that evolution has in some sense been heading somewhere — namely, toward the realization of its purpose. Which leads to:

Myth number three: Evolution couldn’t have a purpose, because it doesn’t have a direction.

The idea that evolution is fundamentally directionless is widespread, in part because one great popularizer of evolution, Stephen Jay Gould, worked hard to leave that impression.
This is heresy to modern evolutionists and philosophers of science.

Aristotle and Darwin talked about purposes as central to scientific study, but modern scientific dogma is that nothing has a purpose. Furthermore, any talk of purpose is just a sneaky way to infect Science with God, and thus must be resisted as unconstitutional and contrary to the Scientific Revolution.

Here are Aristotle's four causes:
Aristotle held that there were four kinds of answers to "why" questions (in Physics II, 3, and Metaphysics V, 2):[2][4][5]

Matter: a change or movement's material "cause", is the aspect of the change or movement which is determined by the material that composes the moving or changing things. For a table, that might be wood; for a statue, that might be bronze or marble.

Form: a change or movement's formal "cause", is a change or movement caused by the arrangement, shape or appearance of the thing changing or moving. Aristotle says for example that the ratio 2:1, and number in general, is the cause of the octave.

Agent: a change or movement's efficient or moving "cause", consists of things apart from the thing being changed or moved, which interact so as to be an agency of the change or movement. For example, the efficient cause of a table is a carpenter, or a person working as one, and according to Aristotle the efficient cause of a boy is a father.

End or purpose: a change or movement's final "cause", is that for the sake of which a thing is what it is. For a seed, it might be an adult plant. For a sailboat, it might be sailing. For a ball at the top of a ramp, it might be coming to rest at the bottom. ...

Explanations in terms of final causes remain common in evolutionary biology.[23][24] It has been claimed that teleology is indispensable to biology since the concept of adaptation is inherently teleological.[24] In an appreciation of Charles Darwin published in Nature in 1874, Asa Gray noted "Darwin's great service to Natural Science" lies in bringing back Teleology "so that, instead of Morphology versus Teleology, we shall have Morphology wedded to Teleology".
Sure, I don't mind saying that the purpose of a clock is to keep time, or that sailing is the purpose of a sailboat.

Maybe that is just unimportant semantics. But what about people who act with purpose? That view seems essential to understanding humans, but there are many big-shot professors who claim that this is wrong, and that we are all mindless automatons.

Update: Massimo Pigliucci offers his own rebuttal, and adds:
Regardless, I think Wright makes a very good point when he writes: “When an argument for higher purpose is put this way — that is, when it doesn’t involve the phrase ‘higher purpose’ and, further, is cast more as a technological scenario than a metaphysical one — it is considered intellectually respectable. … Yet the simulation hypothesis is a God hypothesis … Theology has entered ‘secular’ discourse under another name.”

Saturday, December 17, 2016

QC progresses, except for market and qubits

AAAS Science mag has an article (2 DECEMBER 2016 • VOL 354 ISSUE 6316 p 1091) about the many companies pouring hundreds of millions of dollars into quantum computer research, promising results real soon now:
One thing is certain: Building a quantum computer has gone from a far-off dream of a few university scientists to an immediate goal for some of the world’s biggest companies.
Here is the explanation of QC, which Scott Aaronson would hate:
QUBITS OUTMUSCLE classical computer bits thanks to two uniquely quantum effects: superposition and entanglement. Superposition allows a qubit to have a value of not just 0 or 1, but both states at the same time, enabling simultaneous computation.

Entanglement enables one qubit to share its state with others separated in space, creating a sort of super-superposition, whereby processing capability doubles with every qubit. An algorithm using, say, five entangled qubits can effectively do 2^5, or 32, computations at once, whereas a classical computer would have to do those 32 computations in succession. As few as 300 fully entangled qubits could, theoretically, sustain more parallel computations than there are atoms in the universe.
This is the way most people explain it, but Aaronson complains that it is wrong. He says that it leads people to overestimate what quantum computers can do.
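
The "more parallel computations than there are atoms in the universe" line is really about the size of the state space, not about independent computations. The arithmetic behind it (using the commonly quoted figure of roughly 10^80 atoms in the observable universe) is simple:

    import math

    print(300 * math.log10(2))   # about 90.3, so 2**300 is roughly 10**90
    # That exceeds the commonly quoted 10**80 atoms in the observable universe,
    # but it counts the amplitudes describing the state, not independent computations.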

Buried in all the hype are a couple of admissions:
They say they have a mutual interest in publicizing their advances, not least so that potential customers can think about how they could use a quantum computer. “We all need a market,” Monroe says.

What’s more, nobody knows enough about quantum computing yet to go it alone with a single qubit type. Every approach needs refining before quantum computers can be scaled up.
In other words, there is no market for quantum computers, and the researchers have not even created one scalable qubit.

Update: I was also amused to see the article claim that 50 qubits may be needed to demonstrate quantum supremacy, and Google is claiming that it will soon get 49 qubits. My guess is that Google secretly realizes that it is not getting quantum supremacy.

Friday, December 16, 2016

Comic about quantum computing misconception

Scott Aaronson announces an SMBC cartoon about quantum computing.

Scott tries to explain quantum computing as not really a matter of qubits being 0 and 1 at the same time, but rather probabilities being negative or complex, and interfering.

There is some merit to what he says. Schroedinger's cat is not really alive and dead at the same time. If qubits could be 0 and 1 simultaneously, then we would expect an exponential speedup in certain search algorithms, and we do not get one.

Probabilities are not really negative, so he is careful to say "probability amplitudes", but he wants you to think of them as analogous to classical probabilities.

No, this is just nonsense.

It is a little more accurate to say that the qubit is a superposition of 0 and 1. And that measuring the qubit can give 0 or 1. But to the layman, that is just like being 0 and 1 at the same time.

Many physicists explain quantum computing as using superpositions to do simultaneous computations.

Sequentially operating on bits having 0 or 1 values gives us Turing machines, like all known computers. Operating on qubits that are superpositions of 0 and 1 is supposed to give us quantum supremacy, and faster computers for certain types of computations.

Scott says the core of the quantum voodoo is amplitude interference. But all sorts of classical phenomena have interfering waves, and that is not particularly mysterious. It only becomes mysterious when you think of those amplitudes as probabilities or generalized probabilities.
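
For reference, here is what amplitude interference looks like in the standard single-qubit formalism (a minimal sketch, not an endorsement of any interpretation): one Hadamard gate gives 50/50 outcomes, two in a row make the |1> contributions cancel, which ordinary probabilities cannot do.

    import numpy as np

    H = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2)   # Hadamard gate
    ket0 = np.array([1.0, 0.0])

    once = H @ ket0              # amplitudes (0.707, 0.707)
    twice = H @ H @ ket0         # amplitudes (1, 0): the |1> contributions cancel
    print(np.abs(once) ** 2)     # [0.5 0.5]
    print(np.abs(twice) ** 2)    # [1. 0.]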

Saying that they are probabilities is just a sneaky way of pretending that the qubit really is a 0 or 1 (or both). The qubit is not a 0 or 1 or a probability. A measurement gives a 0 or 1, and we can give a probability based on our knowledge of how the system was set up, but that is not what the qubit is. It is not an amplitude either. The amplitude is just a way of codifying prior knowledge.

We have no numerical equivalent for the state of a qubit.

Scott concedes that quantum supremacy has never been demonstrated and we do not know whether or not it is possible. He sure is opinionated about something that may not exist.

LuMo likes the comic, although he cannot resist some cheap shots. He agrees that complex numbers are fundamental to quantum mechanics, because [x,p] = xp-px is anti-symmetric, and hence has imaginary eigenvalues.

This argument is unpersuasive. Lots of real matrices have that property, such as the 2x2 matrix [(0,1),(-1,0)]. (Sorry I am not using mathjax.) [x,p] is not directly observable, so the imaginary eigenvalues pose no problems. It implies the uncertainty principle, whether using real or complex numbers.
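
A quick numerical check of that example (my own sketch):

    import numpy as np

    A = np.array([[0.0, 1.0],
                  [-1.0, 0.0]])        # real and antisymmetric
    print(np.linalg.eigvals(A))        # purely imaginary eigenvalues, +i and -i
    # The characteristic polynomial is lam**2 + 1 = 0, even though every entry is real.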

LuMo buys into Scott's line almost verbatim:
The wave functions are closer to probabilities but they're not quite the usual probabilities. Instead, they're probability amplitudes which are complex and also have the ability to constructively or destructively interfere. When one is observing anything, the amplitudes are converted to the usual probabilities only. But when no one is looking, the probability amplitudes evolve as a new entity according to new rules that have no counterparts in classical physics.
Sorry, but this is just not helpful. If you are doing classical mechanics, such as predicting the location of the Moon in 1000 years, you compute probabilities. The Moon's position has probabilities that evolve with time. Observations tell us where it really is (to within measurement error). All this talk of probabilities as being unique to quantum mechanics is misleading.

So is talk of probability amplitudes interfering. Classical waves interfere also. Probabilities do not really interfere in either classical or quantum mechanics.
There's no "splitting of the worlds" during a quantum computation. On the contrary, the splitting of the worlds may only make sense after a measurement which can only occur after decoherence – but the quantum computation depends on the absence of any decoherence (I will make the same observation again later). So a key necessary condition for the quantum computer to work – and do some things that are practically impossible on classical computers – is that there's no decoherence and no splitting of the world during the calculation.
LuMo is right for the wrong reason. I say that there is never any splitting of the worlds, and never any quantum computation.

If a quantum supremacy computation did exist, it would have to somehow take advantage of a qubit staying in a superposition, and not decohering. In the lingo of many-worlds, the qubit has to straddle different worlds. Supposedly quantum supremacy is possible because qubits can leave this world and do some of its computation in a parallel world. I am not buying it, but that is the theory.

Thursday, December 15, 2016

Essays: Wandering Towards a Goal

FQXi announces:
the next $40,000 FQXi essay contest,...

This year’s theme is: Wandering Towards a Goal: How can mindless mathematical laws give rise to aims and intentions?

One way to think of physics is as a set of mathematical laws of dynamics. These laws provide predictions by carrying conditions at one moment of time inexorably into the future. But many phenomena admit another description – sometimes a vastly more useful one – in terms of long-term, large-scale goals, aims, and intentions. ...

Relevant essays might address questions such as:

* How did physical systems that pursue the goal of reproduction arise from an a-biological world?

* What general features — like information processing, computation, learning, complexity thresholds, and/or departures from equilibrium — allow (or proscribe) agency?

* How are goals (versus accomplishments) linked to “arrows of time”?

* What separates systems that are intelligent from those that are not? Can we measure this separation objectively and without requiring reference to humans?

* What is the relationship between causality – the explanation of events in terms of causes – and teleology – the explanation of events in terms of purposes?

* Is goal-oriented behavior a physical or cosmic trend, an accident or an imperative?

We are accepting entries from now until March 3, 2017, with winners announced in June.
I submitted some essays to previous contests.