Monday, April 6, 2020

What is Mathematics?

A math blog asks what mathematics is, and provides only silly answers:
Any argument carried out with sufficient precision.

Using arguments with more than two steps.

Mathematics is what mathematicians do.

Mathematics is the branch of natural philosophy that concerns itself with only making true statements.

Mathematics ought to be considered as a set of precise, symbolic, languages that serves as a lingua franca for the physical sciences.
All of these answers are unsatisfactory.

Mathematics is knowledge obtained by logical proofs.

Saying that mathematics is a language is like saying music or philosophy is a language. Sure, they use language to communicate, but so does everyone else.

Using phrases like "sufficient precision" ignores the fact that some arguments are proofs, and some are not. Math demands proofs.

Saying "true statements" comes the closest to describing math, but of course many other fields claim to be finding truth. Only math finds it with logical proofs.

Monday, March 30, 2020

The quantum arrow of causality

A new paper argues that Quantum causality determines the arrow of time:
Eddington introduced the concept of the arrow of time - the one way flow of time as events develop and our perceptions evolve. He pointed out that the origin of such an arrow appears to be a mystery in that the underlying laws of physics (at least at the time of Eddington) are time symmetric and would work equally well if run in the reverse time direction. The laws of classical physics follow from the minimization of the action and are indeed time symmetric. This view has been beautifully captured by Carlo Rovelli [1], who writes: "The difference between past and future, between cause and effect, between memory and hope, between regret and intention...in the elementary laws that describe the mechanisms of the world, there is no such difference." But what picks out only those solutions running forward in time?

By now, there is a large literature on the arrow of time [1-10]. Essentially all of the literature accepts the proposition that the fundamental laws of physics do not distinguish between past and future and could equally well be run backwards. There is also a recognition that the second law of thermodynamics does distinguish between these directions as it states that entropy cannot decrease in what we refer to as the future. This leads to the idea of a thermodynamic arrow of time. Many view this thermodynamic arrow as the origin of the passage of time, or at least of our consciousness of that passage.

Our point in this paper is that the basic premise of such reasoning is not valid in quantum theory. Quantum physics in its usual form has a definite arrow of causality - the time direction that causal quantum processes occur. ...

The basic statement saying that the fundamental laws of physics do not differentiate an arrow of time is not correct. At the microscopic level, reactions run in one direction but not the other.
I think this paper is correct in that (1) a common view today is that the thermodynamic arrow of time is the only one; and (2) this view is mistaken.

Thursday, March 26, 2020

The joy of quantum computation

Scott Aaronson, aka Dr. Quantum Supremacy, writes:
what potential use of quantum computers brings you the most joy?

As I’ve said in dozens of my talks, the application of QC that brings me the most joy is refuting Gil Kalai, Leonid Levin, and all the others who said that quantum speedups were impossible in our world.
Okay, quantum computers have the potential of proving us QC skeptics wrong. Is that what he is saying?

If he had already proved us wrong, then he would presumably say so, and celebrate his joy.

He has already endorsed Google's claim of "quantum supremacy", but he uses the term "quantum speedups" here. The Google computation was not really a speedup of anything, so I guess he is admitting that no quantum speedups have been achieved.

The Google computation was one that would be hard to simulate on a Turing computer. I don't think anyone denied that some quantum experiments could be hard to simulate. The question is whether some quantum computer can get some huge speedup in solving some problem, over what a Turing computer can do.

When Google or someone else demonstrates a quantum speedup, I will be glad to bring Scott the joy of being right about it.

Monday, March 23, 2020

Poincare bio essay credits Einstein

One of the strange things about Albert Einstein is how everyone goes out of his way to credit him, whether appropriate or not.

For example, you would think that an essay about Poincare, or any other scientist, would just mention what he did, and not bother crediting others who are not the subject of the essay. That is usually the case, except when it relates to Einstein.

If any essay mentions someone's contribution to relativity, it always makes a point of saying that it was inferior to Einstein's contribution. I don't think I have ever seen any exception.

Here is Henri Poincaré: a short intellectual biography:
Poincaré was also interested in the philosophical debate on time measurement, stemming from practical work on determination of longitude and his scientific interest in electromagnetic and optical phenomena. He introduced the idea of relative motion and the rejection of Newtonian absolute time by recognising that the measurement of duration or simultaneity involves the introduction of convention. Many have been fascinated about the relationship between Poincaré and Einstein and how much the special theory of relativity was conceived by the former. His 1906 lectures On the Limits of the Law of Newton engage with the laws of relativity. While his discovery of the Lorentz group made the ether irrelevant, Poincaré's search for a dynamical solution to the observed Lorentz contractions and dilations prevented him from going as far as Einstein into the theory of relativity.
Note the last sentence says Poincare's relativity was inferior to Einstein's.

The sentence doesn't even make any sense. Einstein said he looked for a dynamical solution to the contractions. The dynamical explanation was Lorentz's, not Poincare's.

Einstein didn't really get any farther into relativity while Poincare was alive. Einstein did get farther after Poincare died. But even if Einstein did discover something that Poincare didn't know, why is it so important to mention? The essay mentions many other Poincare contributions, but does not describe any of them as inferior to those of others.

I believe that this is the result of the influence of an Einstein cult. It is harder to say why Einstein is worshiped so much.

Friday, March 20, 2020

Positivist perspective on predictability

I submitted an essay to the FQXi essay contest I mentioned before. You can comment on it there. The contest deadline was extended a week, so you can still submit an entry.

Thursday, March 19, 2020

We cannot calculate human decisions

Sabine Hossenfelder writes:
This means that large objects - basket balls, cucumbers, you - behave according to the same laws as elementary particles. Even though no one in their right mind would use the Standard Model of particle physics to predict the growth of a cucumber, there is nothing, in principle, standing in the way of doing so. Given a sufficiently large and powerful computer, even human decisions could be calculated in finite time.[3] The consequence is that, according to our best current knowledge, the future is already determined, up to the occasional random interference from quantum events.[4] And really all of science is just a sloppy version of physics.

[footnote 4] What you think this implies for the existence of free will depends on your definition of free will. Personally, I would argue this means free will does not exist in any sensible definition of the term, though, of course, no one is forced to use a sensible definition. But regardless of what you think free will means, the statement that the future is determined up to quantum fluctuations remains correct.
I don't know how anyone can believe this.

She says that the quantum-mechanical Standard Model is the fundamental theory that explains everything, so why does she refer to quantum events as "occasional random interference"? All we are is a bunch of quantum events.

This is a very strange view. It sounds as if she believes the true laws of physics are non-quantum, and quantum mechanics just makes them sloppy. Her footnote says that this refutes any sensible notion of free will. So does she think humans are governed by physical laws that do not include quantum mechanics?

I know from her blog that she doesn't even believe in quantum randomness, because she subscribes to superdeterminism. So why even bring it up? Why not stick to your superdeterminism beliefs?

I get the impression that, on this view, there is a good and holy God who brings order to the universe, in the form of deterministic laws, and an evil Devil who keeps corrupting it by injecting quantum uncertainties.

Maybe I shouldn't pick on her, because a lot of other physicists probably have similar views. Very few subscribe to superdeterminism, but a lot probably view quantum events as some sort of devil's work. Weird.

Monday, March 16, 2020

Professor famous for denying objective truth

SciAm blogger John Horgan writes:
I'll start with Kuhn. He is the philosopher of science who argued, in his 1962 book The Structure of Scientific Revolutions, that science can never achieve absolute, objective truth. Reality is unknowable, forever hidden behind the veil of our assumptions, preconceptions and definitions, or “paradigms.” At least that’s what I thought Kuhn argued, but his writings were so murky that I couldn’t be sure. ...

This was typical of how Kuhn spoke. As if to demonstrate his own views on how language obfuscates, he endlessly qualified his own statements. He seemed incapable of saying something in an unambiguous way. But what he was saying was that, even when it came to a question as seemingly straightforward — and vitally important! -- as whether HIV causes AIDS, we cannot say what the “truth” is. We can’t escape interpretation, subjectivity, cultural context, and hence we can never say whether a given claim is objectively right or wrong.

I call this perspective extreme postmodernism. ...

If anything, right-wing postmodernism became even more virulent after Obama’s election in 2008, ... 60 million Americans were infected by swine flu, 274,000 were hospitalized and 12,469 died, according to the CDC.

What can be done about the problem of right-wing postmodernism? I honestly don’t know.
The 2009 flu pandemic was declared an emergency by President Obama and the UN WHO. It will be interesting to compare it to the current Wuhan virus disease, aka COVID-19. This one is causing considerably more panic, and will probably kill a lot more people.

I am pretty sure that the large majority of the Kuhnian paradigm shifters are leftists, not right-wingers. Horgan wants to blame Pres. Trump, but I very much doubt that Trump would even recognize paradigm shifter thinking.

Good luck with the health precautions. I have no special wisdom on the subject.

For a current example of leftist denial of objective truth, see this Angela Saini essay in Nature magazine.
One of the world’s leading universities — University College London (UCL) — has completed an inquiry into its support for the discredited pseudoscience of eugenics. Funds linked to Francis Galton, a racist who believed it was possible to improve the British population through selective breeding, and who founded the Eugenics Records Office at UCL in 1904, continue to line the university’s coffers to the value of more than £800,000 (US$1 million).

The inquiry’s report, released on 28 February, recommended renaming lecture theatres and buildings bearing Galton’s name and that of another prominent geneticist. Although this is welcome, it does not acknowledge just how much yesterday’s mistakes survive in modern science.
She denies that there is any objective way to determine whether Galton's ideas are true or not, but wants to ostracize him for political reasons.
At a philosophy festival last September, I spoke about non-European cultures and their contributions to science and mathematics. One scientist remarked that he had no need to know about what had been done in ‘bongo bongo’ land.
So why would he need to know that?

She is obviously an anti-White-male leftist who denies objectivity so she can promote her own political agenda, such as making everyone learn about science done by Indian women, or some such foolishness.

She is also criticized here.

Friday, March 13, 2020

Trying to make sense of superdeterminism

Sabine Hossenfelder writes:
It does not help that most physicists today have been falsely taught the measurement problem has been solved, or erroneously think that hidden variables have been ruled out. If anything is mind-boggling about quantum mechanics, it’s that physicists have almost entirely ignored the most obvious way to solve its problems.
Her "obvious way" is to replace quantum mechanics with a superdeterministic theory of hidden variables. It says that all mysteries are resolved by saying that they were pre-ordained at the beginning of the big bang.

Peter Shor asks in reply:
I don't see how superdeterminism is compatible with quantum computation.

Suppose we eventually build ion trap quantum computers big enough to factor a large number. Now, suppose I choose a random large number by taking a telescope in Australia and finding random bits by looking at the light from 2000 different stars. I feed these bits into a quantum computer, and factor the number. There's a reasonable likelihood that this number has some very large factors.

Just where and how was the number factored?
I agree with him that it is impossible to believe in both superdeterminism and quantum computation.

Quantum computation cleverly uses the uncertainties about quantum states to do a computation, such as factoring a large integer. If superdeterminism is true, then there aren't really any uncertainties in nature. What seems random is just our lack of knowledge.

If I could make a machine with 1000 qubits, and each can be in 2 states at once, then it seems plausible that a qubit calculation could be doing multiple computations at once, and exceeding what a Turing machine can do. (Yes, I know that this is an over-simplification of which Scott Aaronson, aka Dr. Quantum Supremacy, disapproves.)
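To make the oversimplification concrete, here is a minimal brute-force statevector simulator (my own sketch, nothing Aaronson endorses): the amplitude array doubles with every added qubit, which is why a Turing machine simulating 1000 qubits this way would need 2^1000 amplitudes.

```python
import math

def hadamard(state, target):
    """Apply a Hadamard gate to qubit `target` of a statevector."""
    s = 1 / math.sqrt(2)
    new = state[:]
    for i in range(len(state)):
        if (i >> target) & 1 == 0:
            j = i | (1 << target)
            a, b = state[i], state[j]
            new[i] = s * (a + b)
            new[j] = s * (a - b)
    return new

n = 10                    # a 1000-qubit state would need 2**1000 amplitudes
state = [0.0] * (2 ** n)  # the amplitude array doubles with every added qubit
state[0] = 1.0            # start in |00...0>
for q in range(n):
    state = hadamard(state, q)

print(len(state))          # 1024 amplitudes for just 10 qubits
print(round(state[0], 5))  # 0.03125 = (1/sqrt(2))**10, a uniform superposition
```

This exponential blow-up in classical resources is exactly what makes the "multiple computations at once" picture tempting, even if it is not how the pros describe it.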

But if the uncertainty is all an illusion, then I don't see how it could be used to do better than a Turing machine.

I don't personally believe in either superdeterminism or quantum computation. I could be proved wrong, if someone makes a quantum computer to factor large integers. I don't see how I could be proved wrong about superdeterminism. It is a fringe idea with only a handful of followers, and it doesn't solve any problems.

Update: Hossenfelder and Shor argue in the comments above, but the discussion gets nowhere. In my opinion, the problem is that superdeterminism is incoherent, and so contrary to the scientific enterprise that it is hard to see why anyone would see any value in it. Shor raises some objections, but discussing the issue is difficult because superdeterminists can explain away anything.

Monday, March 9, 2020

Trying to prove many-worlds from first principles

Mordecai Waegell and Kelvin J. McQueen write Reformulating Bell’s Theorem: The Search for a Truly Local Quantum Theory:
The apparent nonlocality of quantum theory has been a persistent concern. Einstein et. al. (1935) and Bell (1964) emphasized the apparent nonlocality arising from entanglement correlations. While some interpretations embrace this nonlocality, modern variations of the Everett-inspired many worlds interpretation try to circumvent it. In this paper, we review Bell's "no-go" theorem and explain how it rests on three axioms, local causality, no superdeterminism, and one world. Although Bell is often taken to have shown that local causality is ruled out by the experimentally confirmed entanglement correlations, we make clear that it is the conjunction of the three axioms that is ruled out by these correlations.

We then show that by assuming local causality and no superdeterminism, we can give a direct proof of many worlds. The remainder of the paper searches for a consistent, local, formulation of many worlds.
I accept those assumptions. Local causality is axiomatic for all of science. Only Gerard 't Hooft and Dr. Bee believe in superdeterminism.

From this they claim to prove many worlds!! No, this is crazy. No set of assumptions can prove the existence of unobservable parallel worlds.

The root of their error is a hidden assumption in favor of hidden variable theories. Such theories have been discarded for a century.

Gizmodo reports:
Scientists studying kea, New Zealand’s alpine parrot, revealed that the famously mischievous birds could understand probabilities, an impressive mental feat.

The pair of researchers put six birds through a series of trials to see how they made decisions when faced with uncertainty. When prompted to choose, the kea generally opted for scenarios where they were more likely to earn a reward. This work is further evidence of some birds’ general intelligence, according to the paper published in Nature Communications.
These parrots must be smarter than the many-worlds advocates.

The above paper admits:
On a branching model, it is difficult to make sense of the probabilistic predictions of quantum mechanics. Pre-measurement, Alice might assert "there is a 0.7 probability that I will see spin-up (rather than spin-down)". But given branching, it seems that Alice should assign probability 1 to there being an Alice descendant that sees spin-up. Pre-measurement Alice is not uncertain of anything; she knows she will branch into descendants that see contrary results in different worlds, and she knows that neither of her descendants is the "real" Alice -- they both have equal claim to being pre-measurement Alice. It is therefore unclear what the "probability 0.7" is a probability of. This aspect of the problem is often referred to as the incoherence problem.
This problem has no solution. If you believe in many-worlds, you have to abandon probabilities.

You might think that the probabilities could be interpreted as the possibility of a particular branching. That is, one possible parallel world has probability 0.7, and the other has 0.3. However, the many-worlds advocates have never been able to make such a theory work.
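The incoherence is easy to see in a toy simulation (my own sketch, not from the paper): sampling each outcome with the Born-rule probability reproduces the 0.7 statistics, while counting the two branches spawned by each measurement can only ever give 1/2.

```python
import random

random.seed(0)

# Born rule: sample each measurement outcome with probability 0.7.
trials = 100_000
born_up = sum(random.random() < 0.7 for _ in range(trials)) / trials

# Naive branch counting: every measurement spawns exactly two branches,
# one "up" and one "down", so the branch fraction is always 1/2,
# regardless of the quantum state.
branch_up = 1 / 2

print(round(born_up, 2))  # close to 0.7, matching quantum statistics
print(branch_up)          # 0.5, which is all branch counting can deliver
```

Any viable many-worlds account has to explain why experiments track the first number and not the second.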

Tuesday, March 3, 2020

Dr. Bee attacks panpsychism

Sabine Hossenfelder writes in Nautilus:
I recently discovered panpsychism. ...

Can elementary particles be conscious? No, they can’t. It’s in conflict with evidence. Here’s why.

We know 25 elementary particles. These are collected in the standard model of particle physics. The predictions of the standard model agree with experiment to best precision.

The particles in the standard model are classified by their properties, which are collectively called “quantum numbers.” The electron, for example, has an electric charge of -1 and it can have a spin of +1/2 or -1/2. There are a few other quantum numbers with complicated names, such as the weak hypercharge, but really it’s not so important. Point is, there are handful of those quantum numbers and they uniquely identify an elementary particle.
She is responding to some pro-panpsychism articles in the same online magazine, such as here.

No, she is not correct. An electron is not determined by its quantum numbers. It also has position and momentum. More importantly, it also has whatever info its wave function attempts to capture.

There is some debate about whether the wave function fully characterizes everything about the electron. Dr. Bee believes in superdeterminism, and under that theory, the answer is certainly not. Superdeterminism requires that electron behavior is predicted by long-ago events that are not captured by the wave function.

Entanglement poses another problem for saying the electron wave function is everything. The combination of these views leads to several paradoxes.
With the third option it is indeed possible to add internal states to elementary particles. But if your goal is to give consciousness to those particles so that we can inherit it from them, strongly bound composites do not help you. They do not help you exactly because you have hidden this consciousness so that it needs a lot of energy to access. This then means, of course, that you cannot use it at lower energies, like the ones typical for soft and wet thinking apparatuses like human brains.

Summary: If a philosopher starts speaking about elementary particles, run.
She lost me here. It seems to me that an electron could have an internal state with a tiny bit of consciousness. I know this sounds goofy, but I don't think she disproved it. She believes in superdeterminism, and that is much goofier.

Monday, February 17, 2020

Randomness cannot be empirically shown

A new paper argues:
We consider the nature of quantum randomness and how one might have empirical evidence for it. We will see why, depending on one's computational resources, it may be impossible to determine whether a particular notion of randomness properly characterizes one's empirical data. Indeed, we will see why even an ideal observer under ideal epistemic conditions may never have any empirical evidence whatsoever for believing that the results of one's quantum-mechanical experiments are randomly determined. This illustrates a radical sort of empirical underdetermination faced by fundamentally stochastic theories like quantum mechanics.
Isn't this obvious?

A lot of people say that quantum mechanics shows that the world is intrinsically random, or objectively random, or some such nonsense. There is no empirical support for such statements. For one thing, there could be a superdeterminism that makes nothing random.

We say that coin tosses are random, because nobody goes to the trouble of tracking all the variables needed to predict the outcome.

We say radioactive decay is random, because there is no known way of predicting the precise decay time. But it seems possible that we could, if we knew more about the state of the nucleus in question.

The paper discusses tests for coin toss sequences to appear random, but we have no way of recognizing intrinsic randomness even if we saw it.
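A toy version of the point (my own sketch, not the paper's tests): a seeded pseudorandom generator is completely determined by its seed, yet its output sails through a simple monobit frequency test.

```python
import math
import random

def frequency_test(bits):
    """Monobit frequency test: |#ones - #zeros| / sqrt(n).
    Values well under ~1.96 are consistent with a fair random source."""
    n = len(bits)
    s = sum(1 if b else -1 for b in bits)
    return abs(s) / math.sqrt(n)

# A seeded pseudorandom generator is fully deterministic:
# the same seed reproduces the identical bit sequence.
random.seed(12345)
run1 = [random.getrandbits(1) for _ in range(100_000)]
random.seed(12345)
run2 = [random.getrandbits(1) for _ in range(100_000)]

print(run1 == run2)          # True: nothing random about it at all
print(frequency_test(run1))  # small statistic: it still looks "random"
```

Any finite battery of statistical tests has the same blind spot: it can reject bad sequences, but passing tells you nothing about whether the source was intrinsically random.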

Wednesday, February 12, 2020

More millions for quantum BS

The NY Times reports:
SAN FRANCISCO — White House officials on Monday unveiled plans to increase federal funding for the development of artificial intelligence and quantum computing, two cutting-edge technologies that defense officials say will play a key role in national security.

The funding, part of the Trump administration’s $4.8 trillion budget proposal, would direct more money for A.I. research to the Defense Department and the National Science Foundation. The administration also wants to spend $25 million on what it calls a national “quantum internet,” a network of machines designed to make it much harder to intercept digital communication.

For several years, technologists have urged the Trump administration to back research on artificial intelligence — which could affect things as diverse as weapons and transportation — and quantum computing, a new way to build super-powerful computers. China’s government, in particular, has made building these machines a priority, and some national security experts worry that the United States is at risk of falling behind.

The proposed spending follows earlier administration moves. In 2018, President Trump signed a law that earmarked $1.2 billion for quantum research. The Energy Department recently began distributing its portion of that money — about $625 million — to research labs in industry, academia and government.

“The dollars we have put into quantum information science have increased by about fivefold over the last three years,” said Paul Dabbar, under secretary for science at the Energy Department, in an interview.
I actually wish that this were legitimate. It would be an exciting area of cryptologic research, and open up a whole new arena for security analysis.

But it is all bogus. There is no practical value to a quantum internet.

Monday, February 10, 2020

Rovelli rejects Eternalism

Physicist Carlo Rovelli just wrote an essay on the philosophy of time, favoring Neither Presentism nor Eternalism. He relies heavily on "Einstein’s conventional definition of simultaneity", without mentioning that the notion is entirely due to Poincare, years before Einstein.

Also:
Shortly after the formulation of special relativity, Einstein's former math professor Minkowski found an elegant reformulation of special relativity in terms of the four dimensional geometry that we call today Minkowski space. Einstein at first rejected the idea. (`A pointless mathematical complication'.) But he soon changed his mind and embraced it full heart, making it the starting point of general relativity, where Minkowski space is understood as the local approximation to a four-dimensional, pseudo-Riemannian manifold, representing physical spacetime.

The mathematics of Minkowski and general relativity suggested an alternative to Presentism: the entire four-dimensional spacetime is `equally real now', and becoming is illusory. This I call here Eternalism.
This is cleverly written to convince you that Minkowski derived a 4D geometry version of relativity from Einstein's work. This is not true.

Poincare was the first to formulate a 4D geometry version of relativity, and that paper was written before Einstein published anything on the subject. Minkowski's 4D space was developed directly from Poincare's work, not Einstein. Minkowski does cite Einstein's paper, but does not use anything from it, and it is not clear that Einstein had any influence on Minkowski at all. From Poincare's paper, Minkowski gets the 4D formalism, the pseudo-Riemannian metric, the 4D Lorentz transformations, and the 4D covariance of Maxwell's equations.
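For concreteness, the 4D formalism at issue can be summarized as follows (a standard textbook statement, not a quotation from either paper):

```latex
% The spacetime interval between events, invariant under Lorentz
% transformations:
\[
  s^2 = c^2 t^2 - x^2 - y^2 - z^2 .
\]
% With Poincare's imaginary time coordinate $x_4 = ict$, the invariant
% becomes a Euclidean sum of squares,
\[
  x_1^2 + x_2^2 + x_3^2 + x_4^2 ,
\]
% so the Lorentz group acts as rotations of a four-dimensional space.
```

This is the structure that appears in Poincaré's 1905-6 paper and that Minkowski later reworked into his spacetime geometry.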
This subtle mistake of McTaggart is the same mistake as that which lies at the root of Eternalism. The ensemble of the events of the world is four-dimensional, and we can embrace it within a single image. But this is not a denial of becoming, no more than a single chart of the British royal dynasties is a denial of the fact that events happened in England along the centuries.
Rovelli is right that believing in relativity and using Minkowski space does not require a belief that all times exist at once. Some people seem to believe that relativity requires determinism and a denial of the present. One can still hold different philosophical views of time.

Tuesday, January 28, 2020

The truth matters

SciAm posts this opinion:
Scientists Must Stand Up for Internationalism ...

Scientists should therefore be extremely concerned that the burgeoning national populist attacks on globalism will ultimately impair scientific progress—if they have not already. Funding for international scientific projects could wither, and foreign scientists may become unwelcome in nations other than their own, restraining information exchanges. We might even witness a reversion to the scientific fragmentation of the 1930s, when some eminent German physicists championed “Deutsche Physik” as superior to that of other nations.

To his great credit, Albert Einstein flatly rejected such nationalistic rhetoric. During World War I, he refused to add his name to the Manifesto of the 93 German Intellectuals, which touted German cultural superiority; instead, he embraced internationalism. When the Nazi regime of Adolf Hitler came to power in 1933, he was in the United States, and soon decided not to return to Germany.
There is a simpler explanation -- Einstein did not consider himself a German.

Einstein lived in Italy for a while as a teenager. He was educated in the French part of Switzerland. He was working in Switzerland when he published his famous papers. His first wife was Serbian. He was a Zionist all his life.
As California Representative Adam Schiff said before the U.S. Senate on January 24, "The truth matters." ...

In doing so, scientists can help lead the world back to the rational, rules-based order that has characterized diplomatic relations for decades.
Whoa! Is he trying to say we should impeach and convict President Trump, and let Deep State hawks revive the Cold War?

SciAm has always had a left-wing political slant, but this is bizarre. Citing Einstein on nationalism and Schiff on truth? Just what is that "rational, rules-based order"?