Monday, February 28, 2022

Quantum Computers cannot break Bitcoin

Bruce Schneier reports:
Researchers have calculated the quantum computer size necessary to break 256-bit elliptic curve public-key cryptography:

Finally, we calculate the number of physical qubits required to break the 256-bit elliptic curve encryption of keys in the Bitcoin network within the small available time frame in which it would actually pose a threat to do so. It would require 317 × 10⁶ physical qubits to break the encryption within one hour using the surface code, a code cycle time of 1 μs, a reaction time of 10 μs, and a physical gate error of 10⁻³. To instead break the encryption within one day, it would require 13 × 10⁶ physical qubits.

In other words: no time soon. Not even remotely soon. IBM’s largest ever superconducting quantum computer is 127 physical qubits.
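
For scale, here is a minimal back-of-the-envelope comparison in Python, using only the figures quoted above and IBM's 127-qubit machine; nothing is assumed beyond those numbers.

```python
# Rough scale check, taking the paper's estimates at face value.
qubits_for_1_hour_break = 317e6   # physical qubits needed to break the key in one hour
qubits_for_1_day_break = 13e6     # physical qubits needed to break the key in one day
largest_ibm_machine = 127         # IBM's largest superconducting device, as noted above

print(f"one-hour attack: {qubits_for_1_hour_break / largest_ibm_machine:,.0f} times more qubits than IBM has")
print(f"one-day attack:  {qubits_for_1_day_break / largest_ibm_machine:,.0f} times more qubits than IBM has")
# Roughly 2,500,000x and 100,000x respectively -- a factor of about a million,
# not the factor of a thousand quoted in the Nature podcast below.
```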

That IBM device doesn't really have even a single qubit. It just has a quantum experiment that can be simulated with 127 qubits.

It appears that Bitcoin will be safe from these attacks for centuries. A more likely outcome is that Bitcoin will be banned as a waste of electricity, and because its main function is to facilitate extortion, contraband, and money laundering.

Now that Russia is being blocked from SWIFT international banking, I wonder if it will sell oil for bitcoin. That could increase demand for bitcoin.

People in the USA and Canada have also been cut off from banking services for political reasons. The bitcoin advocates would presumably say that this underscores the need for a nonpolitical currency.

The International Congress of Mathematicians was planning its big once-every-four-years meeting in St. Petersburg this summer. It is now boycotting Russia and holding the meeting online. This is an unfortunate politicization of Mathematics. St. Petersburg is a long way from Ukraine. There were previous meetings in Peking and Moscow, in spite of the Communist governments.

The current Nature magazine podcast:

Almost everything we do on the Internet is made possible by cryptographic algorithms, which scramble our data to protect our privacy. However, this privacy could be under threat. If quantum computers reach their potential these machines could crack current encryption systems — leaving our online data vulnerable.


To limit the damage of this so called 'Q-day', researchers are racing to develop new cryptographic systems, capable of withstanding a quantum attack.


This is an audio version of our feature: The race to save the Internet from quantum hackers

It says:
Researchers estimate that to break cryptosystems, quantum computers will need to have in the order of 1,000 times more computing components (qubits) than they currently do.

Actually, it will require about a million times more.

Update: Here is the Twitter account of the Russians trying to hold a math conference. They are not supporting the Ukraine invasion, and just want a great math conference. It is too bad that the Russia haters are destroying it.

Thursday, February 24, 2022

Giant Plan to Racialize Science Publishing

Nature magazine reports:
The giant plan to track diversity in research journals

Efforts to chart and reduce bias in scholarly publishing will ask authors, reviewers and editors to disclose their race or ethnicity.

In the next year, researchers should expect to face a sensitive set of questions whenever they send their papers to journals, and when they review or edit manuscripts. More than 50 publishers representing over 15,000 journals globally are preparing to ask scientists about their race or ethnicity — as well as their gender — in an initiative that’s part of a growing effort to analyse researcher diversity around the world.

Nothing good will come of this. Scientific productivity has probably already peaked, and may never return to the glories of the twentieth century.

Tuesday, February 22, 2022

Argument that Science Requires Faith

An essay argues:
The heavens declare the glory of God ...

All of this is to say that, not only is there no inherent conflict between science and Christianity, but the Christian worldview actually motivates and supports the scientific enterprise.

Atheist-evolutionist Jerry Coyne disagrees, of course.

Instead of addressing the theology, look at this argument:

Some believe that science is a superior alternative to faith. But if we peer a little deeper, we see that the scientific method actually requires a great deal of faith before it can even get off the ground. For example, here are five axioms that every scientist (often unconsciously) believes:

The entire physical universe obeys certain laws and these laws do not change with time.
Our observations provide accurate information about reality.
The laws of logic yield truth.
The human mind recognizes the laws of logic and can apply them correctly.
Truth ought to be pursued.

None of these can be proved by science; they must be assumed in order to do any science at all. They are articles of faith.

One could say that they are not really articles of faith, because scientists would abandon them if they turned out to be false. Okay, fine. 

I have some more axioms. Scientists assume:

* The world is real, and not a simulation. There are some scholars who have proposed the simulation hypothesis, and maybe some of them believe in it. But productive scientists do not go for this nonsense.

* Logical reasoning. If you discover some physical truth, then the mathematical and logical consequences are also truths.

* Causality. Events depend on the past light cone, and nothing else. (A formula for this is sketched after this list.)

* Free will. Scientists have the freedom to design experiments that test hypotheses.

* No superdeterminism or many-worlds or any of these other theories that are so contrary to science.
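
For the causality axiom, here is the standard statement in symbols (my own formulation, just to make the light-cone condition explicit):

```latex
% An event at (t, \vec{x}) may depend only on events in its past light cone:
\[
(t',\vec{x}\,') \ \text{can influence}\ (t,\vec{x})
\quad\Longrightarrow\quad
t' \le t \ \ \text{and}\ \ \lVert \vec{x}-\vec{x}\,' \rVert \le c\,(t-t').
\]
```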

Maybe scientists ought to be more explicit about these axioms. More and more I see scientists casually reject one of them, without acknowledging the disastrous consequences.

Friday, February 18, 2022

Many-Worlds cannot Explain the Double-Slit

Sean M. Carroll has posted the February 2022 Ask Me Anything episode of Mindscape.

He is good at explaining physics, but he has enough goofy opinions to make all his judgments questionable.

He blames Republicans for being anti-democracy. I guess he disagrees with showing ID to vote, but he did not explain.

He has typical knee-jerk liberal opinions.

He says that special relativity requires that there be no preferred frame of reference for time, although he seems to know that the cosmic microwave background radiation supplies one.

He believes in eternalism, and that he has no free will.

He admits that string theory is not falsifiable, but defends it anyway, because today's philosophers have declared Popper obsolete. He says string theory provides some high-energy thought experiments, and we would hate to discard it just because it is pseudo-science.

The most revealing question was about how Many-Worlds theory explains the double-slit experiment. Some many-worlds advocates would say that every time a particle reaches the double-slit, the universe splits into two, with a particle going thru each slit in each world. The interference pattern we see is the result of interference between the worlds.

He does not accept this, and prefers to say that the beam is a wave, and so gives an interference pattern.

One does not need quantum mechanics or many-worlds to give that explanation. Sure, all waves give interference patterns, in a setup where the wave interferes with itself.
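
To make "the wave interferes with itself" concrete, here is a minimal sketch of the usual two-slit intensity calculation; the wavelength, slit separation, and screen distance are illustrative values I chose, not tied to any particular experiment.

```python
import numpy as np

# Far-field two-slit interference from plain wave superposition.
# All parameter values below are illustrative.
wavelength = 500e-9            # 500 nm light
slit_separation = 10e-6        # 10 micron slit spacing
screen_distance = 1.0          # 1 m from slits to screen

x = np.linspace(-0.2, 0.2, 2001)                 # positions on the screen (m)
theta = np.arctan(x / screen_distance)           # angle to each position
phase = 2 * np.pi * slit_separation * np.sin(theta) / wavelength

# The amplitude is the sum of the two paths; the intensity is its squared magnitude.
amplitude = 1 + np.exp(1j * phase)
intensity = np.abs(amplitude) ** 2               # proportional to 4 * cos^2(phase / 2)

print("intensity range on the screen:", intensity.min(), "to", intensity.max())
# Bright and dark fringes alternate as the two path phases move in and out of step.
```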

This admission shows how terrible many-worlds is. If many-worlds cannot explain the double-slit, then it cannot explain anything. The double-slit is the most elementary example of quantum mechanics. It was Feynman's favorite.

It is true that many-worlds cannot explain the double-slit, or any other experiment. There is no way to count the splittings, assign probabilities, and calculate the interferences. It is all silly science fiction.

Monday, February 14, 2022

Top Science Official is Fired

Pres. Joe Biden ran on trusting the science, and appointed the first cabinet-level science advisor. Apparently Biden neglected to have him castrated first.

SciAm reports:

On February 7, Eric Lander, White House science adviser and director of the Office of Science and Technology Policy (OSTP) resigned in the wake of an internal investigation.* That investigation into Lander’s management of OSTP found “credible evidence” that he had bullied and mistreated staff. Lander’s own statements and letter of resignation verified these findings.

Lander had to resign—there was no way the Biden administration could allow him to stay while abiding by their stated zero-tolerance principles—but the story shouldn’t end there. ...

Many groups, including 500 Women Scientists, posed serious questions about Lander’s management record before he was appointed to OSTP and named science advisor.

No, forcing him to apologize for some unspecified rude comments does not prove his guilt. It only shows the futility of apologizing to today's woke vultures.

This is so bizarre. He is not accused of sexually harassing women. But 500 women do not like his management style. Are these women experts on management? Why does it matter that women are posing questions?

Politico accuses him of financial conflicts:

Under the White House’s ethics agreement Lander signed, he had 90 days to divest his stocks after he was confirmed by the Senate on May 28. While Lander shed the bulk of that stock in June — including shares of BioNTech SE, the German biotechnology company and Pfizer’s Covid-19 vaccine partner — he waited until Aug. 5 to sell the remaining $500,000 to $1 million worth of stock he held in that company. When Lander ultimately sold the stock 69 days after his confirmation, ...

Lander, who is the richest man in Biden’s cabinet with over $45 million in assets when he was nominated,

Wow, he has been working for universities and govt science labs all his life, and he is worth $45M!

The law said he had 90 days to divest. He sold most immediately, and the rest after 69 days. This seems like compliance with the law to me.
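
A small sanity check of that timeline, using only the dates in the Politico excerpt (the year 2021 is my inference from the context):

```python
from datetime import date

confirmed = date(2021, 5, 28)   # Senate confirmation, per the Politico excerpt
final_sale = date(2021, 8, 5)   # last of the stock sold, per the Politico excerpt
window_days = 90                # divestment window in the ethics agreement

elapsed = (final_sale - confirmed).days
print(f"{elapsed} days elapsed; {'within' if elapsed <= window_days else 'outside'} the 90-day window")
# Prints: 69 days elapsed; within the 90-day window
```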

If he is smart enough to make a fortune for himself, and apply himself to govt service, we should be happy. I worry more about the conflicts of those who were never able to save any money. If he is worth $45M, then obviously he was not taking a govt job for personal profit.

Nature magazine reports:

“Eric Lander is a successful researcher, but everyone knows that he is a bully,” says Kenneth Bernard, an epidemiologist and biodefence researcher who has worked for the US government under several presidential administrations. “He is widely known as arrogant and controlling.” ...

“I am hoping they push women, and especially women of colour, to the top of the list,” says Emily Pinckney, the executive director of 500 Women Scientists

So a bunch of woke women want to replace him with a Black woman.

Lander was in charge of Biden’s Cancer Moonshot initiative, a revival of the Obama administration’s effort to reduce rates of death from cancer, and he was leading efforts to create an Advanced Research Projects Agency for Health, a high-risk, high-reward funding agency to push for biomedical breakthroughs. He was also in charge of the search for a new director of the National Institutes of Health, following the retirement of Francis Collins last year.

In the original moonshot, we hired ex-Nazis, if they had the necessary expertise to get the job done.

Now, appeasing the feelings of oversensitive women, and filling diversity quotas, is much more important than getting anything done. Lander was mostly taken out by some Biden administration woman lawyer who complained that he was rude to her. This shows screwed up priorities, if the snowflake lawyer's hurt feelings matter more than a scientist getting something done.

I previously mentioned gripes about him, including that "Eric Lander is an evil genius at the height of his craft." I went to school with him 46 years ago. I am sure I do not agree with him politically. He should have gone to work for the Trump administration, where they might have let him do some good.

This is not good for science. They created a high-status position just so a science leader would have some clout, and yet he still gets taken down by his personal enemies.

Thursday, February 10, 2022

Philosopher says Free Will cannot be Random

Massimo Pigliucci is a biologist-turned-philosopher who pretends to be an expert on what is scientific. He writes:
“Free” will, understood as a will that is independent of causality, does not exist. And it does not exist, contra popular misperception, not because we live in a deterministic universe. Indeed, my understanding is that physicists still haven’t definitively settled whether we do or not. Free will doesn’t exist because it is an incoherent concept, at least in a universe governed by natural law and where there is no room for miracles. ...

Philosophically naive anti-free will enthusiasts like Sam Harris and Jerry Coyne, among others, eventually started using the Libet experiments as scientific proof that free will is an illusion. But since free will is incoherent, as I’ve argued before, we need no experiment to establish that it doesn’t exist. What Libet’s findings seemed to indicate, rather, is the surprising fact that volition doesn’t require consciousness.

Coyne takes offense at this, as Pigliucci provides no link to the supposedly naive opinion; Pigliucci has since retracted the word "naive".

Pigliucci is the naive one here, and he somehow gets to neuroscience conclusions by saying no experiment is needed. He also has a history of blocking comments that disagree with him, on the grounds that they are rude. So it is amusing to see him make much ruder comments about Coyne and Harris.

Libet-style experiments have been criticized by both philosophers and neuroscientists on a variety of conceptual and methodological grounds, but until recently nobody had empirically addressed the obvious flaw with the whole approach

Actually, the obvious flaws were pointed out in published papers in 2009, 2012, and 2013, as noted on my blog. Pigliucci discusses a 2019 paper, but that paper cites those earlier papers.

Thus it has been established for ten years that Libet experiments tell us nothing about free will.

The real problem with Pigliucci's essay is his pseudoscientific argument that free will can be rejected from first principles. It is pseudoscience because he pretends to rely on scientific knowledge, but rejects any experiment that might prove him right or wrong.

Astrology is likewise a pseudoscience because it uses science to track the stars and planets, but never uses experiments to test the accuracy of its predictions.

Coyne argues:

Free will is not a non-issue, and we know that because many people accept it. For them it is an issue! They accept it because they don’t understand physics, because they embrace duality, or because they believe in God and miracles. You can’t dismiss all those people, for they are the ones who make and enforce laws and punishments based on their misunderstanding that we have libertarian free will. They are the ones who put people to death because, they think, those criminals could have chosen not to pull the trigger.

It is not just those people who believe in free will. Pretty much everyone who has accomplished anything in the last 500 years has believed in free will.

Pigliucci's argument against free will follows the philosophical fallacy of dividing the question into two straw-man cases:

Consider two possibilities: either we live in a deterministic cosmos where cause and effect are universal, or randomness (of the quantum type) is fundamental and the appearance of macroscopic causality results from some sort of (not at all well understood) emergent phenomena. If we live in a deterministic universe then every action that we initiate is the result of a combination of external (i.e., environmental) and internal (i.e., neurobiological) causes. No “free” will available.

If we live in a fundamentally random universe then at some level our actions are indeterminate, but still not “free,” because that indetermination itself is still the result of the laws of physics. At most, such actions are random.

Either way, no free will.

Either way, he is just asserting that the laws of physics prohibit free will, and he ignores any empirical science as irrelevant.

This is no better than the Pope announcing some theological belief based on meditating about the Bible. Pigliucci's flaw is that he misunderstands the concept of "random".

I explained in 2014:

A stochastic process is just one that is parameterized by some measure space whose time evolution is not being modeled.

Unless you are modeling my urges for cheeseburgers, then my appetite is a stochastic process. By definition. Saying that it is stochastic does not rule out the idea that I am intentionally choosing that burger.

Certain quantum mechanical experiments, like radioactive decay or the Stern–Gerlach experiment, are stochastic processes according to state-of-the-art quantum mechanics. That just means that we can predict certain statistical outcomes, but not every event. Whether these systems are truly deterministic, we do not know, and it is not clear that such determinism is really a scientific question.
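
Here is a minimal sketch of the distinction, with a made-up measurement probability standing in for whatever the Born rule assigns in a particular setup:

```python
import random

# A Stern-Gerlach-style measurement treated as a stochastic process.
# Quantum mechanics fixes the statistics; it says nothing about any single event.
p_up = 0.25          # illustrative Born-rule probability for "spin up"
trials = 100_000

ups = sum(random.random() < p_up for _ in range(trials))
print(f"observed frequency of 'up': {ups / trials:.3f} (predicted: {p_up})")
# The long-run frequency settles near 0.25, but nothing in the model
# predicts the outcome of the next individual measurement.
```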

Pigliucci is an example of what a disaster modern philosophy is. About 60 years ago, philosophers abandoned the idea that science can tell us about reality. The scientific method depends on the free will to choose experiments, and if that does not exist, then all of science is bogus.

Free will is the one thing a conscious being can be most sure about. Not having free will is a symptom of schizophrenia.

Denying free will is just one modern idea that is contrary to science. Others I have discussed here are Kuhnian paradigm shift theory, superdeterminism, many-worlds theory, and the simulation hypothesis. Belief in any of these things negates all of science as we know it.

I should note this comment:

Massimo’s argument ... seems to be stating a tautology: Libertarian free will is defined by independence from natural law, therefore it can’t apply in a universe where everything happens in accord with natural law. Absolutely true, and absolutely not news!

In a sense, this is right: Pigliucci used another stupid philosopher fallacy. He finds that free will does not exist by giving a nonsensical definition of it. Just to be clear, I believe in free will, and I believe it is consistent with natural law. Truth does not contradict truth.

This is not unusual either. Most people believe in free will and natural law.

Monday, February 7, 2022

Comparing Quantum Hype to Other Hypes

New paper:
Mitigating the quantum hype
Olivier Ezratty

We are in the midst of quantum hype with some excessive claims of quantum computing potential, many vendors’ and even some research organizations’ exaggerations, and a funding frenzy for very low technology readiness level startups. Governments are contributing to this hype with their large quantum initiatives and their technology sovereignty aspirations. Technology hypes are not bad per se since they create emulation, drive innovations and also contribute to attracting new talents. ...

Artificial intelligence specialists who have been through its last “winter” in the late 1980s and early 1990s keep saying that quantum computing, if not quantum technologies on a broader scale, are bound for the same fate: a drastic cut in public research spendings and innovation funding. Their assumption is based on observing quantum technology vendors and even researchers overhype, on a series of oversold and unkept promises in quantum computing and on the perceived slow improvement pace of the domain. ...

We have seen that the quantum hype has many differentiated aspects compared to past and current technology hypes, the main ones being its technology diversity and the complexity of evaluating its scientific advancements and roadmaps.

The paper has a nice comparison to other technology hype.

It cannot tell us whether the hype is justified, long term. It can only look at examples of hypes.

I am still not sure about the AI hype. It has never fully lived up to expectations, although it has had some huge successes.

It is still unknown whether quantum computers will have any utility.

Wednesday, February 2, 2022

Cannot Count the Many-Worlds Branches

Here is a typical paper trying to make sense out of many-worlds:
But Everett was less than clear in two respects. First, the quantum state can equally be written as a superposition of any set of basis states; what reason is there to single out his ‘branch states’ as distinguished? This is ‘the preferred basis problem’. Second, Everett assumed that a probability measure over branches is a function of branch amplitude and phase, and from this derived the Born rule (the standard probability rule for quantum mechanics). But there is a natural alternative probability measure suggested exactly by his picture of branching, that is not of this form and that does not in general agree with the Born rule: the ‘branch-counting rule’. Let the world split into two, each with a different amplitude: in what sense can one branch be more probable than the other? If Everett is to be believed, both come into existence with certainty, regardless of their amplitudes. After many splittings of this kind, divide the number of branches with one outcome, by the number of branches with another; that should give their relative probability. This is the branch-counting rule, in general in contradiction with the Born rule.

Everett did not reply to either criticism, having left the field even before the publication, in Reviews of Modern Physics in 1957, of his doctoral thesis, all of eight pages in length; he never wrote on quantum mechanics again.

The paper tries hard to count the branches and get probabilities, but there is no way to do it.
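
The conflict is easy to see with a toy two-branch example (the amplitudes here are mine, chosen only for illustration): branch counting weights the branches equally no matter what, while the Born rule weights them by amplitude squared.

```python
import math

# One split into two branches with unequal amplitudes (illustrative numbers).
amp_a = math.sqrt(1 / 3)    # amplitude of branch A
amp_b = math.sqrt(2 / 3)    # amplitude of branch B

born = (amp_a ** 2, amp_b ** 2)      # Born rule: 1/3 and 2/3
counting = (1 / 2, 1 / 2)            # branch counting: one branch each, so 1/2 and 1/2

print("Born rule:      ", tuple(round(p, 3) for p in born))
print("Branch counting:", counting)
# The two rules agree only when the amplitudes happen to be equal.
```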

You can pretend that probabilities come from the Born Rule, but that doesn't make any sense either.

The paper cites arguments for plowing ahead with many-worlds anyway. But if the theory cannot make sense out of probabilities, then what good is it? It cannot do anything, and those who promote it are charlatans.

Scott Aaronson, a recent convert to many-worlds, says:

These days, my personal sense is that Many Worlds is the orthodox position … it’s just that not all of its adherents are willing to come out and say so in so many words! Instead they talk about decoherence theory, the Church of the Larger Hilbert Space, etc. — they just refrain from pointing out the Everettian corollary of all this, and change the subject if someone else tries to. 🙂

This says a lot about the sorry state of modern Physics, that a level-headed guy like Aaronson would believe in such a bizarre fairy tale. And apparently many of the leaders of this cult are unwilling to publicly admit it.

He gave this answer after 500 comments:
There are two branches, after all. What does it mean to have one branch be more probable than another?

I’d say that it simply means: if someone asks you to bet on which branch you’ll find yourself in before the branching happens, then you should accept all and only those bets that would make sense if the probabilities were indeed 1/3 and 2/3, or whatever else the Born rule says they are.

This is no answer. He is saying to follow the Copenhagen interpretation to determine probabilities, and then to bet on those probabilities in a many-worlds theory, even though many-worlds can say nothing about the probability of those worlds.

This means many-worlds is nothing more than taking a theory that predicts probabilities, and pretending that false outcomes live as separate realities.

See also previously posted comments, about that Aaronson post.

Another commenter explains:

Can someone who is a many-worlds person, or Everettian if you prefer, explain to me why the many-world ontology is so appealing or evident to you? I am looking for someone who is a die-hard, every branch is equally-real many-worldser.

Was there one moment where it all clicked for you? Do you believe that there is an uncountable infinity of other branches of the wavefunction that are equally real, equally extant?

Do I just lack imagination, or do I just not get it?

For me it was when I realized that Many-Worlds is what you get when you just take what the Schrödinger equation says as literally true, and stop torturing it with an unphysical and ill-defined collapse. It got reinforced when I was taking a lecture on QFT and realized that the high-energy people simply ignore collapse, for them the theory is completely unitary. Obvious in retrospect: for them relativistic effects are crucial, and how could they ever reconcile that with a nonlocal collapse? ... And yes, all branches are real. There’s nothing in the math to differentiate them.

In other words, he just doesn't want to accept that an observation tells you what is real.

This is like saying: When I toss a coin, theory says that heads and tails each have probability 0.5. I realized that Many-Worlds is what you get when you take both possibilities as literally true, and stop artificially ruling out what does not happen.

In other words, you have a theory that a coin toss gives heads with probability 1/2. That is, heads occurs half the time. The many-worlds guy comes along and says you are torturing the model by excluding the tails that do not happen. So now you say all the possibilities occur all the time, just in parallel worlds. We cannot say that heads occurs half the time, because it occurs all the time. Maybe we could say it occurs in half the worlds, but no one has been able to make mathematical sense out of that. So Scott says the probability is still 1/2, because he would bet on heads that way.

This is stupid circular reasoning. He only bets on coin tosses because of the perfectly good theory that he discards. Once he adopts many-worlds, the bet on heads wins in one branch, and loses in the other.

In comment #618, Aaronson addresses the fact that many-worlds is just conjecturing that all possibilities happen, with no dependence on quantum mechanics:

I, in return, am happy to cede to you the main point that you cared about: namely, that in both the Everettverse and the Classical Probabiliverse, the probabilities of branches could properly be called “objective.” ...

Here’s something to ponder, though: your position seems to commit you to the view that, even if we only ever saw classical randomness in fundamental physics, and never quantum interference, a Probabiliverse (with all possible outcomes realized) would be the only philosophically acceptable way to make sense of it.

But given how hard it is to convince many people of MWI in this world, which does have quantum interference, can you even imagine how hard it would be to convince them if we lived in a Classical Probabiliverse?

In fact, if anti-Everettians understood this about your position, they could use it to their advantage. They could say: “Everettians, like Deutsch, are always claiming that quantum interference, as in the double-slit experiment, forces us to accept the reality of a multiverse. But now here’s Mateus, saying that even without interference, a multiverse would still be the only way to make sense of randomness in the fundamental laws! So it seems the whole interference argument was a giant red herring…” 🙂

Yes, the whole interference argument is a red herring. Many-Worlds is what you get when you deny probabilities, and assume anything can happen. It has nothing to do with quantum foundations.

Comment #637 argues:

Free will is kind of a red herring, as free will is not linked to determinism or non-determinism. A slight majority of philosophers accept this (59%? see Compatibilism), but the gist of the argument is thus:

He then goes on to argue that free will is impossible in naturalist theory, but philosophers have rationalized this by saying we can have a feeling of free will in a deterministic world.

This is an argument that only a learned philosopher would make, as it doesn't make any sense. Any normal person would say that if the theory cannot allow free will, then the theory is deficient.

In discussion about the essence of quantum mechanics, Aaronson argues:

If superposition is the defining feature of QM, then interference of amplitudes is the defining feature of superposition. It’s not some technical detail. It’s the only way we know that superpositions are really there in the first place, that they aren’t just probability distributions, reflecting our own ordinary classical ignorance about what’s going on.

He got this reply:
I’m afraid you’re getting things backwards; interference is an artefact of the substance of matter being made of waves instead of particles. A wave goes through two slits at once and interferes with itself – that’s not at all interesting beyond the question of why matter is made of waves instead of particles, which is more along the lines of the nuts-bolts question. You think it matters because you assume the particle view and particle interactions are ontically prior – but that phenomena is well explained by decoherence.

Aaronson responds:
A billion times no! To describe quantum interference as merely about “matter being made of waves and not particles” is one of the worst rhetorical misdirections in the subject’s history (and there’s stiff competition!). It suggests some tame little ripples moving around in 3-dimensional space, going through the two slits and interfering, etc. But that’s not how it is at all.

If we have a thousand particles in an entangled state, suddenly we need at least 2¹⁰⁰⁰ parameters to describe the “ripples” that they form. How so? Because these aren’t ripples of matter at all; they’re ripples of probability amplitude. And they don’t live in 3-dimensional space; they live in Hilbert space.

In other words, what quantum interference changes is not merely the nature of matter, but much more broadly, the rules of probability. That change to the rules of probability is quantum mechanics. The changes to the nature of matter are all special byproducts—alongside the changes to the nature of light, communication, computation, and more.

The great contribution of quantum computation to the quantum foundations debate is simply that, at long last, it forced everyone to face this enormity, to stop acting like they could ignore it by focusing on special examples like a single particle in a potential and pretending the rest of QM didn’t exist. Indeed, by now QC’s success in this has been so complete that it’s disconcerting to encounter someone who still talks about QM in the old way, the way with a foot still in classical intuitions … so that a change to the whole probability calculus underlying everything that could possibly happen gets rounded down to “matter acting like a wave.”

He is saying that the quantum supremacy of quantum computers has proved that quantum systems have a complexity that is far beyond what you could expect by saying that QM is a wave theory of matter.
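
The 2¹⁰⁰⁰ in Aaronson's comment is just the dimension of the joint state space; a quick sketch of how fast it grows (plain arithmetic, nothing assumed beyond two states per particle):

```python
# A general entangled state of n two-level systems needs 2**n complex amplitudes.
for n in (1, 10, 50, 300, 1000):
    print(f"{n:5d} particles -> 2^{n} = {float(2 ** n):.3g} amplitudes")
# 1000 particles already need about 1e301 amplitudes -- vastly more numbers
# than could ever be stored, which is the "enormity" being described.
```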

I have come to the conclusion that superposition and double-slit experiments are not so mysterious, once you accept that light and matter are waves. Then superposition is just what you expect. Aaronson says that there is a hidden complexity that allows factoring of numbers with hundreds of digits.

I might think differently if we had a convincing demonstration of quantum supremacy. We do not. We are decades away from factoring those numbers. It may never happen. Aaronson himself has retracted his quantum supremacy claims.

So there is a question. In QM, is matter just a wave, or is it a supercomputer in disguise? I will believe it is a supercomputer when it does a super-computation.