Friday, September 30, 2016

Colonizing the universe

Retired mathematical physicist Freeman Dyson writes:
When humans begin populating the universe with Noah’s Ark seeds, our destiny changes. We are no longer an ordinary group of short-lived individuals struggling to preserve life on a single planet. We are then the midwives who bring life to birth on millions of worlds. We are stewards of life on a grander scale, and our destiny is to be creators of a living universe. We may or may not be sharing this destiny with other midwife species in other parts of the universe. The universe is big enough to find room for all of us.
Some ppl are ready to go:
Elon Musk is preparing to reveal further details of his hugely ambitious plan to build a city on the surface of Mars.

Tomorrow, the billionaire’s SpaceX is holding an event called “Making Humans An Interplanetary Species” which will shed light on the Red Planet exploration scheme.

Ahead of the event, Musk shared images of a new rocket booster called the Raptor, which will power an “Interplanetary Transport System”.

His firm has carried out a test-firing of the device, which is designed to propel a spaceship all the way to the Red Planet.
But even Dyson does not believe in quantum gravity.

Cosmologist Lawrence M. Krauss wrote last year:
How can we tell if gravity is quantum?

Freeman Dyson, who is a brilliant physicist and a contrarian, he had pointed out based on some research — he’s 90 years old, but he had done some research over the years — I was in a meeting in Singapore with him when he pointed out that we really don’t know if gravity is a quantum theory. Electromagnetism is a quantum theory because we know there are quanta of electromagnetism called photons. Right now they’re coming, shining in my face and they’re going into the camera that’s being used to record this and we can measure photons. There are quanta associated with all of the forces of nature. If gravity is a quantum theory, then there must be quanta that are exchanged, that convey the gravitational force; we call those gravitons. They’re the quantum version of gravitational waves, the same way photons are the quantum version of electromagnetic waves. But what Freeman pointed out is that there’s no terrestrial experiment that could ever measure a single graviton. He could show that in order to build an experiment that would do that, you’d have to make the experiment so massive that it would actually collapse to form a black hole before you could make the measurement. So he said there’s no way we’re ever going to measure gravitons; there’s no way that we’ll know whether gravity is a quantum theory.

What I realized, and Frank and I codified in our paper, is that actually the universe acts like a graviton detector, in [the] sense that processes in the early universe produced phenomena that could be observed today as gravitational waves. But those events, those processes, will only work if gravity is a quantum theory. If gravity isn’t a quantum theory, we won’t see these gravitational waves from the very early universe, which BICEP thought they saw. Now BICEP may not have seen gravitational waves from the early universe, but the fact that we recognize that if this phenomena called inflation happens in the very early universe, and if it produces gravitational waves, that will tell us that gravity is a quantum theory; therefore, all of the problems of quantum gravity will need to be addressed by theorists, giving job security for generations.
Dyson is right here, and all the talk about quantum gravity is unscientific speculation.

Wednesday, September 28, 2016

Junk science in TED Talk

Jerry Coyne reports on some populist bogus social science:
The second most popular TED talk of all time, with over 32 million views on TED, is by Harvard Business School associate professor Amy Cuddy, called “Your body language shapes who you are”. (You can also see the talk on YouTube, where it has over 10 million views. Cuddy appears to be on “leave of absence.”)  Her point, based on research she did with two others, was that by changing your body language you can modify your hormones, thus not only influencing other people in the way you want, but changing your own physiology in a way you want.
The kindest thing you can say is that her results were not replicated. All the other studies gave opposite conclusions.

Statistician Andrew Gelman also laments the sorry state of social science. It is overrun by poor studies and p-hacking.

Monday, September 26, 2016

No-Cloning is not fundamental

Complexity theorist Scott Aaronson makes a big deal out of the No-cloning theorem:
The subject of this talk is a deep theorem that stands as one of the crowning achievements of our field. I refer, of course, to the No-Cloning Theorem. Almost everything we’re talking about at this conference, from QKD onwards, is based in some way on quantum states being unclonable. ...

OK, but No-Cloning feels really fundamental. One of my early memories is when I was 5 years old or so, and utterly transfixed by my dad’s home fax machine, ...

The No-Cloning Theorem represents nothing less than a partial return to the view of the world that I had before I was five. It says that quantum information doesn’t want to be free: it wants to be private. There is, it turns out, a kind of information that’s tied to a particular place, or set of places. It can be moved around, or even teleported, but it can’t be copied the way a fax machine copies bits.

So I think it’s worth at least entertaining the possibility that we don’t have No-Cloning because of quantum mechanics; we have quantum mechanics because of No-Cloning — or because quantum mechanics is the simplest, most elegant theory that has unclonability as a core principle.
No-Cloning sounds fundamental, but it is not.

The theorem says that there is no physical process that always perfectly duplicates a given physical entity. You can have a laser beam that emits identical photons in identical states, but you cannot have some trick optical device that takes in any photon and copies it into an additional photon in the same state.

The proof assumes that any such process would have to perfectly represent the entity (e.g., a photon) by a wave function, apply a unitary transformation to duplicate it, and perfectly realize the result physically.
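
For reference, here is a minimal sketch of that standard argument, assuming the copier is a single unitary map U acting on the input together with a blank target state |e>:

```latex
% Suppose one unitary U clones two different input states:
U\bigl(|\psi\rangle \otimes |e\rangle\bigr) = |\psi\rangle \otimes |\psi\rangle,
\qquad
U\bigl(|\phi\rangle \otimes |e\rangle\bigr) = |\phi\rangle \otimes |\phi\rangle.
% Unitary maps preserve inner products, so comparing the two lines gives
\langle\psi|\phi\rangle = \langle\psi|\phi\rangle^{2},
% which forces the overlap to be 0 or 1.
```

So one device could only clone states that are either identical or perfectly distinguishable; no unitary process copies an arbitrary unknown state.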

So you have to believe that a physical entity is exactly the same as a mathematical wave function, and a physical process is exactly the same as a unitary transformation.

But isn't that the essence of quantum mechanics? No, it is not.

Quantum mechanics is a system for using wave functions to predict observables. But there is no good reason to believe that there is a one-to-one correspondence between physical entities and wave functions.

The no-cloning theorem is a big headache for quantum computing because it means that qubits can never be copied. Ordinary computers spend most of their time copying bits.

But scalable qubits may not even be possible.

The no-cloning theorem is not something with any direct empirical evidence. It is true that we have no known way of duplicating an atomic state, but that is a little misleading. We have no way of doing any observations at all on an atomic state without altering that state.

Friday, September 23, 2016

China claims to invent quantum radar

ExtremeTech reports:
The Chinese military says it has invented quantum radar, a breakthrough which, if true, would render the hundreds of billions of dollars the United States has invested into stealth technology obsolete. Like the original invention of radar, the advent of modern artillery, or radio communications, quantum radar could fundamentally transform the scope and nature of war. ...

Quantum radar would exploit quantum entanglement, the phenomena that occurs when two or more particles are linked, even when separated by a significant amount of physical space. In theory, a radar installation could fire one group of particles towards a target while studying the second group of entangled particles to determine what happened to the first group. The potential advantages of this approach would be enormous, since it would allow for extremely low-energy detection of approaching enemy craft. Unlike conventional radar, which relies on an ability to analyze and detect a sufficiently strong signal return, quantum radar would let us directly observe what happened to a specific group of photons. Since we haven’t invented cloaking devices just yet, this would seem to obviate a great deal of investment in various stealth technologies.

China is claiming to have developed a single-photon radar detection system that can operate at a range of 100km, more than 5x that of a lab-based system developed last year by researchers from the UK, US, and Germany. Research into quantum radar has been ongoing for at least a decade, but there are significant hurdles to be solved.
Let me get this straight. The quantum radar entangles two photons, and fires one at the stealth plane. Then there is no need to look for any return signal because you can figure out what happened by examining the entangled photon that never left.

This is impossible, of course. Either someone has some massive misunderstandings of quantum mechanics, or someone is perpetrating a scam. Or maybe China thinks that Americans are dumb enuf to believe a story like this.
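
A minimal sketch of why, using the textbook no-signaling argument (the symbols are generic, not from the article): whatever physical operation Φ happens to the photon that was fired at the target, the photon that stayed behind is described by the same reduced density matrix.

```latex
\rho_B' = \mathrm{Tr}_A\!\bigl[(\Phi_A \otimes \mathrm{id}_B)\,\rho_{AB}\bigr]
        = \mathrm{Tr}_A\!\bigl[\rho_{AB}\bigr] = \rho_B
% because any physical operation on A is trace-preserving.
```

So measurements on the retained photon alone have the same statistics whether the other photon hit a plane, was absorbed, or flew off into space; information about the target can only come back with a returning signal.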

Monday, September 19, 2016

Nobel site on relativity

The Nobel Prize site has a page on the History of Special Relativity:
1898 Jules Henri Poincaré said that "... we have no direct intuition about the equality of two time intervals."

1904 Poincaré came very close to special relativity: "... as demanded by the relativity principle the observer cannot know whether he is at rest or in absolute motion."

1905 On June 5, Poincaré finished an article in which he stated that there seems to be a general law of Nature, that it is impossible to demonstrate absolute motion. On June 30, Einstein finished his famous article On the Electrodynamics of Moving Bodies, where he formulated the two postulates of special relativity.
Here is that 1905 Poincare paper. It is really just an abstract of a longer paper. Poincare states that general law of nature in the first paragraph.

The most important points in that Poincare abstract are that the Lorentz transformations form a group, and that all forces, including electromagnetism and gravity, are affected the same way. Poincare is quite emphatic about both of these points.

Einstein had access to Poincare's abstract before writing his 1905 relativity paper, but he does not quite get these two points, altho he is often credited with them. Here is what they say.

Poincare: "The sum of all these transformations, together with the set of all rotations of space, must form a group; but for this to occur, we need l = 1; so one is forced to suppose l = 1 and this is a consequence that Lorentz has obtained by another way."

Einstein: "we see that such parallel transformations — necessarily — form a group."
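
For concreteness, the group property they are both describing can be illustrated with boosts along a single axis (a standard textbook calculation, not taken from either paper): composing two Lorentz transformations gives another Lorentz transformation.

```latex
x' = \gamma\,(x - v t), \qquad
t' = \gamma\!\left(t - \frac{v x}{c^{2}}\right), \qquad
\gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}
% Following a boost of velocity v_1 with a boost of velocity v_2
% yields a single boost with the combined velocity:
v_{12} = \frac{v_1 + v_2}{1 + v_1 v_2 / c^{2}}
```

Boosts along different axes close into a group only once spatial rotations are included, which is why Poincare speaks of "the set of all rotations of space."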

Poincare: "Lorentz, in the work quoted, found it necessary to complete his hypothesis by assuming that all forces, whatever their origin, are affected by translation in the same way as electromagnetic forces and, consequently, the effect produced on their components by the Lorentz transformation is still defined by equations (4). It was important to examine this hypothesis more closely and in particular to examine what changes it would require us to make on the law of gravitation."

Einstein: "They suggest rather that, as has already been shown to the first order of small quantities, the same laws of electrodynamics and optics will be valid for all frames of reference for which the equations of mechanics hold good."

As you can see, Poincare had a deeper understanding of relativity, and he published it first.

Saturday, September 10, 2016

Randomness does not need new technology

SciAm reports:
Photonic Chip Could Strengthen Smartphone Encryption

The chip uses pulses of laser light to generate truly random numbers, the basis of encryption. Christopher Intagliata reports.

Random numbers are hugely important for modern computing. They're used to encrypt credit card numbers and emails. To inject randomness into online gaming. And to simulate super complex phenomena, like protein folding or nuclear fission.

But here's the dirty secret: a lot of these so-called random numbers are not truly random. They're actually what’s known as "pseudo random numbers," generated by algorithms. Think of generating random numbers by rolling dice. If you know the number of dice, it’s simple to figure out something about the realm of possible random numbers—thus putting probabilistic limits on the randomness.

But truly random numbers can be generated through quantum mechanical processes. So researchers built a photonic chip — a computer chip that uses photons instead of electrons. The chip has two lasers: one shoots continuously; the other pulses at regular intervals. Each time the two lasers meet, the interference between the light beams is random, thanks to the rules of quantum mechanics. The chip then digitizes that random signal, and voila: a quantum random number generator.
This distinction between truly random and pseudo random numbers is entirely fallacious.

They act as if quantum hanky panky magically makes something more random than other random things.

See the Wikipedia article on Hardware random number generators. There are lots of them on the market already. They would already be cheap enuf for smart phones, if there were any commercial need.

I actually think that it would be nice to have such a random number function in PCs and phones. But the fact is that there are suitable workarounds already. There is no need for a fancy 2-laser device like in this research.
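
A minimal sketch of the distinction in practice, using only the Python standard library (on ordinary PCs and phones the operating system's entropy pool already mixes in hardware noise, which is the kind of workaround meant here):

```python
import random
import secrets

# Pseudo-random numbers: an algorithm (the Mersenne Twister) run from
# a seed.  The same seed reproduces exactly the same "random" sequence.
rng = random.Random(42)
first_run = [rng.randint(0, 9) for _ in range(10)]
rng = random.Random(42)
second_run = [rng.randint(0, 9) for _ in range(10)]
print(first_run == second_run)  # True: fully determined by the seed

# OS-provided randomness: the secrets module reads from the operating
# system's entropy pool (os.urandom), which already draws on hardware
# noise and is what cryptographic software uses for keys and tokens.
print(secrets.token_hex(16))
```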

Massimo Pigliucci is a philosophy professor who specializes in calling things pseudoscience, and argues strenuously that it is a useful term:
Pseudoscience refers to “any body of knowledge that purports to be scientific or to be supported by science but which fails to comply with the scientific method,” though since there is no such thing as the scientific method, I would rather modify the above to read “with currently accepted scientific standards.” ...

Burke then goes on to cite a 2011 dissertation by Paul Lawrie that argues that “dismissing… scientific racism as ‘pseudo-science,’ or a perversion of the scientific method, blurs our understanding of its role as a tool of racial labor control in modern America.”

Maybe it does, maybe it doesn’t. But scientific racism is a pseudoscience, and it is widely recognized — again by the relevant epistemic community — as such.
It seems to me that the term pseudoscience is mainly used for political purposes and unscientific name-calling, such as for denouncing racism.

But if anything is pseudoscience, then why not the creation of "truly random numbers"? Why is it acceptable to distinguish truly random numbers from not-truly random numbers, but only pseudoscience to distinguish Caucasians from Orientals?

Current philosophers like Pigliucci should be called pseudo-philosophers. They sound as if they are doing philosophy, but they fail to follow minimal standards.

Thursday, September 8, 2016

Coyne is not a freely acting agent

Leftist-atheist-evolutionist Jerry Coyne regularly attacks religion and believers for being irrational. He has written books on the subject.

Here he quibbles with physicist Sean M. Carroll over free will:
Lack of information simply means we cannot perfectly predict how we or anybody else will do, but it doesn’t say that what we’ll do isn’t determined in advance by physical laws. Unpredictability does not undermine determinism. And therefore, if you realize that, I don’t know how you can assert that “it’s completely conceivable that we could have acted differently.” ...

What I would have added to this is that nobody who downloads child pornography is in control of his actions, at least in the sense that they could have avoided downloading the pornography. I’m sure Sean would agree. Whether you have a brain tumor, some other cause of hypersexuality, were abused yourself as a child, were mentally ill in a way with no clear physical diagnosis, or simply have been resistant to social pressures to avoid that kind of stuff — all of this is determined by your genes and your environment. ...

But what does predictability have to do with this? We already know that people are not freely acting agents in the sense that they are free from deterministic control by their brains. We already know that people are predestined. ...

I’ve already given my solution to this issue. We recognize that, at bottom, nobody could have done otherwise. If they are accused of something that society deems to be a crime, you find out if they really did commit that crime. If they’re found guilty, then a group of experts — scientists, psychologists, sociologists criminologists, etc. — determine what the “punishment” should be based on the person’s history (a brain tumor would mandate an operation, for instance), malleability to persuasion, likelihood of recidivism, danger to society, and deterrent effects. None of that needs the assumption that someone is a “freely acting agent.”

I know many of you will disagree on that, or on the ramifications of determinism for our punishment and reward system. But at least I have the satisfaction of knowing that Sean agrees with me on this: if science shows the way our behaviors are determined, that knowledge should affect the way we punish and reward people.
So why is he saying what ppl should do if no one has any choice about it?

How can he distinguish between predictability and determinism, if no science experiment can make that distinction? What sense does it make to say something is determined, if there is no conceivable empirical way of making that prediction?

Coyne is pretty good on some evolution-related science that is closer to his expertise, but on subjects like this, I do not see how he is any more scientific than the religious believers he attacks.

Our most fundamental laws of physics are not deterministic. And even if they were, they would not make the world predictable enough to foresee ppl's free choices.

Many theists believe in a God that is all-knowing. Most Christians believe that God gives us free will to make our own choices. Coyne mocks all of this, but he believes in some unobservable force or system that determines everything we do.

I am beginning to think that if a man says that he is driven by voices in his head, I should just believe him. Ditto for demons in his head, or any other equivalent. That is, I choose to believe him.

For more from Carroll, listen to this recent lecture:
Sean Carroll discusses whether space and gravity could emerge from quantum entanglement -- and rewrites physics in terms of quantum circuits.

Tuesday, September 6, 2016

Non-empirical confirmation stinks

Physicist Carlos Rovelli writes:
String theory is a proof of the dangers of relying excessively on non-empirical arguments. It raised great expectations thirty years ago, when it promised to compute all the parameters of the Standard Model from first principles, to derive from first principles its symmetry group SU(3) x SU(2) x U(1) and the existence of its three families of elementary particles, to predict the sign and the value of the cosmological constant, to predict novel observable physics, to understand the ultimate fate of black holes and to offer a unique well-founded unified theory of everything. Nothing of this has come true. String theorists, instead, have predicted a negative cosmological constant, deviations from Newtons 1/r2 law at sub millimeters scale, black holes at CERN, low-energy supersymmetric particles, and more. All this was false.

From a Popperian point of view, these failures do not falsify the theory, because the theory is so flexible that it can be adjusted to escape failed predictions. But from a Bayesian point of view, each of these failures decreases the credibility in the theory, because a positive result would have increased it. The recent failure of the prediction of supersymmetric particles at LHC is the most fragrant [sic] example. By Bayesian standards, it lowers the degree of belief in string theory dramatically. This is an empirical argument. Still, Joe Polchinski, prominent string theorist, writes in [7] that he evaluates the probability of string to be correct at 98.5% (!).
I think he means the failure of SUSY is the most flagrant example. English is not his native language. Or maybe he means that string theory stinks!
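
For what it is worth, the Bayesian point in the quoted passage is just the usual update rule (a generic formula, nothing specific to Rovelli's paper): if a theory T made an experimental outcome E more likely than the alternatives did, then failing to see E lowers the probability of T.

```latex
P(T \mid \neg E)
= \frac{P(\neg E \mid T)\,P(T)}{P(\neg E)}
= \frac{\bigl(1 - P(E \mid T)\bigr)\,P(T)}{1 - P(E)}
< P(T)
\quad \text{whenever } P(E \mid T) > P(E)
```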

At least string theory had some hope of doing something worthwhile, 20 years ago. There are other areas of theoretical physics that have no such hope.

Monday, September 5, 2016

The essence of quantum mechanics

What is the essence of quantum mechanics?

Expositors of quantum mechanics always argue that the theory has something mysterious that is not present in previous theories. For example, Seth Lloyd called it quantum hanky-panky. But what is it?

Usually it is called entanglement. And that is compared to Schroedinger's cat being alive and dead at the same time.

However there is no proof that a cat can be alive and dead at the same time. What we do have is situations where we do not know whether the cat is alive or dead, and we don't need quantum mechanics for that.

No, the essence is that states cannot be equated with observables. And that is because observables do not commute.

If position and momentum commuted, then they could be measured simultaneously, and maybe you could talk about an electron state as having a particular position and momentum. But they do not commute, so position and momentum depend on how they are measured. Any mathematical description of the (pre-measurement) state of an electron must allow for different measurement outcomes.
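
In the usual notation, that non-commutativity is the canonical commutation relation, and the uncertainty relation follows from it:

```latex
[\hat{x}, \hat{p}] = \hat{x}\hat{p} - \hat{p}\hat{x} = i\hbar,
\qquad
\Delta x \,\Delta p \ge \frac{\hbar}{2}
```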

I agree with Lubos Motl's rant:
The anti-quantum zealots are inventing lots of "additional" differences between classical and quantum physics – such as the purported "extra nonlocality" inherent in quantum mechanics, or some completely new "entanglement" that is something totally different than anything we know in classical physics, or other things. Except that all these purported additional differences are non-existent. ...

Now, the "quantum mechanics is non-local" crackpots love to say not only crazy things about the extra non-locality of quantum mechanics – which I have thoroughly debunked in many previous specialized blog posts – but they also love to present the quantum entanglement as some absolutely new voodoo, a supernatural phenomenon that has absolutely no counterpart in classical physics and whose divine content has nothing to do with the uncertainty principle or the nonzero commutators.

Except that all this voodoo is just fog. ...

The idea that the correlation between the two electrons is "something fundamentally different" from the correlation between the two socks' colors – an idea pioneered by John Bell – is totally and absolutely wrong. ...

Aside from the consequences of the nonzero commutators – i.e. of the need to use off-diagonal matrices for the most general observables that may really be measured – there is simply nothing "qualitatively new" in quantum mechanics. Whoever fails to understand these points misunderstands the character of quantum mechanics and the relationship between quantum mechanics and classical physics, too.
Where I differ from LuMo is in the likelihood of quantum computers.

The premise of the quantum computer is that nonlocal quantum voodoo can be applied to do super-Turing computations for us. Somehow an electron has to be in two places at the same time, doing two different computations.

If you believe that the cat can be alive and dead at the same time, then it is not much of a stretch to figure that a qubit can be in multiple states doing different computations, possibly in parallel universes, and assembling those results into a final observable.

But what if the quantum voodoo is just an algebraic consequence of the fact that atomic-scale measurements depend on the order in which they are made?
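
Here is a minimal numerical sketch of that order dependence for a single qubit, using the standard projection rule (the code and names are mine, for illustration only): starting in the state |0>, a Z measurement gives +1 with probability 1 if it is done first, but only 1/2 if an X measurement is done before it.

```python
import numpy as np

# Eigenprojectors of the Pauli Z and X observables for one qubit.
Z_plus = np.array([[1, 0], [0, 0]], dtype=complex)   # projector onto |0>
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
minus = np.array([1, -1], dtype=complex) / np.sqrt(2)
X_plus = np.outer(plus, plus.conj())                 # projector onto |+>
X_minus = np.outer(minus, minus.conj())              # projector onto |->

psi = np.array([1, 0], dtype=complex)                # initial state |0>

def prob_of_sequence(projectors, state):
    """Probability of a sequence of outcomes, collapsing the state after each."""
    p = 1.0
    for P in projectors:
        new_state = P @ state
        step = float(np.vdot(new_state, new_state).real)
        if step == 0.0:
            return 0.0
        p *= step
        state = new_state / np.sqrt(step)
    return p

p_z_first = prob_of_sequence([Z_plus], psi)                     # 1.0
p_z_after_x = sum(prob_of_sequence([P, Z_plus], psi)
                  for P in (X_plus, X_minus))                   # 0.5
print(p_z_first, p_z_after_x)
```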

You can put a pair of qubits in an entangled state where measuring one and then the other gives a different result from reversing the order. That is a consequence of observables not commuting. I accept that. What I do not accept is that those qubits are doing different computations simultaneously.

Thanks to a reader for supplying the link. The UK Daily Mail reports:
PCs? Google may be on the verge of a breakthrough next year, claim experts

* Quantum machines could open up new possibilities for computing
* Google is aiming to achieve 'quantum supremacy' with a small machine
* Experts say firm could announce a breakthrough in the area next year
* They claim a computer with just 50 quantum bits could outperform the most powerful classical computers in existence

Scientists and engineers around the world are quietly working on the next generation of machines which could usher in a new age of computing.

Even the fastest supercomputers today are still bound by the system of 1's and 0's which enabled the very first machines to make calculations.

But experts believe that drawing on the strange properties of the quantum world can enable computers to break free from these binary shackles, creating the most powerful problem-solving machines on the planet.

Now, according to New Scientist, researchers at Google may announce a breakthrough as soon as next year, potentially reaching what the company terms 'quantum supremacy' before anyone expected.
If Google achieves quantum supremacy next year, I will have to admit that I am wrong. I may then have to defer to Motl and Aaronson. But I doubt it.

Thursday, September 1, 2016

Quantum Hanky-Panky

MIT professor Seth Lloyd gives this argument for quantum computers:
My motto about this is that if a little quantum hanky-panky will allow you to reproduce a little faster, then by God, you're going to use quantum hanky-panky. It turns out that plants and bacteria and algae have been using quantum hanky-panky for a billion years to make their lives better.

Indeed, what's happening in quantum information and quantum computing is it's become more and more clear that quantum information is a universal language for how nature behaves. ...

There is an old idea, originally due to Richard Feynman, that you could use quantum computers to simulate other quantum systems — a quantum analog computer. Twenty years ago, I wrote the first algorithms for how you could program the quantum computers we have now to explore how their quantum systems behave. ...

Thinking about the future of quantum computing, I have no idea if we're going to have a quantum computer in every smart phone, or if we're going to have quantum apps or quapps, that would allow us to communicate securely and find funky stuff using our quantum computers; that's a tall order. It's very likely that we're going to have quantum microprocessors in our computers and smart phones that are performing specific tasks.

This is simply for the reason that this is where the actual technology inside our devices is heading anyway. If there are advantages to be had from quantum mechanics, then we'll take advantage of them, just in the same way that energy is moving around in a quantum mechanical kind of way in photosynthesis. If there are advantages to be had from some quantum hanky-panky, then quantum hanky-panky it is.
The fallacy here is that our personal computers are already using some quantum hanky-panky. You need quantum mechanics to understand CPU chips, memory chips, disc drives, and most of the other electronics. They use quantum hanky-panky as much as plants do in photosynthesis.

Yes, Feynman was right that you could use a quantum machine to simulate a quantum world, for the simple reason that the world simulates itself, whereas simulating it on a digital computer is computationally inefficient.

The interesting question is whether a quantum computer can do a super-Turing digital computation. I doubt it.

Lloyd recites how many brilliant physicists have worked on this question for 30 years, and now 1000s are working on it. He compares progress to the early days of digital computers.

I don't buy it. Within about 5 years of building the earliest digital computers, they were simulating the weather and nuclear bomb explosions. The quantum computer folks are still working on connecting one qubit with another. The fact that so many smart ppl are working on it while the results remain so meager only shows that it may not be possible.

All of the smart high-energy physicists were believers in SUSY (supersymmetry) for 30 years as well, and some of them are only now paying off bets after the LHC failed to find any SUSY particles. So yes, when all the experts say that something is theoretically possible, they may all be wrong.

Lloyd explains:
Quantum computers work by storing and processing information at the most microscopic level. For instance, if you have an electron, you could call an electron spinning out like this zero, you could call an electron spinning like that one, and you could call an electron spinning like that zero and one at the same time, which is the primary feature of quantum computation. A quantum bit—qubit—can be zero and one at the same time; this is where quantum computers gain their power over classical computers.
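
In the standard notation, the state Lloyd is describing is a superposition, with two amplitudes rather than a definite bit value:

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^{2} + |\beta|^{2} = 1
% A measurement yields 0 with probability |alpha|^2 and 1 with
% probability |beta|^2; only one outcome is ever observed.
```
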
This is like saying Schroedinger's cat is alive and dead at the same time. If we could do that, maybe we could have a horse plow a field and pull a carriage at the same time, and thus do twice as much work.

That is impossible, because it violates energy principles. But Lloyd wants the qubit to do two computations at the same time, and thus twice as much computational work on each low-level operation. String those together, and you get exponentially as much work.

That would violate computational principles, but the quantum computer folks do not believe them.

In fairness, Scott Aaronson would say that I am taking advantage of Lloyd's oversimplified explanation, and that the qubit does not really do two things at the same time. It just does some quantum hanky panky with negative probabilities that is too subtle for fools like me to understand, and that is enuf for a big computational gain.

Lloyd continues:
The quantum computers are now getting a lot bigger. Now, we have tens of bits, soon we'll have 50 bits, then we'll have 500 bits, because now there's a clear path to how you build a larger scale quantum computer.
If they had 50 qubits, they would have enuf for a super-Turing computation. The first physicist to demonstrate that would be a likely candidate for a Nobel prize. Based on the current hype, it should happen in the next year or so.
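
The 50-qubit figure behind these supremacy claims comes from a simple count (my arithmetic, not from the article): a brute-force classical simulation has to store one complex amplitude per basis state.

```latex
2^{50} \approx 1.1 \times 10^{15} \ \text{amplitudes},
\qquad
2^{50} \times 16 \ \text{bytes} \approx 1.8 \times 10^{16} \ \text{bytes} \approx 18 \ \text{petabytes}
```

That is beyond the memory of any existing classical computer, which is why direct state-vector simulation breaks down at roughly that size.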

So if it happens, you will hear about it. Then you can leave a comment here telling me that I am wrong. Otherwise, you should start to wonder if maybe Lloyd and the 1000s of others might be wrong.