Wednesday, June 3, 2020

How we got theories being called facts

A new paper argues:
The concept of fact has a history. Over the past centuries, physicists have appropriated it in various ways. In this article, we compare Ernst Mach and Albert Einstein's interpretations of the concept. Mach, like most nineteenth-century physicists, contrasted fact and theory. He understood facts as real and complex combinations of natural events. Theories, in turn, only served to order and communicate facts efficiently. Einstein's concept of fact was incompatible with Mach's, since Einstein believed facts could be theoretical too, just as he ascribed mathematical theorizing a leading role in representing reality. For example, he used the concept of fact to refer to a generally valid result of experience. The differences we disclose between Mach and Einstein were symbolic for broader tensions in the German physics discipline. Furthermore, they underline the historically fluid character of the category of the fact, both within physics and beyond.
This is amusing. I am not sure Mach and Einstein were really the trendsetters, but science popularizers today frequently use the term "fact" in a way that was not previously accepted.
Such a comparison assists the study of the modern notion of a "scientific fact," and how and why it should be distinguished from an "alternative fact." This is because Mach and Einstein's concepts of fact were constitutional for later and current notions, also outside of the physics discipline. Mach's fact-oriented empiricism was a primary source of inspiration for logical positivism and conventionalism, which in turn became hugely influential in shaping twentieth-century philosophical debates about realism, the relation between theory and experiment, and the role and status of scientific facts.15 Einstein's physics and philosophy, in particular his theory of relativity and his critique of quantum mechanics, also became an essential point of reference in such debates. What is more, Einstein actively contributed to epistemological discussions himself.16

Half a century ago, Gerald Holton touched upon the main issue addressed by this paper. In 1968, Holton claimed that there was a "divergence between the conception of 'fact' as understood by Einstein and 'fact' as understood by a true Machist."17 According to Holton, this divergence related to the status of laws, concepts, and principles, which Mach, unlike Einstein, systematically distinguished from facts.
This paper has good historical info on the shift in thinking.

Monday, June 1, 2020

Quantum mechanics is entirely local

Dr. Bee explains:
So, oddly enough, quantum mechanics is entirely local in the common meaning of the word. When physicists say that it is non-local, they mean that particles which have a common origin but then were separated can be stronger correlated than particles without quantum properties could ever be. I know this sounds somewhat lame, but that’s what quantum non-locality really means.
This is correct. A particle has quantum properties if it obeys the Heisenberg uncertainty principle. A particle without quantum properties obeys a classical theory of local hidden variables.

The theory is local. The only claim to non-locality is just a statement about correlations.

Thursday, May 28, 2020

Infinite information density is impossible

New paper:
An argument for an indeterministic interpretation of classical physics (i.e., Newton's mechanics and Maxwell's electrodynamics) was put forth by Gisin and Del Santo in [1] (see also [2], [3], [4] and [6]). They maintained that although classical physics has traditionally been construed as deterministic (i.e., the physical laws determine a unique definite future (and past) state of a physical system once its current state is fixed, as famously revealed in the scenario of "Laplace's Demon"), it is not necessarily the case. There are metaphysical assumptions behind the traditional deterministic interpretation, and it is possible to give an alternative indeterministic interpretation by revising those assumptions, they contended. In particular, the usual practice that real numbers are used to represent physical quantities was held to be problematic, because this would lead to the unacceptable consequence of "infinite information density" (as related to the infinite string of digits following the decimal point of a real number) in the relevant physical space, according to them.
The paper does not agree with these conclusions, but I do.

I have argued here that classical mechanics is not really deterministic. Calculations always have error bars, just like any other part of science. Laplace's Demon is just a big straw man, like Schroedinger's cat.

Almost all real numbers have infinite information content, and such things are not observable.
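
As a toy illustration of why those infinitely many digits matter (my own sketch, not from the paper), consider a chaotic system: digits far past the decimal point of the initial condition quickly become physically relevant, so any finite-precision description leaves the future undetermined.

```python
# Logistic map at r = 4, a standard toy chaotic system.
# Two initial conditions differing only in the 12th decimal place
# diverge completely after a few dozen iterations.
def logistic(x, steps, r=4.0):
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

x0 = 0.123456789012
x1 = 0.123456789013   # differs from x0 by 1e-12
print(logistic(x0, 60))
print(logistic(x1, 60))   # no longer anywhere near the first value
```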

Monday, May 25, 2020

Two quantum paradoxes about entanglement

This post is an explanation of a couple of points about entanglement that others get wrong. Part of the confusion is to mix two distinct paradoxes, so I separate them. (Remember a paradox is an apparent contradiction or confusing issue. Valid theories can have paradoxes.)

Uncertainty principle. A core tenet of quantum mechanics is that you cannot measure a particle's position and momentum at the same time. This is because particles are not really particles, and have wave-like properties that prevent having definite values for position and momentum.

Quantum mechanics enforces this uncertainty by using non-commuting observables. Measuring position then momentum is different from momentum then position. Other pairs of observables have this same property, such as Spin-X and Spin-Y.
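
For a concrete illustration (my own sketch, not part of the textbook material being summarized), the Spin-X and Spin-Y observables are the Pauli matrices, and a few lines of numerical code confirm that they do not commute:

```python
import numpy as np

# Pauli matrices: the observables for Spin-X and Spin-Y.
sigma_x = np.array([[0, 1], [1, 0]], dtype=complex)
sigma_y = np.array([[0, -1j], [1j, 0]], dtype=complex)

# The commutator is nonzero, so measuring X then Y differs from Y then X,
# and the two spin components cannot have sharp values at the same time.
commutator = sigma_x @ sigma_y - sigma_y @ sigma_x
print(commutator)   # equals 2i times sigma_z, not the zero matrix
```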

This is an essential part of quantum mechanics, and was well-understood and non-controversial by about 1927.

Quantum twin paradox. If a system emits two equal and opposite particles, then properties of one can be deduced by measuring the other. For example, since momentum is conserved, the momentum of one will be opposite the other.

If the two particles are far apart, then knowledge about one seemingly has a spooky effect on our knowledge about the other. This paradox occurs in either classical or quantum mechanics. It doesn't really violate the principle that there can be no action at a distance.
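
Here is a minimal sketch (my own toy simulation) of how the deduction works with conserved momentum; nothing travels between the two particles, and the correlation comes entirely from the common emission event:

```python
import random

def emit_pair():
    # The source emits two particles with zero total momentum,
    # so the two momenta are equal and opposite.
    p = random.uniform(-1.0, 1.0)
    return p, -p

p_a, p_b = emit_pair()
# Measuring particle A immediately tells us the momentum of the distant
# particle B, without any influence passing between them.
print(p_a, p_b, p_a + p_b)
```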

Combining these two paradoxes gives the EPR paradox. The idea is that you can measure the position of particle A and deduce the position of particle B, or you can measure the momentum of particle A and deduce the momentum of particle B, but you cannot measure the position and momentum at the same time.

Einstein argued in the 1935 EPR paper that this makes the theory of quantum mechanics incomplete. That is, you can deduce a particle's position and momentum by measuring its twin, but you cannot measure both at the same time. A complete theory would tell you both at the same time.

Bohm and Bell explain EPR with Spin-X and Spin-Y. You could use any noncommuting variables, as they all satisfy the uncertainty principle. Bohm proposed a nonlocal theory where a particle has a well-defined position and momentum all the time, but those variables might have nothing to do with what is observed. Bell proposed a classical theory of local hidden variables, but those theories have been refuted by experiments.
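
To make the refutation concrete (my own sketch, not taken from Bohm or Bell), the CHSH form of Bell's theorem can be checked numerically: the quantum prediction for the spin-singlet correlation at analyzer angles a and b is E(a, b) = -cos(a - b), and the standard angle choices give |S| of about 2.83, while any local hidden variable theory must keep |S| at or below 2.

```python
import math

def E(a, b):
    # Quantum prediction for the spin-singlet correlation
    # at analyzer angles a and b (in radians).
    return -math.cos(a - b)

# Standard CHSH angle choices.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)
print(abs(S))   # about 2.828, i.e. 2*sqrt(2)
print(2)        # the bound that every local hidden variable theory obeys
```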

The EPR-Bohm-Bell followers will tell you that their argument is more subtle than just saying that the uncertainty principle makes quantum mechanics incomplete. That is because the position and momentum (or Spin-X and Spin-Y) are both predictable by measuring the twin particle. But you can't measure both at once in the twin particle, so you cannot predict both at once.

If you are bothered by the uncertainty principle, then you are going to be bothered by any theory where electrons have wave properties. Electrons are observed to have wave properties. If you are bothered by the quantum twin paradox, then you are also going to be bothered by classical theories where someone might have info at a distance.

If you are not bothered by either paradox, then it is not clear why you would be bothered by the EPR paradox, because that is just putting the two paradoxes together. But there is a long list of intelligent physicists, from Einstein to Sean M. Carroll, who are tremendously confused by this combination.

Thursday, May 21, 2020

Carroll on many-worlds and entanglement

Physicist Sean M. Carroll has a new video explaining entanglement.

Towards the end, he tells us that he likes the many-worlds interpretation (MWI), but admits that it has two major problems. It doesn't explain anything about the macroscopic world, and it doesn't predict anything.

He does give a tortured argument for assigning probabilities that is supposed to match the Born rule, but he rejects any frequentist interpretation that would allow checking the probabilities against experiment.

The MWI is just nuts. Just listen to the advocates try to make sense of it. They are completely unable to make any scientific sense.

I am actually more disturbed by his explanation of entanglement. That is textbook stuff, and he gets it wrong. He says that he is writing an undergraduate quantum mechanics textbook under a contract with a publisher.

So far the quantum mechanics textbooks have been relatively free of the nonsense that Carroll peddles. I hate to think how many people are getting confused by him. I will post more on what he gets wrong about entanglement.

Update: I just listened to another Carroll Q&A. Most of his answers are correct and well-explained, but he sure has some peculiar views. Someone asked if some experiment could distinguish the Copenhagen and Many-Worlds interpretations.

He said no, because the Copenhagen interpretation is not well-defined! He said that no one knows what a measurement is, or any of the other things to make sense out of it.

This is completely crazy. Every QM textbook uses Copenhagen. So do nearly all the research papers. We have trillions of dollars of industry based on QM, from semiconductors to lasers to video screens, and it all uses Copenhagen. None of it uses MWI.

MWI is not well-defined. No one can say what a splitting of universes is, or what a prediction is, or anything that relates to a real-world experiment. There is no experiment that has ever confirmed any aspect of MWI. The problem with an experimental test is that MWI does not make any predictions.

Monday, May 18, 2020

Samsung now makes quantum phones

ExtremeTech reports:
However, you probably don’t have a quantum 5G phone. ...

The new Galaxy A Quantum is the first and only smartphone with a Quantum Random Number Generator (QRNG) inside. The device itself is based on the mid-range Galaxy A71 5G, which Samsung has already launched in numerous markets sans quantum technology. ...

According to Samsung, the Galaxy A Quantum is much more secure than other smartphones thanks to its QRNG hardware. This is completely separate from the SoC and other core hardware. It’s a tiny embedded chipset called the SKT IDQ S2Q000 that’s just 2.5mm square consisting of an LED and a CMOS sensor. The LED shines into the sensor to produce image noise, and the sensor interprets that as quantum randomness. These random noise patterns become the basis for truly random number strings.

The Galaxy A Quantum will go on sale May 22nd in South Korea for KRW 649,000. That’s about $530, which is a bit more than the A71 5G on which it is based.
Modern electronics uses quantum mechanics for all sorts of things, but this is not really quantum. It is just thermal noise.

Some of the promoters of quantum computing say that quantum computers could be used for random number generation, if nothing else. Of course, chips for reliable random number generation have been available for decades. Random numbers are needed for cryptography, but there are many ways to solve that with old technology.
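
For instance, every ordinary operating system has long exposed cryptographic-quality randomness with no quantum hardware at all; a minimal sketch in Python:

```python
import secrets

# Cryptographic-quality random bytes from the operating system's
# entropy pool and CSPRNG, with no quantum chip required.
key = secrets.token_bytes(32)   # e.g. a 256-bit symmetric key
print(key.hex())
```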

Friday, May 15, 2020

Bishop excoriated for free will belief

I often attack physicists who preach some mystical view of the world, but I should also credit non-scientists who are completely rational. Here is evolution professor Jerry Coyne's latest rant:
Yesterday, reader Neil called my attention to a particularly galling homily given yesterday by the Right Reverend Dr. David Walker, the Bishop of Manchester. It especially irked me because it was about free will—his idea that we have it in the libertarian form. ...

Walker rejects determinism, claiming that if we have no “choice” whether or not to commit an offense (i.e., the future is preordained), then human beings “have no moral responsibility for what we do.” ...

Now you may try to tortuously parse the good Reverend’s words to say what he really means is a compatibilistic free will that, deep down, accept determinism of our actions. But I think you’d be dead wrong, for Walker states at the outset that he clearly rejects the mathematically-based determinism of science. No, he’s talking about pure libertarian free will—the kind that his sheep accept.

I’m surprised that, in a country where—although there’s a state church—Christianity is on a precipitous decline, the BBC still emits a “thought for the day” that is invariably religious. Seriously, my UK friends, why does this persist? ...

Sorry, but I can’t happily ignore it. Broadcasting it would be illegal in the U.S. You can happily ignore it, but I’m afraid that I’m not in that boat with you. It enables faith and religion and stupidity. It’s just as if they had an “astrology of the day” thought!
I listened to the podcast, and I cannot find any fault with it. It is consistent with our best scientific theories.

Free will is a philosophical question, so I don't mind if Coyne has a different opinion from mine. But he goes farther, and seeks to censor alternate views, under the guise of purging unscientific thought.

I believe in free will, as I believe that personal experience in favor of it is compelling. If science had somehow proved determinism, I would have to reconsider, but the evidence is just the opposite. All the scientific evidence is against determinism.

Free will is essential to Christian ideas about moral responsibility, and to scientific underpinnings of experimentalism.

The determinist objectors to free will tend to degenerate into discussions like this:
[Responsive comment] To be fair to him, it’s not as though the Bishop could have chosen NOT to believe in libertarian free will.

[Coyne] And I could not have chosen not to excoriate him.

[Another comment] cannot argue with that…even if I so desired.
In my opinion, this is just philosophical silliness, and doesn't address what the Bishop said.

Monday, May 11, 2020

Physicists promoting fashionable nonsense

Jean Bricmont, Sheldon Goldstein, and Douglas Hemmick write in a new paper:
Hence, said EPR, in one of the most misunderstood, yet simple, arguments in the history of physics, the quantum state is an incomplete description of physical reality. It does predict the correct statistics, but does not describe completely the physical state of individual systems. Stated more precisely, it says that we must describe this pair of particles not only by their joint quantum state but also by other variables, often called "hidden," that determine the behavior of those particles when one measures their spin in a given direction.

What could be wrong with this conclusion? In 1964, John Bell showed that simply assuming the existence of these variables leads to a contradiction with the quantum predictions for the results of measuring the spin of those particles in different directions, one for the first particle and another for the second one (see [17] for a simple proof of this contradiction). Those predictions have been amply verified after Bell's publication (see [21] for a review).

But what does this imply? That we have no choice but to accept the analogue of the second branch of the alternative proposed about the coin tosses of Alice and Bob: that the measurement of the spin on one side affects instantaneously (if the measurements on both sides are made simultaneously), in some way, the result on the other side. This is what is called nonlocality or "action at a distance."
This is nuts. How do respectable academics get away with writing voodoo papers?

Jean Bricmont wrote the book Fashionable Nonsense with Alan Sokal, criticizing academics who cite pseudoscience to support wacky ideas. Bricmont himself is now a prime example of this nonsense.

He is entitled to his opinion, of course, but he is just wrong when he says that no other view exists. Here is how they start the paper:
Let us start with a physically classical situation: consider the proverbial Alice and Bob, situated far away from each other, and simultaneously tossing coins, over and over. One would expect the results on both sides to be random and uncorrelated. But suppose that the results appear indeed random but are also perfectly correlated: each time Alice's toss results in heads, Bob's toss also results in heads and similarly for tails.

Obviously such a strange situation would cry out for an explanation. One possibility is the following. First, Alice and Bob are able to manipulate their coin tosses so as to obtain whichever results they desire and second, they agree in advance on an apparently random sequence of results and manipulate their coin tosses so as to both obtain that sequence when they toss their coins.

This looks extravagant, but is there any other possibility? Well, yes, there exists an even more extravagant one: that when Alice tosses her coin, she instantly affects the trajectory of Bob's coin, so that Bob's coin falls on the same side as Alice's coin.

Of course, this looks even more incredible than the previous scenario. But we may still ask: is there a third possibility? We don't see any and we will assume from now on that the reader agrees with us on that point.
No, I do not agree. The third possibility is that Alice's and Bob's coin tosses have a common cause.

That is, someone tosses a coin, duplicates it, and hands one copy each to Alice and Bob. Then Alice and Bob each appear to be getting random tosses, except that the outcomes are correlated.
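
A minimal sketch of this common-cause scenario (my own toy simulation, not from their paper):

```python
import random

def source():
    # The common cause: one coin toss, duplicated and sent to both parties.
    outcome = random.choice(["H", "T"])
    return outcome, outcome

alice, bob = zip(*(source() for _ in range(10)))
print("Alice:", alice)   # looks like a random sequence
print("Bob:  ", bob)     # also looks random, yet matches Alice toss for toss
```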

This is what happens in the EPR experiments. Two distant particles are measured, but they were both emitted from the same source simultaneously.

The paper goes on to argue for Bohmian mechanics, as a nonlocal hidden variable theory. That theory agrees with quantum mechanics for some simple systems at least, so they are free to believe in it if they wish. But in all cases, local theories do a better job of explaining the data. They are just wrong to deny the possibility of local theories.

Bricmont previously wrote History of Quantum Mechanics or the Comedy of Errors:
The goal of this paper is to explain how the views of Albert Einstein, John Bell and others, about nonlocality and the conceptual issues raised by quantum mechanics, have been rather systematically misunderstood by the majority of physicists.
This paper argues that most physicists, textbooks, Nobel prizewinners, etc. are all wrong about quantum mechanics, because they do not believe in spooky action.

No, the textbooks are not wrong.

Bricmont quotes David Mermin:
To those for whom nonlocality is anathema, Bell’s Theorem finally spells the death of the hidden-variables program. ... Many people contend that Bell’s theorem demonstrates nonlocality independently of a hidden-variables program, but there is no general agreement about this.
That is correct. If you choose to believe in nonlocality, then it is possible to have a theory of nonlocal hidden variables like Bohm's. If you accept locality, then the hidden variables program is dead. A few physicists believe that Bell somehow proves nonlocality, but they are wrong.

Friday, May 8, 2020

The Dream Universe

Dr. Bee book review:
The Dream Universe is about “how theoretical physics is returning to its unscientific roots” and that physicists have come to believe
“As we investigate realms further and further from what we can see and what we can test, we must look to elegant, aesthetically pleasing equations to develop our conception of what reality is. As a result, much of theoretical physics today is something more akin to the philosophy of Plato than the science to which the physicists are heirs.”
He then classifies “fundamental physics today as a kind of philosophy” and explains it is now “less about a strictly rational understanding of the universe and more about finding a scenario that we deem intellectually respectable.” He sees no way out of this situation because “Observation, experiment, and fact-finding are no longer able to guide [researchers in fundamental physics], so they must set their path by other means, and they have decided that pure rationality and mathematical reasoning, along with a refined aesthetic sense, will do the job.”

I am sympathetic to Lindley’s take on the current status of research in the foundations of physics, but I think the conclusion that there is no way forward is not supported by his argument. The problem in modern physics is not the abundance of mathematical abstraction per se, but that physicists have forgotten mathematical abstraction is a means to an end, not an end unto itself. They may have lost sight of the goal, alright, but that doesn’t mean the goal has ceased existing.
I wrote a similar book several years ago, and I place the blame on Einstein worship, not math or theory.

While Einstein did much worthwhile work, he is largely idolized today by anti-positivists who believe that finding reality is based on finding aesthetically pleasing equations. You can see this most explicitly in how he is credited for relativity. He is almost never credited for what he actually did. He is either credited for the work of others, or for some anti-positivist philosophy.

Wednesday, May 6, 2020

Greene video pushes action-at-a-distance

Brian Greene has a series of educational physics videos, and his latest is on Bell's theorem:
Albert Einstein and his colleagues Podolsky and Rosen proposed a simple way to rid quantum mechanics of its most disturbing feature--called non-locality--in which an action undertaken here can affect the result of a measurement undertaken there, even if here and there are far apart. John Bell came up with a way to test Einstein's vision of reality, ultimately showing that Einstein's vision was wrong.
That text is correct, but if you listen to the video, Greene says that the world was proved to be nonlocal. He says measuring the spin of a particle can have an effect on a distant particle.

Greene sees the big issue as to whether a spin measurement is a random event at the time of the measurement, or it is predetermined in advance.

Briefly, quantum mechanics is somewhat strange because electrons act like waves, and you cannot measure their position and momentum at the same time. Einstein and others had an idea for replacing quantum mechanics with a classical theory of local hidden variables, because that would be more compatible with his deterministic prejudices. Bell's theorem and subsequent experiments proved that all those classical theories do not work. The world is quantum.

Bell did not show any nonlocality. He only helped show that the classical theories don't work. Almost everyone was convinced of that in 1930 anyway.

Watching this video will just get you confused. There is no action-at-a-distance.

Greene is very good at explaining a lot of physics, but he really goes off the rails when he talks about Bell's theorem, many-worlds, or string theory.

Thursday, April 23, 2020

Aaronson's world is crashing

I am afraid Scott Aaronson is losing it:
But I didn’t bash Devs. I watched a show that doesn’t merely get a few details wrong, but that’s entirely about taking a steaming dump on everything that I’ve spent my entire life trying to get through people’s heads—e.g., that quantum computers are not magic oracles, that they’re interesting because the stock sci-fi plots that you already knew don’t map onto them, because they illustrate how the actual world is more imaginative than our tropes, and also, that the people who work on these topics are something like the characters on “The Big Bang Theory” but nothing whatsoever like the characters on spooky dramas — and I described the show on my blog as “not that bad” (because it wasn’t). Do you have any idea what an effort of will that took? 🙂

Look, I’m going through a deep depression right now. Indeed, I’m finding it hard to understand anyone who isn’t depressed, given the terrifying state of the world, the morgues running out of room for more corpses, the collapse of people’s plans for their lives, the food deliveries that ominously no longer show up as the machinery of the world starts groaning to a halt, the clowns running wild in the control room, how easily this all could’ve been prevented but wasn’t, one’s own personal failure to foresee it. Why were we fated to be alive right now, to try to raise children now, right when the music of civilization finally stopped?
He is referring to Devs, a fictional TV show on Hulu. The clowns are Donald Trump and his associates in the White House. The collapse is the Wuhan virus lockdown.

Aaronson is a respected computational complexity professor, and is known to the wider public for his stereotypical nerdiness and for his futile attempts both to hype quantum computing as the greatest advance in the history of civilization and to debunk nearly everyone else's explanation of how it might possibly do something useful.

I watched a few episodes of Devs, and I am not sure why it is any more aggravating than The Big Bang Theory. In one scene, a professor explains the von Neumann-Wigner interpretation of quantum mechanics, while the know-it-all student shouts obscenities in favor of Everett many-worlds. Okay, it is a caricature, but it is TV fiction, and the real-life physicists are as ridiculous as some of the TV explanations.

Update: Aaronson posted again, with weird paranoid beliefs. He also said the leader of Google's quantum computing effort has quit, apparently over disagreements over what can be done. My guess is that management was pressuring him to get results, and he saw that the hype could not be continued indefinitely. Some careers could go south when the project fizzles.

Thursday, April 16, 2020

John Horton Conway dies

NY Times obituary:
John Horton Conway, the English-born Princeton mathematician whose body of work ranged from the rigorously highbrow to the frivolously fun, earning him prizes and a reputation as a creative, iconoclastic and even magical genius, died on Saturday in New Brunswick, N.J. He was 82.

His wife, Diana Conway, said his death, at a nursing home, was caused by Covid-19. ...

One of Dr. Conway’s favorite accomplishments was the Free Will Theorem, conceptualized casually over the course of a decade with his friend and fellow Princeton mathematician Simon Kochen and first published in 2006 (and later revised).

The theorem, simply put, is this: If physicists have free will while performing experiments, then elementary particles possess free will as well. And this, Dr. Conway and Dr. Kochen reckoned, probably explains why and how humans have free will in the first place.
I think this theorem is an important insight, but others downplay it, such as Scott Aaronson:
Closest to my wheelhouse, Conway together with Simon Kochen waded into the foundations of quantum mechanics in 2006, with their “Free Will Theorem”—a result Conway liked to summarize provocatively as “if human experimenters have free will, then so do the elementary particles they measure.” I confess that I wasn’t a fan at the time—partly because Conway and Kochen’s theorem was really about “freshly-generated randomness,” rather than free will in any sense related to agency, but also partly because I’d already known the conceptual point at issue, but had considered it folklore (see, e.g., my 2002 review of Stephen Wolfram’s A New Kind of Science). Over time, though, the “Free Will Theorem” packaging grew on me. Much like with the No-Cloning Theorem and other simple enormities, sometimes it’s worth making a bit of folklore so memorable and compelling that it will never be folklore again.
Really? Free will is just freshly-generated randomness?

In a sense, that is right. If I make decisions out of my free choice, they appear to be freshly-generated randomness to someone else who cannot predict what I do.

If he can predict what I do, then I don't have free will. So yes, you can say free will is nothing but freshly-generated randomness, but that is just a linguistic trick for devaluing it.

(Off-topic, a comment says about the corona virus, "This is going to be on the order of a standard flu season." Scott compares this to "Holocaust denial". Wow, a lot of smart people have gone mad. It does appear that the COVID-19 death total will be comparable to a bad flu season. Yes, Conway is reported to have died of COVID-19, but he was age 82 and living in a nursing home. Most of those who die of COVID-19 have multiple other health issues contributing to the death.)

Monday, April 13, 2020

Reductionism is firmly established

Dr. Bee defends reductionism:
A lot of people seem to think that reductionism is a philosophy. But it most definitely is not. That reductionism is correct is a hypothesis about the properties of nature and it is a hypothesis that has so far been supported by every single experiment that has ever been done. I cannot think of any scientific fact that is better established than that the properties of the constituents of a system determine how the system works. ...

Indeed, the whole history of science until now has been a success story of reductionism. Biology can be reduced to chemistry, chemistry can be reduced to atomic physics, and atoms are made of elementary particles. This is why we have computers today.
I agree with this, but see this recent paper:
The relationship between the chemical picture of an isolated molecule and that arising from the eigenfunctions of the Schrodinger Coulomb Hamiltonian [for] the isolated molecule are examined and discussed.
Apparently the program to reduce chemistry to atomic physics has run into some problems. We still need chemistry, because the atomic physicists cannot explain some basic stuff.

Thursday, April 9, 2020

Dr. Quantum Supremacy goes full panic

The Wuhan China virus is driving a lot of people batty, and now it has driven Scott Aaronson over the edge:
If the pandemic has radicalized you, I won’t think that makes you crazy. It’s radicalized me, noticeably shifted my worldview. And in some sense, I no more apologize for that, than I apologize for my worldview presumably differing from what it would’ve been in some parallel universe with no WWII.
Meanwhile, Lubos Motl thinks that he is our last sane man.

Hope you all are well. These guys seem to be living in parallel universes. This is the best argument for many-worlds I have seen yet!

My gut feeling is that when this crisis is all over, it will be clear that everyone overreacted.

Monday, April 6, 2020

What is Mathematics?

A math blog asks what mathematics is, and only provides silly answers:
Any argument carried out with sufficient precision.

Using arguments with more than two steps.

Mathematics is what mathematicians do.

Mathematics is the branch of natural philosophy that concerns itself with only making true statements.

Mathematics ought to be considered as a set of precise, symbolic, languages that serves as a lingua franca for the physical sciences.
All of these answers are unsatisfactory.

Mathematics is knowledge obtained by logical proofs.

Saying that mathematics is a language is like saying music or philosophy is a language. Sure they use language to communicate, but so does everyone else.

Using phrases like "sufficient precision" ignores the fact that some arguments are proofs, and some are not. Math demands proofs.

Saying "true statements" comes the closest to describing math, but of course many other fields claim to be finding truth. Only math finds it with logical proofs.