Statistician Andrew Gelman endorses this:
Probability is a mathematical concept. To define it based on any imperfect real-world counterpart (such as betting or long-run frequency) makes about as much sense as defining a line in Euclidean space as the edge of a perfectly straight piece of metal, or as the space occupied by a very thin thread that is pulled taut. Ultimately, a line is a line, and probabilities are mathematical objects that follow Kolmogorov's laws. Real-world models are important for the application of probability, and it makes a lot of sense to me that such an important concept has many different real-world analogies, none of which are perfect.

Mathematically, I agree with him. Those axioms do not even have anything to do with what non-mathematicians think of as randomness.
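For the record, Kolmogorov's axioms are just three conditions on a probability measure P over a sample space Ω (the standard textbook statement, not part of Gelman's quote):

$$P(E) \ge 0, \qquad P(\Omega) = 1, \qquad P\Big(\bigcup_i E_i\Big) = \sum_i P(E_i) \ \text{ for pairwise disjoint } E_i.$$

Nothing in them mentions randomness, betting, or frequencies.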
The curious thing is that he believes that quantum mechanics violates the laws of probability. This is because he was a physics major before becoming a statistician, and his professor gave him a faulty explanation of the double slit experiment.
The faulty explanation sometimes goes under the name "quantum logic". It says that the double slit experiment proves that the law of the excluded middle is false. You expect that if you fire a particle at a double slit, it goes thru one slit or the other. But you see an interference pattern, not just the sum of the patterns from particles going thru one slit or the other.
The much more reasonable explanation is that the particle is not a classical particle, but a wave. Waves show interference patterns, so nothing is surprising.
So which would you rather believe, that light has wave properties, or that the mathematical laws of logic and probability are broken?
Obviously, light has wave properties. The alternative is lunacy. It is like having a theory that predicts a value for 2+2, measuring 4, and then concluding that 2+2 is not 4. No, the better conclusion is that something is wrong with your theory or your measurement.
The theorems about conditional probability and the law of the excluded middle are mathematically valid, and as true in quantum mechanics as anywhere else.
Gelman says that the probability of the particle hitting a patch on the target screen should be the average of the conditional probabilities, where the condition is passing thru one slit or the other. But such a formula would not give an interference pattern.
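In symbols, the formula he has in mind is the law of total probability, with A and B the events of passing thru one slit or the other (my notation):

$$P(x) = P(x \mid A)\,P(A) + P(x \mid B)\,P(B) = \tfrac{1}{2}\big(P(x \mid A) + P(x \mid B)\big) \quad \text{if } P(A) = P(B) = \tfrac{1}{2}.$$

A sum of two single-slit bump patterns like this has no oscillating term, so it cannot reproduce the fringes.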
His error is thinking that a particle goes thru one slit or the other. Quantum mechanics says that it does not.
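A toy calculation makes the point. With complex wave amplitudes ψ₁ and ψ₂ from the two slits, the intensity is |ψ₁ + ψ₂|², which carries a cross term that the either-or mixture |ψ₁|² + |ψ₂|² lacks. A minimal sketch, with idealized point sources and parameter values of my own choosing:

```python
import numpy as np

# Toy parameters (illustrative, not a real apparatus).
x = np.linspace(-10, 10, 1000)   # position on the screen
k = 2.0                          # wavenumber
d = 1.0                          # slit separation
L = 20.0                         # distance from slits to screen

# Path lengths from each slit to each screen point.
r1 = np.hypot(L, x - d / 2)
r2 = np.hypot(L, x + d / 2)

# Complex amplitudes arriving from the two slits.
psi1 = np.exp(1j * k * r1) / r1
psi2 = np.exp(1j * k * r2) / r2

wave = np.abs(psi1 + psi2) ** 2                  # amplitudes add: fringes
mixture = np.abs(psi1) ** 2 + np.abs(psi2) ** 2  # "one slit or the other": no fringes

# The difference is exactly the interference cross term.
cross_term = wave - mixture
print(np.allclose(cross_term, 2 * np.real(psi1 * np.conj(psi2))))  # True
```

The cross term 2 Re(ψ₁ψ₂*) is precisely what the total-probability formula above can never produce.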
When he previously posted about this, others tried to explain the physics to him. But he is not interested in the physics of quantum mechanics; he is interested in applying statistics to the social sciences. Since, in his view, conventional probability theory is not good enough for physics, he wonders whether there should be a generalized probability theory covering both physics and the social sciences.
The physics problem is that unrealized experiments have no results. Maybe there is an analogous principle in the social sciences; I don't know.
For how different people view probability, see this:
Michael Lewis's book "The Undoing Project" is concerned with the (mathematical) psychologists Daniel Kahneman and Amos Tversky. (Kahneman won the 2002 Nobel Prize; Tversky died in 1996.) On page 157, this question is quoted:

The mean IQ of the population of eighth graders in a city is known to be 100. You have selected a random sample of 50 children for a study of educational achievement. The first child tested has an IQ of 150. What do you expect the mean IQ to be for the whole sample?

Tversky and Kahneman stated: "The correct answer is 101. A surprisingly large number of people believe that the expected IQ for the sample is still 100" in Psychological Bulletin, vol. 76, 105-110 (1971).

So he got a Nobel Prize (actually a Bank of Sweden prize in economics) for saying people are irrational for reasoning differently from him.
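The 101, for the record, is just the arithmetic of conditioning on the first child and taking the other 49 at the population mean:

$$E[\bar{X}] = \frac{150 + 49 \times 100}{50} = \frac{5050}{50} = 101.$$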
I do not think I would give 101 as the answer. It is much more likely that the test was mis-normed, or that the children were unusual, or that the sampling was not random. NN Taleb mocks this thought experiment with a more extreme example:
Fat Tony is the foil to Dr. John. Dr. John is nerdy, meticulous, careful and academic; Fat Tony is confident, loud, careless and shrewd. Both of them make errors, but of different types. Dr. John can make gigantic errors that affect other people by ignoring reality in favor of assumptions. Fat Tony makes smaller errors that affect only himself, but more seriously (they kill him). ...

The most famous contrast between the two is the question of what to think about a coin, assumed fair, that has come up heads 99 times in a row. Dr. John insists that because the coin is fair, the answer has to be 50%. Fat Tony deduces that the coin is not really fair, and says that heads is much more likely.
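Fat Tony's reasoning is just Bayes' rule with a prior that allows for trick coins. A minimal sketch; the 1% prior for a two-headed coin is my own illustrative number, not Taleb's:

```python
from fractions import Fraction

# Illustrative prior: a small chance the coin is a two-headed trick coin.
p_trick = Fraction(1, 100)          # assumed prior, not from Taleb
p_fair = 1 - p_trick

# Likelihood of 99 heads in a row under each hypothesis.
like_fair = Fraction(1, 2) ** 99    # fair coin
like_trick = Fraction(1, 1)         # two-headed coin always shows heads

# Bayes' rule: posterior probability that the coin is a trick coin.
posterior_trick = (p_trick * like_trick) / (p_trick * like_trick + p_fair * like_fair)

# Probability the next toss is heads.
p_heads_next = posterior_trick * 1 + (1 - posterior_trick) * Fraction(1, 2)

print(float(posterior_trick))  # ~1.0: almost certainly not a fair coin
print(float(p_heads_next))     # ~1.0: Fat Tony bets on heads
```

Even a tiny prior on trickery swamps the 2^-99 likelihood of 99 straight heads from a fair coin.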
This dichotomy in thinking goes back to Plato and Aristotle. Plato would make purely abstract theories, and doggedly insist on them. Aristotle was the empiricist. If theory differed from practice, Plato would side with the theory, and Aristotle with the practice.
Update: Gelman has commenters who make the usual mistakes about Bell and interpretations of quantum mechanics. Some say that Bell proved that locality implies an inequality inconsistent with QM, so the experiments prove nonlocality. Actually Bell assumed local hidden variables, so the experiments certainly do not prove nonlocality. They only show that local hidden variable models don't work.
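For reference, the standard CHSH form of the inequality the commenters are talking about: any local hidden variable model must satisfy

$$|E(a,b) + E(a,b') + E(a',b) - E(a',b')| \le 2,$$

while quantum mechanics predicts (and experiments find) values up to 2√2. The derivation uses the hidden variable assumption, which is why violating the inequality rules out local hidden variables rather than locality per se.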