Thursday, April 21, 2016

Two Dogmas of Empiricism

Modern philosophy of science is mainly characterized by its denial of truth. Philosophers are always arguing that everything is subjective, or that math has no foundation, or that the scientific method is invalid, or that philosophical musings are just as good as experimental results. They just had a conference devoted to non-empirical physics, whatever that is.

A core anti-truth document is the 1951 essay "Two Dogmas of Empiricism" by the Harvard philosopher and logician W. V. Quine.

The essay is widely praised, and even "regarded as the most important in all of twentieth-century philosophy".

The title says "two dogmas", but the essay argues that they are at root the same. The dogma is that empirical knowledge can be distinguished from other kinds.

He concedes that some knowledge is true by simple logic, without recourse to empirical investigation, like:
(1) No unmarried man is married.
But then he has a giant brain freeze trying to classify this:
(2) No bachelor is married.
This seems to be true by definition and logic, because "bachelor" is synonymous with "unmarried man". But then he complains:
But it is not quite true that the synonyms 'bachelor' and 'unmarried man' are everywhere interchangeable salva veritate. Truths which become false under substitution of 'unmarried man' for 'bachelor' are easily constructed with help of 'bachelor of arts' or 'bachelor's buttons.' Also with help of quotation, thus:
'Bachelor' has less than ten letters.
Therefore he argues that it is impossible to distinguish logical and empirical truths, and also that the whole program of scientific reductionism is invalid.

That's it. The argument is that stupid. Of course there are distinctions between logical and empirical truths, even tho the issues drive philosophers nuts. See my defense of logical positivism.

The English language is not as precise as mathematical logic. There are lots of synonyms in English, but that does not mean they are substitutable in all contexts. Sometimes you need a little context to understand which meaning of a term is being used. This is a trivial observation. Those who praise this essay are morons, or leftists who are ideologically opposed to objective truths.
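Quine's quotation case can be sketched as the use/mention distinction: inside quotation marks a word is mentioned (treated as a string of characters) rather than used, so substituting a synonym changes the subject matter. A toy Python illustration:

```python
word, synonym = "bachelor", "unmarried man"

# In a "use" context the two terms pick out the same men, so truth is
# preserved under substitution. In a "mention" (quotation) context the
# claim is about the word itself, and substitution changes the claim:
print(len(word) < 10)     # True: 'bachelor' has fewer than ten letters
print(len(synonym) < 10)  # False: 'unmarried man' has thirteen characters
```

The point is trivial once stated this way: the quotation context talks about the string, not the man.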


  1. Even in math you need context; context is the only thing that gives math any actual value. If math could not be used to model actual things and take measurements to solve problems about actual things, no one would use it except for astrology charts.

    As for mathematical precision, it doesn't mean a thing if the numbers are generated by bullshit. GR does not allow for space to be compressed at all, only curved, because of the math. Einstein's field equations are also non-linear, so there are no known solutions for a spacetime containing more than a single mass... (no interactions can be modeled with one thing). Yet everyone in physics thinks they confirmed this theory with a 'chirp' of colliding black holes 'predicted by Einstein's math', found by a device on the surface of the Earth which somehow can weed out all extraneous vibrations down to a scale smaller than the width of a proton... utter hogwash. Physicists can't even account for over 95% of the mass in the universe, yet they can weed out all possible sources of vibration and confirm colliding theoretical black holes billions of light years away? Right. I was born yesterday. GIGO all the way, baby. If your methodology sucks, and your theory is bogus, all the decimal places of accuracy in the world will not save you.

    1. The modern physicists have totally embraced probability. Their reasoning is as follows: a horde of monkeys hammering away on typewriters might produce one of Shakespeare’s sonnets or a theory of everything (TOE). Might as well keep hammering!
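The monkeys-at-typewriters image can be put on the back of an envelope. A sketch, assuming a 27-key typewriter (26 letters plus the space bar) and a target phrase of 15 keystrokes, both made-up numbers for illustration:

```python
# Chance that one random burst of 15 keystrokes on a 27-key typewriter
# matches a chosen 15-character phrase, and the expected number of
# attempts before a match.
keys, length = 27, 15
p_match = (1 / keys) ** length      # probability of one lucky attempt
expected_attempts = 1 / p_match     # roughly 3e21 attempts on average
print(f"{expected_attempts:.2e}")
```

Even a short phrase needs on the order of sextillions of attempts, which is the force of the image.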

  2. Quine is really being quite an imbecile here. What he just tried to disprove is the possibility of human communication. But here we are communicating!

  3. This is what absolutely cracks me up. According to probability, there is no absolute certainty that something will happen... and if the event actually does happen, well, technically that's a 100 percent certainty... which is impossible according to probability. Sigh. Facepalm. Self-evident observation trumps predictive calculation. Always. Math does not inform anything in reality; it's the other way around, folks. Given the apparent success of various bullshit-to-billions research programs, I just need a way to convince the federal government to give me a billion dollars to pretend-match a signal (of my own creation) to a database of artists' depictions of a unicorn farting, in order to prove unicorns are farting in another galaxy. I won't let the tiny little details (that a unicorn has never been observed, observed farting, or had said farts sniffed) get in my way! Isn't science grand? If this isn't alchemy, what is?

    Speaking of strange imaginary odors...
    The only thing detected at LIGO was a match of a purported signal (received mere moments after they turned the damn thing on; oh my stars, what are the calculated odds of that happening... think really big numbers) to a preconceived database of made-up signals based on what they IMAGINED a colliding pair of black holes would look/sound/smell like. Since no colliding black holes have ever been positively observed, much less a black hole (singular) been actually 'observed', I find it almost black comedy that people are convinced an 'observation' or 'detection' has been made at all, based on a hypothetical object defined by a theory concocted by Hilbert (not Einstein) which cannot produce said hypothetical object without an eternal, asymptotically flat mathematical space with all matter and energy removed (Ric = 0), and which cannot even accommodate more than one object (it's highly non-linear). How does one predict anything with a math model purportedly designed to model gravity interactions (the only actual way to detect and measure gravity) which CANNOT accommodate more than a single source of gravity? Please let me know how this is done; I'd love to publish it in Astro-Scatology Today, right after the Brooklyn Bridge resale real estate classifieds.

    1. Continuous probability theory is complete nonsense. Measure theory says zero-probability events are possible and probability-one events are not guaranteed. It's self-contradictory, just like completed infinity. As far as experiments go, we are getting so refined that the probability of confusion rises substantially. When dust can screw something up, how much further can we go and confirm the steps? It's like checking a really long proof. At some point, you really aren't able to be certain, except maybe with a computer.
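The measure-theory phenomenon described above can be seen in one line, a sketch using Python's standard `random` module: under the continuous uniform model on [0, 1), every exact value has probability zero, yet each draw lands on some exact value.

```python
import random

x = random.random()    # one draw from the continuous uniform model on [0, 1)
# Under that model, P(X == x) = 0 for the exact value just drawn.
# The textbook resolution: "probability zero" means measure zero, not
# impossible, and "probability one" (almost surely) is not a logical
# guarantee. Whether that resolution is satisfying is the dispute here.
print(0.0 <= x < 1.0)  # True
```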

      Computers can model anything we can coherently describe with mathematics, including worlds that don't exist (just see a recent Hollywood movie). People have done all kinds of simulations of black holes and galaxies, so don't get hung up on the retarded theorists living in the 17th century and using outdated mathematics. They don't even know how to model physics. They're overrated clowns with little understanding of how simulations are actually accomplished. The methods and simulations are more complicated than what they work on, but self-consistent ideas can appear before our very eyes. Seeing is really believing. What is true is another matter.
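A minimal sketch of the simulation point, assuming plain Newtonian gravity (not GR) and only Python's `math` module: two interacting masses, which have no general closed-form solution either, are integrated numerically with a leapfrog (velocity Verlet) scheme. Non-linearity poses no obstacle to simulation.

```python
import math

# Two-body Newtonian orbit in the relative coordinate, leapfrog scheme.
G, m1, m2, dt = 1.0, 1.0, 1.0, 0.001
mu = G * (m1 + m2)

x, y = 1.0, 0.0              # relative position, r = 1
vx, vy = 0.0, math.sqrt(mu)  # speed for a circular orbit at r = 1

def accel(x, y):
    r3 = math.hypot(x, y) ** 3
    return -mu * x / r3, -mu * y / r3

ax, ay = accel(x, y)
steps = int(2 * math.pi / math.sqrt(mu) / dt)   # about one full orbit
for _ in range(steps):
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay    # half kick
    x += dt * vx; y += dt * vy                  # drift
    ax, ay = accel(x, y)
    vx += 0.5 * dt * ax; vy += 0.5 * dt * ay    # half kick

print(abs(math.hypot(x, y) - 1.0) < 1e-3)       # True: orbit stays circular
```

The symplectic integrator keeps the orbit on the unit circle to within a tiny error over the full period, which is why schemes like this are the workhorse of orbital simulation.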

      Non-linearity is actually a good point to make about quantum mechanics, which is supposed to be linear but may only be an approximation. The whole emphasis on particles and probabilities is an interpretation. Here is Hawking in his A Brief History of Time:

      "We now know that Laplace’s hopes of determinism cannot be realized, at least in the terms he had in mind. The uncertainty principle of quantum mechanics implies that certain pairs of quantities, such as the position and velocity of a particle, cannot both be predicted with complete accuracy. Quantum mechanics deals with this situation via a class of quantum theories in which particles don’t have well-defined positions and velocities but are represented by a wave. These quantum theories are deterministic in the sense that they give laws for the evolution of the wave with time. Thus if one knows the wave at one time, one can calculate it at any other time. The unpredictable, random element comes in only when we try to interpret the wave in terms of the positions and velocities of particles. But maybe that is our mistake: maybe there are no particle positions and velocities, but only waves. It is just that we try to fit the waves to our preconceived ideas of positions and velocities. The resulting mismatch is the cause of the apparent unpredictability."

      The same book also admits that modern scientists just study unobservable extremes, and that singularities and mathematical infinities are just artifacts of mathematical systems that lack QM and modern methods.