Brain scientists do not buy into this quantum stuff, so I assumed they believe in bits. But apparently not:
Most neuroscientists believe that the brain learns by rewiring itself—by changing the strength of connections between brain cells, or neurons. But experimental results published last year, from a lab at Lund University in Sweden, hint that we need to change our approach. They suggest the brain learns in a way more analogous to that of a computer: It encodes information into molecules inside neurons and reads out that information for use in computational operations. ...

I guess not much is known about how the brain stores information.
From a computational point of view, directions and distances are just numbers. ...
Neuroscientists have not come to terms with this truth. I have repeatedly asked roomfuls of my colleagues, first, whether they believe that the brain stores information by changing synaptic connections — they all say, yes — and then how the brain might store a number in an altered pattern of synaptic connections. They are stumped, or refuse to answer.
Roger,
People are not computers. You have as much in common with an electronic binary computer design-wise as you do with an abacus, a pocket watch, or a mechanical adding machine.
Binary computation and human intelligence have little in common structurally, for reasons that go well beyond what they are made of. Binary computation was settled on as the most efficient design for electronic computing machines on engineering grounds: an on/off signal is much easier to distinguish than three, four, or ten or more different states, and binary logic is mathematically the simplest structure that can still perform computation (thank Turing). Human neurology operates quite differently. Neurons, and how they connect, signal, and are signaled in return, are far more complicated than binary logic gates. They do not operate on an on/off basis, and can have many different levels of excitation in ways still not really understood, much less modeled. Computer neural nets designed to mimic only a few of these functions do not come close to mimicking even a few interconnected neurons in a rat's brain. You cannot compare binary bits or digital memory allocations with what a brain does.
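To make the on/off-versus-graded contrast concrete, here is a toy sketch in Python (entirely my own illustration, and a deliberate caricature, since real neurons are vastly more complex than any formula like this):

```python
import math

def and_gate(a: int, b: int) -> int:
    """A binary logic gate: inputs and output are strictly 0 or 1."""
    return a & b

def graded_unit(inputs, weights, bias=0.0) -> float:
    """A toy neuron-like unit: its output is a continuous level of
    excitation between 0 and 1, not an on/off state."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # logistic squashing

print(and_gate(1, 0))                       # 0 -- all or nothing
print(graded_unit([0.2, 0.7], [1.5, 0.9]))  # ~0.72 -- a graded level
```

Even the graded unit is exactly the kind of "computer neural net" simplification the commenter objects to; it captures levels of excitation and nothing else.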
As you have said on numerous occasions, we can't even fully mathematically model a single electron, much less a neuron. Much, much less a group of neurons; nowhere in the ballpark for even a single section of brain, and nowhere in the same galactic cluster for a human intelligence.
Okay, fine, but when someone memorizes 1000 digits of pi, that info has to be stored somewhere somehow.
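For scale, a back-of-the-envelope calculation (my own, not from the exchange): each decimal digit carries log2(10), about 3.32, bits, so 1000 digits fit in roughly 416 bytes under an efficient binary encoding.

```python
import math

digits = 1000
bits = math.ceil(digits * math.log2(10))  # each decimal digit carries ~3.32 bits
print(bits, "bits, i.e. about", math.ceil(bits / 8), "bytes")  # 3322 bits, about 416 bytes
```

That says nothing about how the brain does it, only how small the payload is in computer terms.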
Roger,
Think of it this way: look at how much memory is required to store a digit in eight bits. Now look at how much is required to store the same digit in sixteen bits, thirty-two bits, sixty-four bits, and one hundred twenty-eight bits. The problem is that the more memory you wish to access, the larger the number of bits required to address it... in binary logic systems. We know so little about how the brain works, and the only actual 'where' we know is that it is located somewhere between a human's ears. 'Somewhere' and 'somehow', as you put it, leaves a universe-sized question mark.
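If I read the addressing point right (my interpretation, not the commenter's wording): in a binary machine the number of address bits needed grows with the logarithm of the memory to be reached, as this small sketch shows.

```python
import math

# To pick out N distinct memory locations you need ceil(log2(N)) address bits.
for size in [2**8, 2**16, 2**32, 2**64]:
    print(f"{size:>20} locations -> {math.ceil(math.log2(size))} address bits")
```

Nothing resembling an address bus is known in the brain, which is presumably part of the commenter's point.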
Because of the 'holographic-like' properties of human memory, a 'bit', as you put it, might be stored in some manner that does not fit the utterly ridiculous worldview of the 'universe as a great big binary computer, and everything as information' meme.
The word 'red' in English is three letters long. Many people think this means that 'red' is therefore only three characters' worth of information (many mathematicians think so). But what if you are French? 'Rouge' is five letters long: the same meaning but a different length (requiring more bits of computer memory to store) in a different language. And here is the fly in the ointment that no one is seriously talking about: neither 'red' nor 'rouge' actually contains any information about the color whatsoever; each is merely a label. The information of 'red' as you perceive it is contained in your head. The words, in whatever language, trigger your understanding of the color, which is associated with a concept drawing on your collective memories. So the real problem is how many bits of computer memory are required to store a trigger for a concept in a human brain, not how much information is stored in the word itself. This is why mathematical models of AI are such a bust. An intelligence plays the game of chess; an intelligence is not the chess game.
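One way to picture the label-versus-concept distinction (a toy sketch of my own; the names and values are illustrative, not anything from the comment):

```python
# The word is just a key; the 'information of red' lives in whatever
# structure the key points at.
concept_red = {
    "wavelength_nm": (620, 750),            # rough visible-light range for red
    "associations": ["fire", "blood", "stop signs"],
}
lexicon = {"red": concept_red, "rouge": concept_red}  # English and French labels

# Labels of 3 and 5 letters point at the very same concept:
print(lexicon["red"] is lexicon["rouge"])   # True
```

The key lengths differ; the thing retrieved does not.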
If you sing the 1000 digits of pi (my mother can sing about fifty), you encode the digits in yet another format altogether, one that has nothing to do with digits at all but with the rhythmic properties of sound. In truth, we have no way of knowing how many bits are required to store actual information at all. We only know how many or how few characters on paper, impressions in clay, markings on stone, or bits in a computer are required to encode data that a human can understand.
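The format-dependence is easy to demonstrate for computer encodings, at least (my own illustration): the same digits cost different numbers of bits in different formats, and none of those numbers tells us what a sung version costs a brain.

```python
import math

digits = "14159265358979323846"  # 20 digits of pi after the 3

ascii_bits = len(digits) * 8                         # one byte per character
bcd_bits = len(digits) * 4                           # packed binary-coded decimal
floor_bits = math.ceil(len(digits) * math.log2(10))  # information-theoretic minimum

print(ascii_bits, bcd_bits, floor_bits)              # 160 80 67
```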