og_kalu 1 day ago

>It's meant to be so obviously absurd that a person inside the room, merely substituting symbols, should in any sense understand the meaning of those symbols.

Alright, so what neuron in your brain "understands" English? Hell, feel free to name any part, regardless. This is why the Chinese Room is nonsensical. Either you admit the system can understand even when none of its constituents do, or you admit that you don't understand anything at all either. At least either conclusion would be consistent.

Unfortunately, many people take the nonsensical middle road: "Oh, that doesn't understand, but I certainly do, just because."

throw4847285 1 day ago

I don't understand why people get so up in arms about the Chinese room. It's very clear that a major part of human intelligence is a mental model of the physical world, and linguistic concepts have an (often complex) relationship to that model. There's no magic here. Nothing about the argument implies anything about neurons. The process of forming a mental model of the world and mapping words onto it could easily take place across many, many neurons within the human brain, because it does! It does not take place in an LLM. That does not imply that nobody will ever develop a positronic brain that could do the same. We just clearly haven't done so yet.

Saying, "if you can't point to the neuron that does X, then you can't prove X happens" isn't a scientific perspective. It's a willfully ignorant one. If you're confident in the scientific process, then we will eventually understand how all kinds of human mental processes make sense in the context of neural networks.

og_kalu 1 day ago

The point is that the Chinese room is nothing more than an appeal to absurdity. The fact that opening the box reveals mechanisms we would not call understanding does not mean the system, the Chinese room as a whole, does not understand. The neuron comparison demonstrates exactly that: the brain is a Chinese room. It doesn't have to come down to a single neuron, either; feel free to open the box and show any of us something happening in there that we would call understanding.

>It does not take place in an LLM.

I don't know what else to tell you, but LLMs absolutely do model concepts and the physical world, separate from the words that describe them. This has been demonstrated several times.

mjburgess 1 day ago

The Chinese room does not aim to show, nor does it show, that part-whole relationships fail; it is not even about part-whole relationships.

Yes, neurones do not understand "pen" -- but some highly particular whole bodies do (i.e., English-speaking people). That's because of highly particular relationships between those neurones, the body, the environment, and the history of that language user.

This is the comp-sci brain rot that Searle is baffled by. Symbol manipulation implies no relationships between wholes and parts. The capacity to understand meaning requires extraordinarily specific ones.

og_kalu 1 day ago

What is the difference between "English-speaking people" and "the Chinese room"? The problem with Searle's argument is that the Chinese room is nothing but an appeal to absurdity, a sleight of hand. I'm supposed to think, "Oh, this is so absurd, of course the room doesn't understand," but the appeal falls apart once you realize that the same logic could be applied to any computational process, including human cognition. The distinction Searle draws between a person who genuinely understands English and a system that mechanically manipulates symbols is, in essence, arbitrary. Both are systems that have demonstrated understanding.