ht-syseng 7 days ago

The word is just a 'pointer' to the underlying shared experience, so I don't think you could; the kid would come away thinking red is the same thing as the feeling of splinters or the warmth of a sunset, which isn't what red is - those are just feelings associated with red. That said, I'm actually pretty confident we'll be able to have simple "conversations" with dolphins and whales in my lifetime, made of small snippets of information. Maybe not the complex grammatical structure we're used to, but small stuff like "I'm hungry". I'm not sure a dolphin could understand "fish or octopus for dinner?", because they might not have any notion of a logical OR, and they might not even differentiate between fish and octopus.

We do (presumably) share experiences of hunger, pain, happiness, the perception of gradations of light and of shape/form, and some kind of dimensionally bound spatial medium in which they exist as individuals and move through. Of course they might not conceive of these as "dimensions" or "space", but they would surely have analogs for directional words. Given that they aren't constrained to live on top of a 2D surface, though, these might not be "up", "down", "left", and "right", but something in the vein of "lightward" and "darkward" as the two main directions, plus some more complicated 3D rotational coordinate system for modeling direction. Who knows, maybe they even use quaternions!
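(Tangent, but the quaternion quip is less silly than it sounds: a quaternion is a compact way to encode a 3D heading as a rotation away from some reference direction. A minimal sketch in Python, purely illustrative - the choice of "lightward" as the +z reference axis and the 30-degree pitch are made up for the example:)

    import math

    def quat_mul(a, b):
        # Hamilton product of two quaternions given as (w, x, y, z) tuples
        aw, ax, ay, az = a
        bw, bx, by, bz = b
        return (aw*bw - ax*bx - ay*by - az*bz,
                aw*bx + ax*bw + ay*bz - az*by,
                aw*by - ax*bz + ay*bw + az*bx,
                aw*bz + ax*by - ay*bx + az*bw)

    def rotate(v, axis, angle):
        # Rotate vector v about a unit axis by angle (radians) via q v q*
        half = angle / 2.0
        s = math.sin(half)
        q = (math.cos(half), axis[0]*s, axis[1]*s, axis[2]*s)
        q_conj = (q[0], -q[1], -q[2], -q[3])
        w, x, y, z = quat_mul(quat_mul(q, (0.0, *v)), q_conj)
        return (x, y, z)

    # Hypothetical convention: "lightward" is +z (toward the surface);
    # a heading pitched 30 degrees away from it, about the x axis
    lightward = (0.0, 0.0, 1.0)
    heading = rotate(lightward, (1.0, 0.0, 0.0), math.radians(30))
    print(heading)  # approximately (0.0, -0.5, 0.866)

The point being that once you drop the 2D-surface assumption, "direction" naturally becomes "orientation", and rotations compose more cleanly than stacked up/down/left/right words.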

srean 7 days ago

Very poetically put, and I absolutely agree.

For the subset of shared experiences and emotions this should be possible. Not only that - I feel we must try (as in, it's a moral/ethical obligation).

Training an ML model on dialogues alone will not be enough. One would need to spend a lot of time building up a wealth of shared experiences, so that the mapping/correspondence can be learned from them.
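(One concrete, purely hypothetical way to picture "learning the mapping from shared experiences": if each shared moment yields a pair of vectors - the same moment encoded in two different representation spaces - a linear alignment can be fit from those pairs alone. The data, dimensions, and names below are made up; this is a sketch of the idea, not a claim about how such a system would actually be built:)

    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical: each row is one shared moment ("hungry", "pain", ...)
    # encoded in two different representation spaces, A and B.
    moments_a = rng.normal(size=(50, 8))                       # 50 shared moments, 8-dim space A
    true_map = rng.normal(size=(8, 8))                         # unknown correspondence we hope to recover
    moments_b = moments_a @ true_map + 0.01 * rng.normal(size=(50, 8))

    # Least-squares fit of a linear map from space A to space B,
    # using only the paired examples of shared experience.
    W, *_ = np.linalg.lstsq(moments_a, moments_b, rcond=None)

    # A new "utterance" in space A can now be projected into space B.
    new_moment = rng.normal(size=(1, 8))
    print(new_moment @ W)

The dialogue data alone gives you structure within each space; the shared experiences are what give you the paired anchors needed to align the two.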