Uncommon Descent Serving The Intelligent Design Community

Language: Did sounds or words come first?


From Phys.org:

“It may seem counterintuitive, but it is not quite as simple as saying sentences evolved before grunts,” Collier explained. “Animal calls or grunts most probably existed before ‘sentences.’ Most of these calls do not have meaning in the way that human words have meaning. A few have what we call functional reference, where they seem to denote an external object or event, such as a leopard for example. However, these calls cannot be decomposed into smaller sounds. They come as a single unit, unlike our words that are made up of several sounds that are reused in many different words. This is why we argue that there are no known examples of phonology in animal communication. On the other hand, as discussed in our paper, several species seem to combine these referential calls together to obtain new meanings in a similar way to very simple sentences in human language, which is why we argue that they may have a form of rudimentary syntax.”

Further evidence in support of the idea that syntax evolved before phonology in human language comes from analyzing a variety of human languages themselves, including sign languages. As far as linguists know, all human languages have syntax, but not all have phonology. The Al-Sayyid Bedouin Sign Language (ABSL) used by a small society in the Negev region of Israel is an emerging language that has been around for less than 75 years. Interestingly, it does not have phonology. For the ABSL, this means that a single object can be represented by a variety of hand shapes. However, the ABSL still has syntax and grammatical regularity, as demonstrated by the existence of rules for combining signs. Perhaps the presence of syntax but not phonology suggests that syntax originates first in the evolution of a young language, and perhaps also that it is simpler than phonology.

When looking at this hypothesis more closely, many aspects of it make sense. From a cognitive perspective, syntax may be simpler to process than phonology because it is easier to remember a few general rules than many phonemes. Having syntax allows speakers to express many concepts with only a few words. As language develops further, and still more concepts need to be communicated, phonology emerges to provide a larger vocabulary. The evolution of phonology may also be strongly influenced by cultural, rather than biological, evolutionary processes. The researchers hope to further develop these ideas in the future. More.

Examining human languages without phonology, such as ABSL, may provide more useful answers than studying chimp pant-hoots or monkey pyows.

See also: Why human language is hard to address in a mechanistic way

Follow UD News at Twitter!

Comments
A few years ago I had a short dream in which I was a dog, a most uncanny and unsettling experience that will probably never leave me. I walked and ran and saw. But no decision-making on direction or sense of purpose. I didn't think. Can thought, as we experience it, even exist without language? Surely a new-born baby reacts only instinctively to stimuli?

Paul White
July 12, 2014, 07:36 AM PDT
Getting rather carried away now... Other observations that tend to refute an underlying analytic basis for comprehension:

1. We do not always remember verbatim what we heard, just the main sense of it. For example, "..... because her Mum said so" is later reported as "..... her mother said so". Incidentally, the talent for reporting verbatim is far from universal.

2. Effective comprehension can be severely degraded if only the vocal component is received, such as on the telephone, when facial expressions and other gestures are absent.

3. All kinds of noise are filtered out by the listener: (a) so often we hear what we expect to hear, not what was said; (b) an intrusive noise may mask one or more words that the brain easily reconstructs without the listener even being aware of the fact; (c) a mispronounced or inappropriate word in a relatively non-significant context is simply ignored; (d) a non sequitur is often dismissed (though sometimes stored up for later clarification).

4. So much of the foregoing points to verbal communication being a holistic skill. This is closely paralleled by other animal communication in the wild: "climb on my back", "danger in the air", "I'm really hungry"; rather than "this is a stick" + "poke it in there" + "pull it back out" + "eat ants".

5. The abstract: (a) analytical and logical thought processing is a learnt skill, taking place (if at all) at a rather late stage of development. Why should this, for language only, be so easy for pre-school children? (b) Babies can enjoy "easy listening" music at birth or before; only from the teens onwards does it become possible to "appreciate" "cerebral" music, abstract art, etc.

Must stop there and wait for some howls of protest.

Paul White
July 12, 2014, 07:23 AM PDT
I'm not even remotely qualified to pontificate on linguistics, but I have read much and understand a little. It has been said (somewhere?) that we perceive and interpret speech in large units, such as a phrase or clause at a time. This seems to confirm a huge difference between baby-speak and later (adult-like) communication mechanisms. Babies learn one word at a time; so do adult learners of a new language, but they start out with virtually no faculty for making much sense of complete utterances. Young children eventually get the hang of syntax - apparently - and start to form well-structured speech, though much later than they seem to comprehend it. Adult learners *may* just develop complex language by constant repetition of key phrases, gradually building a useful stock of standard utterances and responses and gradually expanding the size of memorised units. That is my own conclusion from late-in-life attempts with an Asian language, and very frustrating it is.

This very much chimes with the observation (somewhere?) that a large part of normal conversation is formulaic, and it may help to explain another most curious phenomenon. With drunken or otherwise distorted speech, especially the kind of "super-lazy" quick responses all too common in sloppy dialogue, the listener can usually cope with an astonishing degree of corruption from standard phonology. For example, one may ask "Are you going out?" but receive back something like "sh'thin'so, s'rain'n". Obviously the first speaker is programmed to expect a range of positive and negative responses, and that includes a range of common qualifications and justifications. It looks to me as if phonetic and grammatical analysis play absolutely no part in this kind of interpretation, which is based entirely on a highly fuzzy pattern-matching ability. Tantalizingly, a yes-no answer can often be inferred from as little as a (tonally significant) noise such as "nnnnnnnn" with a slow downward tone, or "hhh" with a short rising-falling profile.

Is there a case for suspecting that language typology in all its manifestations is a more-or-less historical accident (cf. other cultural variations like costume) that has little to do with functional needs?

Paul White
July 12, 2014, 05:50 AM PDT
Mung: Thoughts came first. Sounds are the only way we can express the thoughts of our hearts/thinking. There was no actual lag in Eden; our sounds were organized right away, and so there might be more underlying all this. I say the clue is that our tones of voice are more important than our words. It is the tones that are the origin of music. Words are possibly just pieces of tones, just broken-up tones, and then all is memorized.

Robert Byers
July 5, 2014, 12:49 AM PDT
If sounds came first, all our words would sound like farts.

Mung
July 4, 2014, 10:53 PM PDT
I agree with posters here that thoughts came first; using sounds to express them came second. However, in Eden it was quick as a moment. It is all about memorizing. That's why kids so easily learn languages: it's easy to memorize the order of sounds to express thoughts. It's impossible that language evolved from animal grunts. It's the organizing and then the memorizing that demand a prior understanding that one wants to express thoughts, not mere desires. How could primates organize together all the sound combinations, even for limited speech? We would have trouble if we were all struck mute and then had to start over with speech. The complexity of human language follows human thought complexity. No way a primate would ever start up a language for minor needs. Language is a good point for Genesis.

Robert Byers
July 3, 2014, 07:16 PM PDT
Babies begin composing thoughts before they are born. In what language do they compose thoughts? It can't be sign language, because they don't have anyone to sign to, unless they have a twin. And it can't be grunts, because their airways are clogged with fluid. So what are their first thought words? Mama? Warm? Snuggle?

There is a tribe on the north coast of South America that formed from individual African slaves who escaped into the jungle. Because they were not all from the same tribe, they didn't share a common language, so they invented a combined language. And one of the observations by Europeans who dealt with them was that the CHILDREN invented/imposed the grammar, imposing order on the composite nouns and verbs (ALWAYS nouns first). Humans are wired for language.

Young children make every sound that they hear their parents make before the child starts to talk. This takes some time because the parents of course don't speak Baby, the language in which the child composes her thoughts. So Baby must learn English, etc., as a SECOND language. Shows how smart babies are.

mahuna
July 3, 2014, 04:28 PM PDT
When looking at this hypothesis more closely, many aspects of it make sense.
Not as much sense as Werner Gitt's Theory of Information, which follows scientific logic and not imaginary speculation.
From a cognitive perspective, syntax may be simpler to process than phonology because it is easier to remember a few general rules than many phonemes. Having syntax allows speakers to express many concepts with only a few words. As language develops further, and still more concepts need to be communicated, phonology emerges to provide a larger vocabulary.
Wouldn't it make more sense for processing syntax and processing phonology to be totally unrelated designs, with the benefit of the former design optimizing the efficiency of the latter design? Syntax requires what Gitt refers to as the statistical level (phonology, alphabet, sign language, other kinds of signals, etc.), but syntax doesn't come from it. Syntax rides on top of phonology because each level of information was designed top-down and is implemented bottom-up.
The evolution of phonology may also be strongly influenced by cultural, rather than biological, evolutionary processes.
Doesn't it make more sense for the evolution of phonology to have happened quickly, as in one day when all humanity resided in the area of Babel, thereby making cultural processes strongly influenced by phonology, since each grouping of humans with the same language would need to congregate within the same geographical area in order to survive?
The researchers hope to further develop these ideas in the future.
The researchers hope to muddy the waters a bit longer until additional funding can be extracted from an unsuspecting public.

awstar
July 3, 2014, 08:02 AM PDT
