Hacker News

What do you think of http://en.wikipedia.org/wiki/Nicaraguan_Sign_Language ?

It certainly makes me doubt that language (the technology of it) is purely cultural and learned.




Language is a set of behaviors. Behavior, for humans and other animals, requires a brain and body (almost always). The organisation, growth, and function of the brain and body are influenced by genetics.

So we can say first and foremost that language is influenced by genetics. Having a tongue, vocal cords, hands, and ears all helps in acquiring language.

So now the question is to what degree does genetics influence language and language acquisition?

Is it fully governed by genetics? We know that removing a child from the company of others during development eliminates complex language and severely stunts their ability to acquire language. So, in a single individual, language has never been observed to arise spontaneously, and the facility to acquire language is something you can lose.

If you damage certain portions of the brain, aspects of language can be lost. See fluent aphasia for a particularly odd example. However, these aphasias are known to remit, and studies show that other portions of the brain take over from the damaged portion. So we know that the capacity for language is not entirely localized to a genetically determined location.

We also know there was a time when there wasn't language and now there is, so at least once in human history language did arise spontaneously. The article you mention is evidence that language can evolve spontaneously in groups of humans. In fact, there is evidence that the rudiments of language arise in groups of many animal species, including apes, whales, and dolphins. Many animals communicate, but with insufficient sophistication to be described as a language.

So it is safe to say that there are genetic factors in producing communicative behavior: sounds, postures, marking, and so on. I feel it is also safe to say that there are genetic factors that predispose groups of some species to develop more complex systems of communication (but these are not limited to humans), and that groups of humans are particularly good at complex communication.

Unfortunately for Chomsky, there is little to no evidence that any one type of language is more likely than another. Grammars, phonemes, words, abstract concepts ... all of these show huge between-language variation. The similarities between languages are well explained either by the physical characteristics of speech production or by regularities in the environment of the language's origin.

So yes, language is both learned, and cultural, even if it isn't purely so.


For me, this perfectly sums up the futility of the Strong AI / AGI research project. So much of what we consider to be 'intelligence', or the repercussions of intelligence, is in fact language-based communication and culture, with an immense genetic legacy, bias, and, quite frankly, burden. To create human-equivalent intelligence, most or indeed all of these evolutionary biases would have to be built in.

Consider a different example: constructing artificial vision. Human vision is the result of evolution, of course. It is incredibly inefficient, but evolution is blind (sorry). When we now construct computer vision, we capture an optical image and transmit it optically as long as possible, since this preserves information. The human eye does not: it sends information via neurons to the visual cortex, compromising immensely in bandwidth. That's why we need visual error-correction mechanisms in the brain, and redundancy of visual information (achieved by the eye moving rapidly many times per second, for example).

When we construct optical computer vision that achieves something similar to human vision, the two have nothing in common. You can't plug the artificial front end into the biological back end. The two systems produce literally different images that are not comparable. The systems will not communicate with each other. We have skipped the legacy of biological evolution completely. To create an artificial system that accurately corresponds to the biological one would be an immense waste.

The same goes for intelligence. Our 'wet' evolved intelligence is an entirely different picture from the project of Strong AI. They will produce very different manifestations of interacting with the world. Why would the latter ever result in the illusion of free will or the self-referentiality of personal identity, two things we assume are parts of human-level intelligence? They are like the neurological channels for conveying an optical image: hopelessly inefficient, but the ones that make sense in the light of our evolutionary legacy.

As a side note, yes, we know of a time when there was no language, but it is not a good representation to consider that a binary switch. We know of complex communication between other animals, and given that even current languages are in rapid flux, I think it is fair to think of the transition from pre-language to language as a continuum. Language is still arising.


I don't understand what bearing this should have on the probability of AGI succeeding. What we (should) care about is not whether the system is conscious or even human-like. What matters is that it performs well on the tasks it is assigned. I consider it highly likely that we can do better than evolution did, in less time than evolution took.


You should read Steven Pinker's The Language Instinct. He summarises Chomsky's and the field's main insights into language brilliantly. One of the points is that while we see languages as extremely diverse, most of the differences are actually quite superficial and can be efficiently modelled by a small set of rules with a few key configurable settings, and that this basic structure for language is universal among all humans. We are just tuned to focus on the differences and not notice the overwhelming similarities. Great book any way you cut it.
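The "small set of rules with a few key configurable settings" idea can be sketched as a toy: below, a single universal phrase-building rule plus one configurable parameter (head direction, a standard example from the principles-and-parameters literature) yields superficially different surface word orders. This is purely illustrative, assuming the usual English-like vs. Japanese-like contrast; it is not from the book.

```python
def phrase(head, complement, head_direction):
    """Toy universal rule: a phrase is a head plus a complement.
    The single parameter 'head_direction' decides which comes first."""
    if head_direction == "initial":      # English-like: verb before object
        return f"{head} {complement}"
    return f"{complement} {head}"        # Japanese-like: object before verb

print(phrase("eat", "sushi", "initial"))  # -> "eat sushi"  (VO order)
print(phrase("eat", "sushi", "final"))    # -> "sushi eat"  (OV order)
```

One rule, one setting flipped, and the surface output looks like a "different" language even though the underlying structure is identical.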


What's really fascinating is that wild chimps use a rich form of communication that's far from an actual language. But they can be trained to use complex sign language, which they will then use to communicate with each other. So language may have developed fairly rapidly relative to the underlying biology.


No, they can't be trained. Those studies were largely failures and the idea that they can be trained is little more than a myth.


If I understand your point correctly, Chomsky does explain this, even in the article. Like with his discussion of "linear order".


I'm surprised he brought that up; it's a weird point. Because a different part of the brain processes the constructed language, it's no longer language? It's also untrue: we construct language patterns, poetry for example, where the positional number of a word or a syllable changes its meaning.

That said, his point about the brain being wired to take the less computationally intensive route is a very important insight, which I think extends beyond genetics and throughout the evolution of all biological processes.


Actually, his point is that our language organ takes the more computationally expensive route. Not the easier one. That's the puzzle.

I don't have his books on me, so I may misconvey this point, but IIRC, he also mentioned regular languages (you know, like with regexes) as another example of a computationally "easier" language family we don't pick up with our language organ. We don't speak arbitrary languages. The space of languages is filtered by genetics.

He delves into this point more deeply in many places and addresses the points you raise with a precision beyond what's found in interviews.


I suppose it depends on your definition of 'expensive'. By that I mean, if your computational model is inherently serialised, sure, regexes are cheap. If fuzzy matching and rough temporal correlation between processing units turn out to be cheap, perhaps regexes are a ridiculous extravagance on your hardware family.

I suppose you could turn it around: assuming our language processing is optimal (a bit of a leap), you can infer things about our hardware architecture from the languages we parse efficiently.




