Saturday, January 6, 2018

8a. Pinker, S. & Bloom, P. (1990). Natural language and natural selection

Pinker, S. & Bloom, P. (1990). Natural language and natural selection. Behavioral and Brain Sciences 13(4): 707-784.

Many people have argued that the evolution of the human language faculty cannot be explained by Darwinian natural selection. Chomsky and Gould have suggested that language may have evolved as the by‐product of selection for other abilities or as a consequence of as‐yet unknown laws of growth and form. Others have argued that a biological specialization for grammar is incompatible with every tenet of Darwinian theory ‐‐ that it shows no genetic variation, could not exist in any intermediate forms, confers no selective advantage, and would require more evolutionary time and genomic space than is available. We examine these arguments and show that they depend on inaccurate assumptions about biology or language or both. Evolutionary theory offers clear criteria for when a trait should be attributed to natural selection: complex design for some function, and the absence of alternative processes capable of explaining such complexity. Human language meets this criterion: grammar is a complex mechanism tailored to the transmission of propositional structures through a serial interface. Autonomous and arbitrary grammatical phenomena have been offered as counterexamples to the position that language is an adaptation, but this reasoning is unsound: communication protocols depend on arbitrary conventions that are adaptive as long as they are shared. Consequently, language acquisition in the child should systematically differ from language evolution in the species and attempts to analogize them are misleading. Reviewing other arguments and data, we conclude that there is every reason to believe that a specialization for grammar evolved by a conventional neo‐Darwinian process.

Tomasello, M., & Call, J. (2018). Thirty years of great ape gestures. Animal Cognition, 1-9.

Graham, K. E., Hobaiter, C., Ounsley, J., Furuichi, T. & Byrne, R. W. (2018). Bonobo and chimpanzee gestures overlap extensively in meaning. PLoS Biology.





74 comments:

  1. With the appearance of language in the human species, there’s a sort of giant leap in functionality for what humans can do. Because it’s not just a system of communicating or of making propositional statements, but a system for making any *possible* propositional statement; including the set of all future statements that might be made and have never yet been made. It seems as though language has a feature of universality in this sense (comparable to the universality of Turing machines maybe?). I’m skeptical that this jump was achieved for the specific adaptive purpose of “making propositional statements”. And besides, what would the “tiny selective advantages” throughout the evolution of language be/look like?

    ReplyDelete
    Replies
    1. I think you’re right by pointing out the similarities in universality (if by universality you mean anything that can be said can be expressed with any language). Turing machines use computation (were you referring to universality in the strong C-T thesis?), which is a subset of language (although computation is strictly shape based whereas there are analog/physical/sensorimotor processes associated with language, hence “subset”). I don’t think the jump was achieved for the purpose of making propositional statements for its own sake. I believe the adaptive purpose of making propositional statements is described in the Latent Structure of Dictionaries reading (and the Natural Language and Natural Selection one too), in which language helps us not have to learn things the hard way (ex. eating the poison toadstool) by making it possible to learn new categories by word of mouth (ex. don’t eat that, it’ll make you sick!). In terms of selective advantages, I think you can see how it would be advantageous to gain information from multiple sources (and relay that information onwards) without having to brute-force your way into acquiring every piece of knowledge. It would save you a lot of time, and you can go on to combine that knowledge to gain more complex knowledge. You CAN learn some things for yourself, but the advantage of language is that you don’t HAVE to. Although I think it was suggested that there’s an interplay between sensorimotor learning/limitations of language (sensorimotor induction vs. instruction from the Latent Structure paper).
      From the paper: “Children can learn from a parent that a food is poisonous or a particular animal is dangerous; they do not have to observe or experience this by themselves.”
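      To make that induction-versus-instruction contrast concrete, here is a minimal Python sketch (purely illustrative -- the category names and features are invented, not taken from the paper). It shows a new category being acquired entirely by "hearsay": a proposition that combines the names of categories already grounded by direct experience, so no trial-and-error (and no poisoning) is needed.

      ```python
      # Toy sketch: "instruction" = acquiring a new category from a proposition
      # that combines the names of already-grounded categories, with no direct
      # (and possibly costly) experience of the new category itself.

      # Categories grounded the hard way, by direct sensorimotor induction:
      grounded = {
          "toadstool":   {"has_gills": True, "has_stem": True},
          "sick_making": {"edible": False},
      }

      # A proposition heard from someone else (names invented for illustration):
      # "A death_cap is a toadstool that is sick_making."
      proposition = ("death_cap", "toadstool", "sick_making")

      def learn_by_instruction(prop, known):
          """Add a new category defined entirely in terms of known ones."""
          new_name, kind, property_name = prop
          if kind in known and property_name in known:
              # Inherit the features of the named categories: no sampling required.
              known[new_name] = {**known[kind], **known[property_name]}
              return True
          return False  # ungrounded vocabulary: the hearer can't use the proposition

      if learn_by_instruction(proposition, grounded):
          print(grounded["death_cap"])
          # {'has_gills': True, 'has_stem': True, 'edible': False}
      ```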

      Delete
    2. I think you bring up a really good point here: a lot of the time, when explaining the evolution of language, theories seem to jump rather quickly from incredibly simplistic communication to the sophisticated language that we use today. Although it is an unsatisfying answer, I think a large part of this has to do with evolutionary theories and their limited ability to explain certain phenomena. Since we will never be able to access the minds and experiences that those individuals had, it will be incredibly difficult to come up with a more sophisticated theory of how language evolved. In my opinion, the most promising way to arrive at an evolutionary theory would be to first understand how modern language functions and then work backward from there.

      Delete
    3. Yes, both evolutionary and historic explanations have the handicap of not being able to manipulate, control, replicate, or do direct observations. Sometimes you can do internal reliability checks by splitting the data; and sometimes computer simulation can help test alternative hypotheses (if you encode enough of the relevant context).

      But post-hoc retrodiction is always vulnerable to the possibility that it is just an ad-hoc "Just-So Story" (as we saw last week).

      (Good integration, Christina!)

      Delete
  2. I find it very interesting how the authors in this paper talk about the differences underlying the evolution of language and the acquisition of language. The most interesting aspect of language evolution for me is its underdetermined nature. There are logically so many ways for a language to potentially be structured, but over time the creation of languages has narrowed down this underdetermined phenomenon. Acquisition of language does not benefit from the underdetermined process by which a language originally evolved: a child receives input and eventually produces output. Despite being in the process of learning a language, a child doesn’t produce some logically possible errors when they begin to speak. The speech that children hear contains few if any negative instances and is impoverished in nature. Chomsky called this the poverty of the stimulus, and in essence it provides some evidence that UG must be important in language learning. At the same time, though, children make errors in their speech that they will never hear in actual communication, such as overgeneralizing verb tenses. I think that this is a very interesting dichotomy: on the one hand a child is great at not producing certain feasible errors, and on the other hand perseverates in overgeneralizations. Why do children make errors for some grammatical elements and not for others? What makes some grammatical elements more immune to mistakes than others? Could it be because these elements are parameters in UG? At some point, though, the parameter would not be set yet, and wouldn’t we then expect such errors to occur?

    ReplyDelete
    Replies
    1. As far as perseveration in overgeneralization goes, I know that children initially overgeneralize due to not receiving enough exposure to irregular forms (ex: 'foots' instead of 'feet'). However, once they receive enough experience with the irregular word form, they stop overgeneralizing. I think the reason for some grammatical mistakes is just the complexity of the language and the time it takes to learn those complexities (such as verb tenses, different allophonic sounds, etc.). Rules that can be applied without fail are rules that will be less likely to be violated, and I believe that is where the concept of universal grammar comes in.
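      A toy sketch of that trajectory (illustrative only -- not a model of real acquisition, and the "rule plus stored exceptions" setup is just one simple way to picture it): the learner applies the general rule until enough exposure stores the irregular form, and the stored exception then blocks the rule.

      ```python
      # Toy sketch of overgeneralization: the general rule applies unless a
      # stored exception (picked up from exposure) blocks it.

      def general_rule(noun):
          return noun + "s"          # the regular English plural

      exceptions = {}                # irregular forms learned from exposure

      def pluralize(noun):
          return exceptions.get(noun, general_rule(noun))

      print(pluralize("foot"))       # -> "foots"  (overgeneralization)
      exceptions["foot"] = "feet"    # sufficient exposure to the irregular form
      print(pluralize("foot"))       # -> "feet"   (the exception now blocks the rule)
      print(pluralize("cat"))        # -> "cats"   (the rule still applies elsewhere)
      ```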

      Delete
    2. You bring up an interesting point with regards to child language acquisition that I'm not sure I entirely agree with. The majority of the mistakes that children make in early language production tend to be quite systematic in nature and consistent across the language, such as epenthesis (inserting another phonological sound to make the word easier to pronounce, ex: pronouncing "blue" as "buh-lue") or fronting (when a consonant sound usually produced at the back of the mouth like /k/ or /g/ is produced at the front of the mouth instead as /t/ or /d/, ex: "cat" pronounced as "tat"). While children do make mistakes that are grammatical, rather than phonological, in nature, these kinds of mistakes are usually resolved quickly, as they are usually seen in the one-word or two-word stage of language development. Once children are producing full sentences, they tend not to make as many grammatical errors.

      Delete
    3. OG and UG

      This week is about language evolution: we'll do Chomsky and Universal Grammar next week.

      But, since we have no secrets: It's important to remember the distinction between Ordinary Grammar (OG) and Universal Grammar (UG).

      OG is just rules that have been adopted across time, by imitation and convention; OG is learnable (by induction [supervised and unsupervised learning] or instruction), children make plenty of OG errors, and get corrected; and OG changes with time.

      UG is rules that all languages follow; UG does not change. UG is unlearnable (by the child), because the child does not get negative (UG-violating) examples (UG errors don't occur, so don't get corrected; this is the "poverty of the stimulus"). Therefore the child must "know" UG already, innately.
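      Here is a minimal sketch of the logic of that poverty-of-the-stimulus point (the sentences and hypothesis sets are invented for illustration; they are not real UG rules): with positive evidence only, an overgeneral hypothesis is never contradicted, because the forms it wrongly allows simply never come up to be corrected.

      ```python
      # Sketch of the poverty-of-the-stimulus argument: positive-only input
      # cannot distinguish the correct, restrictive hypothesis from an
      # overgeneral one, because the forbidden forms never appear as errors
      # that could be corrected.

      positive_input = [
          "who did he think went out",
          "who did he think left early",
      ]

      restrictive = set(positive_input)          # blocks the "that"-trace form
      overgeneral = set(positive_input) | {
          "who did he think that went out",      # the UG-violating form
      }

      def consistent(hypothesis, data):
          """A hypothesis fits the data if it generates every observed sentence."""
          return all(sentence in hypothesis for sentence in data)

      print(consistent(restrictive, positive_input))   # True
      print(consistent(overgeneral, positive_input))   # True
      # Nothing the child hears decides between the two -- yet every child
      # ends up with the restrictive grammar.
      ```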

      Hence, how (and why) UG evolved is a problem for evolutionary theory -- and notice that Pinker & Bloom (who should know better: especially Pinker) make no mention of it in their paper.

      Delete
  3. "Language learning is not programming: parents provide their children with sentences of English, not rules of English."

    This is true of first-language acquisition, which is learned unsupervised through exposure (induction) to the sentences of English (or whatever L1) that surround the child. I think the same can be said for L2 acquisition for a child as long as it is EARLY acquisition - such children become bilingual by learning both languages simultaneously through exposure. But once we get to the topic of second-language acquisition at an older age, it's usually taught by giving the rules in a classroom setting - thus, analogous to supervised learning (instruction). So if we say that something that is learned through induction is not programming, then can rule-based L2 (or L3, and so on) acquisition be considered programming?

    ReplyDelete
    Replies
    1. Although acquisition of a second language is typically learnt through explicit and formal instruction, I don't think it necessarily follows that it's programming. One reason is that no language is strictly syntax. I think the author's use of "programming" here was a bit careless, because we know the formal definition of programming for computation, which is strictly syntactical. In my opinion, acquisition of another language that isn't your native one(s) -- implying that this doesn't necessarily apply to bilingual children -- is always through various degrees of explicit and implicit learning. We know that the easiest and fastest way to learn a second language is via immersion. I believe this to be true because of symbol grounding.

      Delete
    2. I meant that I'm not referring to bilingual children. It's clear that children learn with explicit/formal instruction and mere exposure! Sorry for the confusion.

      Delete
    3. Teaching rules in a classroom setting may be how late L2 is taught, but applying a set of rules to translate from your L1 to another language is not what it means to speak a second language. It may be enough to pass the tests in a classroom setting, but someone who is just applying rules would not really be understanding the language. L2 acquisition still involves discovering the right parameters in the grammar of the target language, so if we're not considering L1 acquisition programming, we probably shouldn't consider L2 acquisition programming either.

      Delete
    4. Devona, language (apart from UG) can always be learned either by induction (unsupervised plus supervised) or by instruction (as Amber notes). It just gets harder (and takes much longer) to do it by induction for later languages, especially after the "critical period."

      Oscar, I agree that as long as we are speaking a later language by applying explicit formal rules for transforming L1 <--> L2, one is hardly really speaking L2. It's more like factoring quadratic equations by formal rule. But even if you learn it through formal rules, if and when you actually start using it for real, the rules become automated much the way they do with implicit induction. (This transition from "controlled" to "automatic" processing occurs with many kinds of expertise whether learned explicitly or implicitly.)

      Delete
  4. Although Pinker and Bloom make a good argument about language being a product of natural selection, I am left unsatisfied with their response to how Universal Grammar could have evolved.

    Also, contrary to what Lieberman implies, there does exist variation in grammatical ability. Within the range that we would call "normal" we all know some individuals who habitually use tangled syntax and others who speak with elegance, some who are linguistically creative and others who lean on clichés, some who are fastidious conformists and others who bend and stretch the language in various ways. At least some of this variation is probably related to the strength or accessibility of different grammatical subsystems, and at least some, we suspect, is genetic, the kind of thing that would be shared by identical twins reared apart.

    Even if there are individuals who talk in ways that are not the most conventional (i.e. with "tangled syntax" and with "elegance"), we still can judge these utterances as well-formed. This judgment of whether sentences are well-formed, based on syntactic rules that are governed by UG, is not affected by these nuances in how people produce these sentences. The problem that UG poses to the evolution of language is that it is not learned, due to the lack of negative evidence. Pinker and Bloom don't address this, and instead focus their attention on the variation in the production of language and use that as proof of different grammatical subsystems.

    I don't believe that the variation is a result of different grammatical subsystems. Instead, I think that the input each speaker receives is different, and this subsequently shapes their production. However, the sentences they produce are still comprehensible to others because their sentence structures are still governed by the principles of UG.

    ReplyDelete
    Replies
    1. You're right, Celine: Pinker, who got most of his fame from being an "oracle" for psychologists trying to understand Chomsky, reveals here (and elsewhere) that his grasp of the essence of Chomskian linguistics, Universal Grammar (UG), may be wobbly.

      Delete
  5. "There are many possible rationales for any form-meaning pairing, and that is exactly the problem -- different rationales can impress different speakers, or the same speakers on different occasions, to different degrees."

    I thought this was an interesting tie-in to the Funes problem we discussed earlier. Even if he was able to form categories, they would only be meaningful to him. When it comes to language it seems just being able to form categories is not enough, you must also be in agreement with those around you about what is a member of that category.

    ReplyDelete
    Replies
    1. You don't need human feedback to learn categories; you just need corrective feedback (e.g. mushrooms); and the "name" of the category need not be a word, it can be an action ("doing the right thing with the right kind of thing"). You do need feedback, however, if you are to use an arbitrary name, and share its usage (and referent) with other users. And this is more Wittgenstein (on the impossibility of private language) than Funes (who can't learn the category in the first place).
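      A minimal sketch of "corrective feedback without a human teacher" (the features and the toxicity rule are invented): the learner's category boundary is shaped by the consequences of its own sampling, not by anyone naming anything.

      ```python
      # Toy trial-and-error category learner: the "supervision" is just the
      # consequence of eating (getting sick or not), not a human teacher.
      import random

      def consequence(mushroom):
          # The world's feedback rule (invented): spotted mushrooms make you sick.
          return "sick" if mushroom["spotted"] else "fine"

      weights = {"spotted": 0.0, "bias": -0.5}   # start out assuming food is safe

      def predict_toxic(mushroom):
          return weights["bias"] + weights["spotted"] * mushroom["spotted"] > 0

      random.seed(0)
      for _ in range(50):
          m = {"spotted": random.random() < 0.5}
          if not predict_toxic(m):               # learner decides to eat it
              if consequence(m) == "sick":       # corrective feedback from the world
                  weights["spotted"] += 1.0      # strengthen the "toxic" cue

      print(predict_toxic({"spotted": True}))    # True  -- learned to avoid
      print(predict_toxic({"spotted": False}))   # False -- still eats the safe kind
      ```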

      Delete
  6. I really like Pinker and Bloom's analogy between an eye and language. Both only make sense as parts put together to fulfill a function, are incredibly complex, and it would be hard to imagine that either could result from some other process (e.g. genetic drift or exaptation). The authors go further to describe how an eye could not evolve through leaps but instead through a gradual process. They counter the criticism of this position, that "what good would 5% of an eye be," by indicating that it would instead evolve as 5% of vision. I think it could still be pointed out that there must be some sort of physiological evolution that gives rise over time to a structure that could support even 5% of vision (5% of an eye capable of doing 5% of vision). But I think the further analogy that could be drawn to language evolution may be the slow evolution of expression, or types of lexicon. Surely this also was a gradual process, words or ways of expression evolving in bits and pieces to eventually form the languages we have today that are passed down between generations. I think this is the authors' point.

    ReplyDelete
    Replies
    1. The analogy originates with Chomsky, not Pinker. The analogy does not quite work for Universal Grammar (UG), however, because the same thing that makes UG unlearnable for the child makes it hard to see how it would have evolved (in the way the eye did).

      Delete
  7. I completely agree that the analogy between an eye and language is an extremely interesting and well-thought-out way of expressing the evolution of language. Going further, and touching on what Marcus discussed with regards to the slow evolution of language, Stevan says that language started out very minimalistic. Stevan also said that language started off as gesturing and pointing, then miming, and then the slow evolution from show to tell. This fits in nicely with Pinker's analogy of how the eye started off as a gradual process as well.

    ReplyDelete
  8. "Do the cognitive mechanisms underlying language show signs of design for some function in the same way that the anatomical structures of the eye show signs of design for the purpose of vision? ... We will suggest that language show signs of design for the communication of propositional structures over a serial channel."

    I think that this approach to explaining language is certainly interesting, but it leads me to ask a rather odd question: if language was designed for a specific function, who or what is responsible for its design? If we assume that Universal Grammar is internal and has, as the authors contend, evolved somehow based on natural selection's criteria, spoken languages would seem to be an extension of our capacity to communicate, which was originally used for something else (e.g. Fodorian "Mentalese"). If language was in fact designed, then something other than natural selection is responsible for its development and we need to look beyond the individual to cultures or societies to determine language's use.

    ReplyDelete
    Replies
    1. Design is neither needed nor implied -- for language itself. Pinker is knocking down open doors here. The only exception is UG (which he ignores, for some reason...).

      Delete
  9. “If "expression" refers to the mere externalization of thoughts, in some kind of monologue or soliloquy, it is an unexplained fact that language contains mechanisms that presuppose the existence of a listener”
    It is pretty undeniable that the way we have language today quite clearly shows that it is meant for a listener. For example, if language were something internal, then verb tenses would not play as important of a role as they do -- in our own minds a verb tense does not matter, since if we know what situation we are talking about we don't have to choose the verb with a corresponding time. However, if we are speaking to someone else, the presence of a verb tense greatly enhances their understanding of what we are saying and allows them to better communicate with us. Something that the authors bring up as well is the phonology and pronunciation that words have -- if we were not meant to communicate with others, we would not need to have such similar ways of sounding out our words. These (and other aspects of language) are very convincing when it comes to addressing whether language evolved for another person (and hence communication), and I believe it is quite clear from these things that it has.

    ReplyDelete
    Replies
    1. No question that language evolved for communication, not soliloquy. And we already know that from Wittgenstein's musings about "private language."

      But the kind of communication language evolved for is very different from the communication in other species: acquiring new categories by recombining the names of old categories into subject/predicate propositions with truth values.

      This enables you to go on and state any fact -- or any fiction (which thereby also empowers not only creative writers but also mendacious Trumps).

      Delete
  10. "arguments that language is designed for communication of propositional structures are far from
    logical truths"

    From this premise Pinker and Bloom brought up the concept of "mentalese," stating that there is no grammar in our internal thoughts, and no physical language or language-expression constructs such as syntax and phonetics in our thoughts either. However, it's these thoughts that underlie our use of language, to communicate the ideas we conceive. So an interesting question arises: through what mechanism does this conversion occur? While mechanisms like universal grammar seem to exist (and may even have developed through evolutionary adaptation) to help structure our speech, the step that comes before, and the origins from which it emerges, are still unclear.

    ReplyDelete
    Replies
    1. "Mentalese" (if it ever existed before language) could not have been symbolic or propositional as long as it was just going on in individuals' minds. Categories could have been learned, but their "names" would be actions, not words, which are arbitrary symbols, and would need interpersonal feedback to ground them.

      Delete
  11. Re: Section 5.3.4. Social use of language and evolutionary acceleration

    This section of Pinker and Bloom’s paper discusses the natural progression of language acquisition due to evolutionary processes and forces. Early on in the chronological history of humankind, cooperation was crucial in subsisting and in encouraging communication, safety, nurturance, hunting and gathering for food, and reproduction. Specifically, in a hunter-gatherer organization of society, linguistic expression in its very basic form facilitated connection and understanding between individuals of that society rather than forcing an individualistic mode of operating. This allowed for mutual understanding between individuals who occupied different roles within a society, such as hunters and gatherers, who could use communication to facilitate sharing of goods for mutual benefit. This is a form of a social contract: “if you take a benefit then you must pay a cost.” Communication is vital in this situation because it is imperative that both parties understand the agreement of which they are a part, right down to the nitty-gritty semantics (as described by Pinker and Bloom). These kinds of interactions continue to push for the progression of human language.

    ReplyDelete
    Replies
    1. Communication already existed, and was important, in our species and others, before language. Vocabulary and (ordinary) grammar can be created, learned and shared by ordinary exposure, imitation and learning (unsupervised and supervised). This does not explain the origin of language. And UG is another matter (and a hard one).

      Delete
    2. I agree with P&B that language was/is very important to cooperation and increases chances of survival. In our hunter gatherer past, cooperation was key to group and individual life. "Human language is more more powerful than one can account for in terms of selective fitness". It opens up an almost infinite number of possibilities. The idea that very small selective advantages are sufficient for evolutionary change to take place, supports the claim that human language is an evolutionary adaptation, for the advantages that language brings are undeniable. Yet, I also think we, as humans, could survive, without language. Language is a an extremely powerful tool and aid, but it is not crucial to the survival of the species.

      If, according to the authors, thought is not language, then these non-verbal humans could still potentially be able to communicate with each other with signs and other modes of communication. However, all the mythical, fictional glue that holds our societies together would be nonexistent. We would arguably not be able to live in groups bigger than 100-150 individuals. Therefore, I think it's clear, as P&B show, that sociality has had major implications for the evolution of language and vice versa.

      However, none of this explains to us why humans, rather than some other species, had the need for complex interactions. What made us "special"?
      Do apes not require grammar? Are they able to convey everything they need to about their environment with gestures and some sounds?

      While language clearly offers evolutionary advantages, the authors do not explain how or why grammar developed as such.
      We are left with the question of how the various social uses of language offer any real explanation for the evolution of GRAMMAR.




      Delete
    3. Marianne, you pose several interesting questions. After reading the article, I too felt Pinker and Bloom failed to address why humans, and no other social creatures (namely apes), developed language. The article attributes the development of language in humans to the demands of the complex collaborative and cooperative lifestyle they fell into. As the complexity of the dynamics of social cooperation among humans heightened, a corresponding form of linguistic expression to sustain these complex interactions became necessary. For instance, in social contracts it began to “make[s] a difference whether you understand me as saying that if you give me some of your fruit I will share meat that I will get, or that you should give me some fruit because I shared meat that I got, or that if you don't give me some fruit I will take back the meat that I got.” And hence, “they could no more live with a Me-Tarzan-you-Jane level of grammar than we could.”
      With regard to your second question, “How do the various social uses of language offer any real explanation for the evolution of GRAMMAR?”, I believe that Pinker and Bloom’s ‘fruit and meat’ example provides a preliminary account for the evolution of grammar, as the syntax and grammar here dictate subtle differences in meaning that became important – having grammar became socially advantageous as it enabled them to communicate these differential messages (‘Me-Tarzan-you-Jane’ or similarly ‘You-fruit-me-meat’ was no longer sufficient). However, this account does not provide an explanation as to why grammar evolved in instances where grammar does not affect meaning. Turning to Prof. Harnad’s example of Universal Grammar (UG), ‘who did he think went out’ and ‘who did he think that went out’, the former is deemed grammatically correct, and the latter not (and there is no explicit rule for this – it is governed by UG). Moreover, both sentences convey the same message to the receiver and thus there is no socially advantageous function to saying the first rather than the second. If Pinker and Bloom are arguing here that grammar evolved because ‘Me-Tarzan-you-Jane’ grammar was too limited to sustain such complex human interactions, then why did a grammar evolve such that ‘who did he think went out’ and ‘who did he think that went out’ are not both acceptable?

      Delete
    4. Without language a primate species could survive, but that species would not be humans (any more than a flying reptile without feathers would be a bird). And of course then none of these matters (or any matters) would be under discussion...

      What's unexplained is not the evolution of grammar, but of universal grammar (UG).

      But, yes, even without the UG question the account of the origin and evolution of language is still sketchy.

      Delete
  12. It is interesting that language has the power to further group cohesion while simultaneously, when in the wrong hands, it can be used as a powerful tool for deceit and maliciousness. Perhaps this is an interesting perspective from which to approach the question of how language developed. Characteristics that are shaped by evolutionary forces are exclusively advantageous, and since language can facilitate both good and evil, maybe there is another shaping force at work. This is counter to Pinker and Bloom, whose perspective I agree with. Further, I understand that this is probably rather far-fetched evidence for discounting evolutionary forces in language development, but it is nonetheless interesting. There is no doubt that the development of complex cognition as a whole was the result of evolution, with the frequency of interactions between humans becoming much more prevalent. But whether evolution was behind language specifically could be debated, as it has been.

    ReplyDelete
    Replies
    1. Before you can use language to lie and deceive, you first need to evolve (or twig on) the power of T/F propositions.

      It is obvious that language was a hybrid venture. Some Darwinian evolution, some Baldwinian evolution, and a lot of learning and convention ("cultural" evolution) once it began.

      Delete
    2. I also find this idea of the evil use of language interesting, because as I was reading the article I was thinking about Trump and his poor use of language. If you listen to one of his speeches, his syntax is very "tangled" and the composition is far from "elegant", but we are still able to discern the meaning. Further, one could argue that his ridiculous rhetoric and way of speaking is part of what led him to gain so much power. This goes to the point of being able to judge ridiculous sentence structure: somewhere underneath it all, it is still well formed by the laws of UG -- which, as a lot of people have discussed, is a flaw in the article for being ignored.

      However, to your point about natural selection, neither good nor evil is inherently beneficial for survival. Oftentimes the "evil" are better at surviving due to their willingness to do whatever it takes for their own survival. So the use of language for deceitful and malicious endeavors, though unfortunate, has little effect on its evolutionary advantage.

      Delete
    3. Evolution's bottom line is that what is adaptive, stays, and what is not, goes. As long as the tendency to lie and cheat works, it stays. In the past, cheater strategies have proved "evolutionarily unstable." But that's a question of time. Evolution has lots of time; short-term damage is just a blip. But these days a blip can destroy the entire biosphere...

      (As to Trump's inarticulateness -- it's not just his grammar that's gappy, it's his "thinking" (cognition) and ethics. We catch his drift and fill in the gaps, but the real head-shaker is not that we figure out what he's saying (not much, and all pretty dodgy), but that so many of us sustain him despite that: both the opportunists in the GOP and the deplorables in the electorate.)

      Delete
  13. "Linguistic diversity would seem to imply that grammatical
    devices are very general-purpose tools. And a general-purpose tool would surely have a very generalized structure, and thus could be a spandrel rather than an adapted machine" As I was reading this, I couldn't help to think why multiple languages existed at all and why we didn't have one universal language with one universal structure. I was happy to find that the end of this section addressed my question as one of the reasonings included that " there are multiple languages, leading to the evolution of a mechanism to learn the differences among
    them" or another explanation was that "there is a learning mechanism, leading to the development of multiple languages". I found both of these interesting as clearly the environment impact grammar rules and the design of language which is why it there is such a diversity.

    ReplyDelete
    Replies
    1. Multiple languages are probably partly cultural variation, as in other skills and practices, but perhaps also (like birdsong dialects) ways to mark territory, kinship and tribal interests. But here we are more in the area of cultural evolution rather than genetic evolution.

      Delete
  14. “Nevertheless such exaptations are still gradual and are still driven by selection; there must be an intermediate evolutionary stage at which the part can subserve both functions (Mayr, 1982), after which the process of natural selection shapes it specifically for its current function. Indeed the very concept of exaptation is essentially similar to what Darwin called "preadaptation".”
    Looking ahead to reading 8B, they discuss cognitive components that were present before language, such as “sensorimotor induction; the capacity to learn by observation and imitation; the capacity for pointing, shared attention, and mind-reading”. Could it be the case that these components were in fact not intended to be the precursors to language, but were the intermediate evolutionary stage that natural selection acted on to adapt them to their current function, language? This relates to exaptation, “whereby new uses are made of parts that were originally adapted to some other function”. If these cognitive components were not originally intended to develop into language but instead evolved into it by natural selection or some other mechanism, this would be consistent with Darwin’s preadaptation and the process of exaptation.

    ReplyDelete
  15. In section 4.1 of Pinker’s article, the link between computationalism and language caught my attention. I think this is an interesting point in the discussion of the evolution of language. I agree that “Language learning is not programming: parents provide their children with sentences of English, not rules of English.” I would even go further and say that children are not even always provided grammatically correct sentences whilst they are learning a language, yet they all come to learn a language through childhood. Many theories have tried to explain such a phenomenon. The suggestion of UG is somewhat satisfying in this case, because it gives a simple answer to how we could learn from imperfect models. In addition, many examples of factors that play a role in language were mentioned in the beginning of the paper (vocal tracts, breathing rate, etc.) and give an idea of the complexity of the language mechanism. Thus, language learning could not be a simple computationalist process, which brings us to the idea of the mind as a multipurpose learning system, integrating several dimensions at a time.

    ReplyDelete
    Replies
    1. I think you make a very strong point against computationalism in this response. I had never considered how universal grammar was incompatible with computationalism. We tend to think of language as something very structured, formulaic, and consistent with the architecture of programming. This mindset is certainly deepened by experiences studying a language (a second language or one’s own mother tongue) or studying linguistics because both emphasize rules, rigidity, and formality. Thus, when Pinker says that “language learning is not programming: parents provide their children with sentences of English, not rules of English” I was initially surprised. But, considering Universal Grammar, I came to completely agree with Pinker (and you). The only point you make that I disagree with is when you say that "children are not even provided grammatically correct sentences whilst they are learning a language, yet they all come to learn a language through childhood.” I think this is incorrect because grammatical mistakes often persist into adulthood. For example, the misuse of double negatives in order to assert a negative statement is very common (for example: 'I don’t have none'). Children learn this grammatical mistake from adults who have made it part of their ordinary grammar. However, I admit that my point is weak because what is incorrect English grammar can sometimes be said to be valid rules of the ordinary grammar of English dialects.

      Delete
    2. Remember not to conflate OG and UG. (Many features of OG can be learned by unsupervised learning. Some OG probably needs some supervised learning. And we know there is also instruction of OG. But none of this is true of UG.)

      Pinker is a computationalist. But what makes language not just computational is the fact that unlike in mathematics, syntax is not independent of meaning. ("Colorless green ideas sleep furiously.")
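      To illustrate what a purely formal (shape-based) syntax check does and does not capture, here is a toy recognizer for a made-up fragment of English (not real OG, and certainly not UG): it accepts Chomsky's famous sentence because the shapes are right, even though the result is meaningless -- exactly the sense in which syntax treated as pure computation leaves meaning out.

      ```python
      # Toy, purely formal grammar check over an invented fragment: it accepts
      # "colorless green ideas sleep furiously" on shape alone.

      LEXICON = {
          "colorless": "ADJ", "green": "ADJ", "ideas": "N",
          "sleep": "V", "furiously": "ADV", "dogs": "N", "bark": "V",
      }

      def well_formed(sentence):
          """Accept the single toy pattern: ADJ* N V (ADV)."""
          tags = [LEXICON.get(w) for w in sentence.split()]
          if None in tags:
              return False
          i = 0
          while i < len(tags) and tags[i] == "ADJ":
              i += 1
          if i + 1 >= len(tags) or tags[i] != "N" or tags[i + 1] != "V":
              return False
          rest = tags[i + 2:]
          return rest == [] or rest == ["ADV"]

      print(well_formed("colorless green ideas sleep furiously"))  # True
      print(well_formed("dogs bark"))                              # True
      print(well_formed("ideas green sleep"))                      # False
      ```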

      Yes, there are plenty of OG errors, early and late, but OG is robust (and redundant) enough so we understand anyway. And flexible: persistent errors are eventually considered correct -- that's how OG (and vocabulary, and even ideas and other memes) "evolve" culturally. It's mostly just arbitrary shared conventions, like red light, green light.

      If there are UG errors, though, they are random and super-rare. UG is fixed (perhaps genetically).

      (I doubt that Chomsky is a computationalist, though. Same for Turing. They're far too vast for these simplistic pygmy isms...)

      Delete
  16. “Finally, Williams (1966) suggests that convergent evolution, resemblance to man-made artifacts, and direct assessments of engineering efficiency are good sources of evidence for adaptation. Of course in the case of human language these tests are difficult in practice: significant convergent evolution has not occurred, no one has ever invented a system that duplicates its function (except for systems that are obviously parasitic on natural languages such as Esperanto or signed English), and most forms of experimental intervention would be unethical. Nonetheless, some tests are possible in principle, and this is enough to refute reflexive accusations of circularity.”

    While it is true that Esperanto and, to a certain extent, most sign languages are created from already known languages, there are also examples of languages created from scratch. The deaf youth in Nicaragua created their own sign language without input from other languages. Sure, they were forced from a young age to adapt to the spoken language at large, but they didn't know it well -- and especially not well enough grammatically -- to have created a whole new language from its input. And what about what the authors call "natural languages"? Weren't those also invented at one point in time, and does this not create circularity? Languages were created even if the mechanisms allowing for their creation were formed by evolution and even if their refinement is also a result of evolution. At some point a group of people just got together and decided to name a cow "cow" and a pig "pig", not really because that was the best way to live but mostly because they needed to sell such animals, and trying to communicate by pointing at them constantly would have been a pain. Maybe I am misinterpreting the authors' argument here, but it seems like this does not disqualify the argument for language design as a just-so story, as they call it.

    ReplyDelete
    Replies
    1. All of OG and vocabulary are just invented and shared cultural conventions. UG is not.

      All of evolution and evolutionary explanation is circular ("survival of the survival traits," and susceptible to Just-So story-telling) but environments and genes are not circular, and there is a truth of the matter about what evolved, how and why; it's just that we don't always get the causality sorted out.

      "Language" is not just OG and vocabulary, which change and are human-made. There's UG too. In conflating it all, Pinker & Bloom trivialize a much deeper and more nuanced picture.

      Delete
  17. I understand that OG refers to the grammatical rules specific to each language that we can explicitly pin-point and are learnt (by induction). UG, on the other hand, refers to the innate rules that underlie all languages. OG is not fixed (i.e. an ungrammatical sentence according to OG may later be accepted as grammatical in society 100 years from now), however UG is unchangeable and constant. However, as I think about different sentences (grammatical and not), aside from Prof. Harnad’s ‘who did he think went out’ example, I am struggling to decipher which rules are constituents of OG versus UG (i.e. which rules I have learned – whether it be through supervised or unsupervised learning – and which are innately programmed)…

    ReplyDelete
    Replies
    1. If you really want to know the rules of UG, you need to study Chomskian syntax, just as you need to study algebraic topology if you want to know it. Neither can be done by introspection alone. We know language and OG, implicitly and explicitly, but we do not know UG explicitly.

      (Much of OG can be learned through unsupervised learning, by the way, but that is not to be confused with UG, which is not really "learned" at all.)

      Delete
  18. Maybe we are having trouble understanding how language came to be because we have a definition of semantics that is based on human understanding. What I mean is that maybe other species have a form of language that we do not have the means to grasp. We hold a human-centric bias when studying other species' behavior, and we tend to compare it to our own experiences, as this is how we are able to categorize and create meaning.

    We tend to associate language with intelligence, so we tend to believe that animals are not as intelligent as humans because they aren't able to speak like us. But maybe animals have been able to evolve without language because they didn't need that capacity for them to be so great.

    ReplyDelete
    Replies
    1. Other species can communicate, and have communication codes, both iconic (analog) and arbitrary. But they do not have language, in the sense of a "universal" code that allows the formulation of propositions that can describe anything there is (rather the way computation can simulate just about everything there is).

      Language is a special capacity. Great apes, elephants and whales seem to have the requisite intelligence (and they certainly communicate), but they don't have language (even when we try to teach it to them). Why?

      Delete
  19. "For universal grammar to have evolved by Darwinian natural selection, it is not enough that it be useful in some general sense. There must have been genetic variation among individuals in their grammatical competence. There must have been a series of steps leading from no language at all to language as we now find it, each step small enough to have been produced by a random mutation or recombination, and each intermediate grammar useful to its possessor."

    I think this is an interesting point. But while the argument is definitely true for evolutionary biology as a whole, I'm inclined to wonder what exactly this partial UG would have been useful for (re: genetic selection). Additionally, could we really lump early versions of universal grammar into the same category as the full-blown thing? And furthermore, is there a way to determine at what point UG was fully established over the course of language's evolution?

    ReplyDelete
    Replies
    1. Yep, those are the questions to raise. P & B skirt them...

      Delete
    2. This comment has been removed by the author.

      Delete
    3. P & B do address how intermediate steps of language could have been useful, notably when they mention the Baldwin Effect and the reproductive advantages of better grammars. P & B respond to Geschwind’s question of how could gradual grammatical developments towards UG (ie. linguistic mutations) could have been evolutionarily advantageous to the possessor of a mutation if their compatriots did not share the same mutation?

      The ability to produce speech that follows a particular grammatical structure is not the same as being able to comprehend it; we understand some statements even if they are ungrammatical (think “colourless green ideas…”) -- even for cases of non-UG -- because we can infer meaning (the example P & B give about “Shakespeare’s complex early Modern English” is probably inappropriate because that is comparing OGs when UG should be what’s on the table). Likewise, linguistic mutations were still compatible to a degree with other ‘versions’ of the same system. Surely the Darwinian genetic selection process was the natural sieve for the more effective mutations made possible by the Baldwinian predisposition to understand each other’s communication. Recalling how lazy evolution is, can we assume that the linguistic mutations that have brought us to UG were only of production ability, and that the rest (i.e. comprehension) is left to our cognitive capacity to handle?

      Though we can’t say for sure, it seems that in the evolutionary process, proto-UG rules may not have been sophisticated enough to be the propositional-statement language system we use today, but they certainly could have been applied to give structure to systems of communication that predate modern language, like pantomime, gesturing, etc. P & B describe how even a tiny change in selective advantage can be sufficient to outcompete another group in a relatively small number of generations. Likewise, partial UG could have been advantageous by means of increasingly effective communication (sharing of information).
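      A back-of-the-envelope check of that "tiny selective advantage" claim (the numbers are invented, and this is textbook one-locus replicator dynamics, not anything taken from P & B): a variant with a 1% fitness edge, starting at 1% of the population, reaches near-fixation in under a thousand generations.

      ```python
      # Standard replicator dynamics with invented parameters: how fast does a
      # small fitness advantage spread through a population?

      p = 0.01            # initial frequency of the advantaged variant
      s = 0.01            # its relative fitness advantage (1%)

      generations = 0
      while p < 0.99:
          mean_fitness = p * (1 + s) + (1 - p) * 1.0
          p = p * (1 + s) / mean_fitness   # frequency after one round of selection
          generations += 1

      print(generations)  # ~925 with these numbers -- an evolutionary eyeblink
      ```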

      Personally I don’t think we could lump early UG in with today’s UG for similar reasons as to why we differentiate between iconism and symbolism, gesturing and speech. And is there a way to determine when UG was fully established? I guess that we won’t know since the oldest extant records of written language are as far back as we can investigate grammar. But, I’m inclined to believe that homo sapiens developed the use of propositional statements long before deciding to ever enumerate them in clay or stone.

      Delete
  20. What I can’t help but feel is that this article showed limited analysis or understanding of UG, because Pinker and Bloom don’t or can’t sufficiently elaborate on its evolution. Yes, “there does exist variation in grammatical ability”, but does that negate the existence of UG, which is common to all languages? Or, if not that, does it get anywhere near to understanding how/why we have UG if it is not learnt, due to the poverty of the stimulus?

    Regardless of eloquence, sentences are comprehensible depending on their compliance with UG, so I don’t think it’s simply a matter of the accessibility of different grammatical subsystems. This is because the entire premise of UG is that we all have access to it…

    I also think that we are not speaking L2 if we are using rules from L1 to transform and translate: I did that with French to pass tests in high school, but I can’t effectively communicate verbally, just as many other students taking basic second-language courses at school can’t. This is because we weren’t actually understanding -- kind of reminiscent of Searle’s CRA, in the sense that we were manipulating symbols (sure, more familiar ones if the languages share the same symbols) but still without coherent comprehension of them, since we weren’t grasping the second language’s grammar but rather trying to figure out how we could use our first language’s grammar to string together words (symbols) in the second language.

    ReplyDelete
  21. "Darwin showed how such "organs of extreme perfection and complication" could arise from the purely physical process of natural selection."

    I'll make a case for Chomsky's perspective here based on the above quote from Darwin. Here, Darwin is reflecting on what to many appeared to be the intelligent design of the human eye. He argued that while it was an organ of "perfect and complicated" design, it could still be the product of a physical process such as evolution. Allow me some poetic license here, but is human language not another perfect and complex "organ" of humanity? Aspects of language such as universal grammar may seem so perfect, so handcrafted, that we may suppose that they are the work of a higher power. But can we not accept, as we do Darwin's claim about the eye, that language can also be a product of evolution?

    "Human knowledge and reasoning, it has been argued, is couched in a "language of thought" that is distinct from external languages such as English or Japanese. The propositions in this representational medium are relational structures whose symbols pertain to people, objects, and events, the categories they belong to, their distribution in space and time, and their causal relations to one another."

    This is a point that came up in class too: the argument that we are somehow intimately connected to language because of the idea of "internal thought". I think, Professor, you went as far as saying that all of our thought was some derivative of language, or a form of language. I don't know that I entirely agree with this notion, and I further disagree that it is a strong argument for "language design". As it says in the text, this internal language is "distinct from other languages" and is composed of "relational structures". Though I allow that the human mind is composed of interconnected concepts of people, places and things, I'm not sure if you can name all thinking a "language". There is of course the internal dialogue that everyone has when rehearsing a future argument, writing a paper or narrating your day, but much thought is composed of mental images and unspeaking memories. I will propose two counter-examples to further emphasize my point.

    The most compelling argument against a "language of thought" is all the unconscious processing that goes on in our mind. Is it not a direct contradiction to what it is to have a subconscious to believe that that subconscious "speaks" to you? And if the unconscious is in fact silent, what is the framework for its operations?

    Secondly, I present the right brain, which we know through empirical evidence does not have access to language in the majority of people. So much so that in split-brain patients, the right brain cannot verbally identify objects or words. Without these very basic language functions in half of our brain, can we really say that our brain has an "internal language"?

    ReplyDelete
    Replies
    1. 1. The problem with the eye/UG analogy is that gradualism doesn't fit UG (and UG's adaptive advantages are not clear either).

      2. On Chomsky's thoughts about constraints on thought (not on the "language of thought") read: Harnad, S. (2014) Chomsky's Universe -- L'Univers de Chomsky. À bâbord! Revue sociale et politique 52.

      3. There is no "subconscious." Just the causal machinery that generates our thoughts as well as our feelings.

      4. The right hemisphere is normally connected to the left...

      Delete
    2. I don't think that we can turn to our neuroanatomy in order to find out whether we have an "internal language" or not. Of course, we have structures such as Broca's area and Wernicke's area which are correlated with speech production and speech comprehension respectively, but none of these areas tell us anything about UG or our internal language.

      Delete
  22. Re: 5. The Process of Language Evolution
     
    I find it interesting, and also slightly troubling, that one of the only mentions of processes by which Universal Grammar may or may not have evolved comes in an introductory paragraph. Furthermore, the questions raised there are never answered in any convincing way. The authors suggest, "there must have been a series of steps leading from no language at all to language as we now find it, each step small enough to have been produced by a random mutation or recombination, and each intermediate grammar useful to its possessor". They then go on in many subsections of section 5 to address different areas of language evolution. However, while they appear to be talking about universal grammar at the beginning, I do not see how they continue to talk about universal grammar in their subsections. For example, when talking about genetic variation in response to Lieberman, they write "Within the range that we would call "normal" we all know some individuals who habitually use tangled syntax and others who speak with elegance, some who are linguistically creative and others who lean on cliches, some who are fastidious conformists and others who bend and stretch the language in various ways." This is all true, and yet it is more geared toward the inner workings of OG, or ordinary grammar, than it is to the actual 'evolution' of the Universal Grammar that they suggest earlier.

    ReplyDelete
  23. I found the argument that language evolved through natural selection quite fascinating. I do not believe in the comparison to the evolution of the eye, though. The eye is a biological creation, and yes, it did evolve by natural selection to become what it is today, but I find the comparison to the evolution of language quite arbitrary. Although you can explain a human’s capacity for language as being part of natural selection, because of the survival of the fittest and the human’s capacity to survive as a being because of language, the comparison to a biological organ like the eye didn’t quite make sense to me. There are several other factors contributing to the evolution of language, and I think other arguments like learning feedback and socialization are more responsible than natural selection alone.

    ReplyDelete
    Replies
    1. The eye is a structure (with a function: seeing) whereas language is a function, a behavioral and cognitive capacity. But the brain is the structure that generates that behavioral and cognitive capacity. And the brain evolves. The special problem with language is about how and why UG evolved.

      Delete
  24. In this article, it is stated that "language learning is not programming: parents provide their children with sentences of English, not rules of English." Obviously, the first language that a child learns will be learned through hearing sentences, not learning rules. However, once a child has an understanding of one language, they can learn a second language by learning rules (presented to them in their first language). In this way it is possible to teach someone a language strictly through rules. Could we then say that, though first-language learning cannot be programming, second-language learning is a form of programming if taught only through rules?
    Also, I had trouble coming to terms with Chomsky's argument that language arose due to changes in cognitive capacity attributed to several adaptations forged by natural selection processes. Is Chomsky proposing that language arose as a by-product of certain traits being selected upon that changed our cognitive capacity (i.e. increased brain size)? Does this clash with Pinker and Bloom's argument that the language faculty can be explained by natural selection, or is it coherent with it? On this I am slightly confused.

    ReplyDelete
    Replies
    1. OG can be learned via supervised and/or unsupervised learning and/or explicit formal instruction, whether for 1st language or 2nd. UG is not learned at all; it is innate. (Explicit formal instruction is not "programming.")

      Chomsky does not say much about evolution or evolution of language. It is P & B that talk about those.

      Delete
  25. “Why is there more than one language at all? Here we can only offer the most tentative of speculations. For sound-meaning pairings within the lexicon, there are two considerations … Second, it may be difficult to evolve a huge innate code.”

    Besides the obvious fact that “it may be difficult to evolve a huge innate code”, I can’t understand why Pinker and Bloom arrive at that speculation. From what I’ve understood in this course, an innate lexical code would imply that our categories are innate. This in turn would make our learning process impossibly difficult, since a significant part of learning is forming (and expanding) categories. So if our categories were innate in the way Pinker and Bloom suggest, learning new “sound-meaning pairings within the lexicon” would be extremely inefficient, unless they are suggesting a huge innate code in which all the categories are already named, their names already built into the code. Why is there more than one language? Because no one forms exactly the same categories as everyone else, for instance. I can’t tell whether it’s me who doesn’t comprehend the point of this speculation or whether it is just irrational in itself and not worth arguing with…

    ReplyDelete
    Replies
    1. Although categorization is an important component of language, it is not all there is to language. There is also grammar (OG and UG). P & B are speculating about why grammar is not all innate. (UG is, OG isn't.) An innate grammar would not imply an innate vocabulary or innate categories. But for lazy evolution, it would be far more genetic load than necessary; and it would be needlessly inflexible.

      Delete
  26. Chomsky has argued that we have an innate language acquisition device. In other words, we enjoy some cognitive feature that affords the ability to learn languages. He offers universal grammar as proof. We all know it, yet we never learn it, because we never hear or produce violations of it that can be corrected with feedback.

    It seems that Pinker understands this conceptually but fails to distinguish between UG and OG. Whereas Chomsky argues that language emerged as a by-product of some other adaptation (e.g., increased brain volume), Pinker argues that language itself must have been the result of natural selection. We’ve talked extensively about why this is wrong, and in general the objection comes down to his conflation of UG and OG.

    However, I’d like to offer a simpler refutation. A basic tenet of natural selection, according to Pinker, is gradualism. In other words, even though the fossilised evidence sometimes appears to suggest otherwise, natural selection produces changes that occur in small, incremental (i.e. gradual) steps. In Pinker’s example of the vertebrate eye, a series of subtle physical adaptations over countless generations is conceivable. But what does gradualism in language look like?

    It’s easy to imagine how symbolic representation of objects and ideas (individual words) could develop very gradually, although that doesn’t constitute language. Rather, that represents the penultimate pre-language step. Language involves propositional statements. The leap from pure symbolism to these kinds of statements is massive, but there’s nothing in between. What does 5% of a proposition-based language look like? It’s hard to conceive of natural selection pressures that could have induced such a drastic change directly. Pinker addresses something akin to this in Section 5 of the paper:

    “No single mutation or recombination could have led to an entire universal grammar, but it could have led a parent with an n-rule grammar to have an offspring with an n+1 rule grammar, or a parent with an m-symbol rule to have an offspring with an m+1 symbol rule.”

    But all those kinds of steps presuppose the possibility of propositional sentences. He’s talking about the evolution of language once it’s already come into being. What about the jump from symbols to statements? This is never addressed in the strict natural selection account. Rather, the Chomskian explanation makes more sense. Something else about us evolved in a stepwise manner, until it reached the critical mass or flashpoint at which propositional sentences became a possibility.
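
    (The following toy sketch is not from Pinker and Bloom; the grammar, the names and the Python code are invented purely to illustrate what the quoted “n-rule grammar” versus “n+1 rule grammar” step could mean: a small set of rewrite rules, plus one added rule that permits embedding. Note that even the smaller, hypothetical grammar already presupposes subject-predicate, i.e. propositional, structure, which is exactly the worry raised above.)

      import random

      # Toy context-free grammar: nonterminal -> list of possible expansions.
      # Everything here is hypothetical and purely illustrative.
      N_RULE_GRAMMAR = {
          "S":  [["NP", "VP"]],
          "NP": [["Alice"], ["Bob"]],
          "VP": [["sleeps"], ["sees", "NP"]],
      }

      # The "n+1-rule" grammar: identical, plus ONE added VP rule that embeds
      # a whole sentence ("... thinks that S").
      N_PLUS_1_RULE_GRAMMAR = {
          "S":  [["NP", "VP"]],
          "NP": [["Alice"], ["Bob"]],
          "VP": [["sleeps"], ["sees", "NP"], ["thinks", "that", "S"]],
      }

      def generate(grammar, symbol="S", depth=0, max_depth=3):
          """Randomly expand a symbol; symbols with no rule are terminals."""
          if symbol not in grammar:
              return [symbol]
          options = grammar[symbol]
          if depth >= max_depth:  # keep the recursive grammar from running away
              options = [o for o in options if "S" not in o] or options
          words = []
          for sym in random.choice(options):
              words.extend(generate(grammar, sym, depth + 1, max_depth))
          return words

      print(" ".join(generate(N_RULE_GRAMMAR)))         # e.g. "Bob sees Alice"
      print(" ".join(generate(N_PLUS_1_RULE_GRAMMAR)))  # e.g. "Alice thinks that Bob sleeps"

    (Only the second grammar can ever produce embedded sentences such as “Alice thinks that Bob sleeps”; but, as argued above, both grammars already assume the possibility of propositional sentences.)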

    ReplyDelete
    Replies
    1. There’s one sticking point that confuses me about innate language acquisition and UG. UG seems to reflect some fundamental cognitive feature humans share. Clearly then, it is rooted in our brains and has some biological basis. Shouldn’t we then observe genetic variability in the ability to be UG-compliant? Pinker addresses something similar to this question:

      “Also, contrary to what Lieberman implies, there does exist variation in grammatical ability. Within the range that we would call "normal" we all know some individuals who habitually use tangled syntax and others who speak with elegance, some who are linguistically creative and others who lean on cliches, some who are fastidious conformists and others who bend and stretch the language in various ways. At least some of this variation is probably related to the strength or accessibility of different grammatical subsystems, and at least some, we suspect, is genetic, the kind of thing that would be shared by identical twins reared apart.”

      This, of course, is OG variability. Do we observe UG variability as well? Further, is there some kind of genetic mutation that leads a person to produce lots of UG-non-compliant sentences?

      Delete


    2. What is "symbolic representation of objects and ideas"? We recognize objects behaviorally and perceptually. Then we name them. What are "ideas"? We're waiting for cognitive science to tell us. (And what is "representation"? Who/what is representing what for whom/what?)

      But, yes, category learning capacity could evolve gradually. Referential pointing and naming might be one of the steps.

      The passage from category learning to communication by pointing and mime to arbitrary gestural naming to propositions to speech may well be gradual (if true), with a lot of cultural invention and then Baldwinian evolution favoring the tendency to learn it. The step to propositionality, even though it proved to be a nuclear weapon, does not in itself look to be too big for gradualism (in fact it's more of a mystery why it only happened in our species, since elephants, apes and whales, at least, all seem to be smart enough, and would have benefitted from it).

      The hard part is giving a gradualist evolutionary explanation of UG:
      Harnad, S. (2008) Why and How the Problem of the Evolution of Universal Grammar (UG) is Hard. Behavioral and Brain Sciences 31: 524-525

      Chomsky does not provide an evolutionary explanation, just a speculation.

      Not all innate traits vary in individuals. Having two eyes does not (except in birth defects, which are developmental rather than evolutionary). But there is no such developmental story to be told about UG. (The cases of genetic "specific language disability" (see the work of McGill's Myrna Gopnik) affect grammatical capacity, but mostly just OG.)

      Delete
  27. It is intriguing to contemplate whether the cognitive ability to acquire language arose before language itself, as the result of evolution, with language coming along later through interaction and the needs of society, or whether the cognitive ability arose specifically for language. Both guesses seem to have their flaws. If the first is true, are there any species other than humans that have the same cognitive ability needed for language? If there are, why do they not have language? If there are none, how did humans evolve this ability uniquely? All other adaptive features are observable in other species, except language. If we have an evolutionarily adaptive special design for language, how did it come about, given that even our closest kin cannot acquire language even after training?
    Chomsky argues for the innateness of human language by pointing to universal grammar, something we all conform to but fail to recognize unless we deliberately reflect on it. However, if UG is innate, can it be shaped by the environment in which people grow up, resulting in different subconscious conventions in different languages, or is it fixed and pre-determined, applying to all languages alike? The latter would seem implausible, since we have so many different languages and each has different ways of making propositions. One could argue that the variability exists only in OG, which is where the numerous environmental influences kick in. However, suppose we take something UG-governed in English and translate it into, say, Chinese, and it makes clear logical and syntactic sense and is observed in communication. Would that make it OG in Chinese and UG in English? If so, where do we draw the line between OG and UG? Furthermore, what about bilinguals or multilinguals who are perfectly fluent in different languages? Is their UG different for each language, or is it the same?

    Also, Pinker discusses in the paper the gradualism of language emergence. However, this seems unconvincing, since the essence of language, propositional ability, appears to be all-or-none. The attempt to find species with an intermediate capacity for language also seems implausible: we can train an animal to categorize perfectly, and we can draw analogies between bird calls and human language, but none of these examples traces a clear line of evolutionary development that would eventually lead to the advent of language.

    ReplyDelete
    Replies
    1. UG is universal, meaning it is the same in all languages (except for parametric variations, which are learned: for example, whether the verb comes before or after its object, or whether subject pronouns can be dropped).

      Propositionality is all-or-none (though it could be reached gradually). The problem is that although the adaptive advantages as well as the power of propositionality are obvious, it is not at all obvious whether or why propositionality requires UG!

      Delete
  28. “Once again, recursion is far from being an “overly powerful device.” The capacity to embed propositions within other propositions, as in [He thinks that S] or [She said that [he thinks that S]], is essential to the expression of beliefs about the intentional states of others”

    I found this to be one of the most compelling rebuttals in the paper: it shows that our means of communicating complex thought, and the unique power of recursion it has, is necessary to convey even basic propositions. While I am not sure I agree with Pinker’s arguments overall (in particular the argument for protolanguages), I acknowledge that, to communicate even our most basic needs, the structure of an early language would necessarily have shared something basic with our modern language. As we discussed in class, these initial communications would have been advantageous because of the efficiency of learning by instruction; I found myself applying this when reading about the cognitive arms race and the rapidity with which selective adaptation could push language along. I had never considered that the evolution of language could be modeled on such a small time frame. Because of this, I find their argument that we may have no living species representing an intermediate between our communicative capacities and those of chimps or apes to be plausible. (A small sketch of this kind of recursive embedding appears at the end of this comment.)

    These explorations still do not satisfactorily explain the underlying shared grammatical rules (UG) of all languages. If the entirety of language were left to evolution, would there not be some variance in grammatical structures (and different timelines of development)? Even under the assumption that language originated with one population that then migrated across the globe, it seems highly unlikely that those populations would maintain the original structures of that first population - things get forgotten or modified. So if we are to accept an evolutionary description, we need to know either how these rules could be genetically encoded or how it would have been possible to evolve an early understanding of grammatical rules without instruction.
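
    (Returning to the recursion quote at the top of this comment: the following is a minimal sketch, in Python, of how a single embedding operation, once it is available at all, applies without bound; the sentences and names are invented for illustration and are not taken from the paper.)

      # One recursive step of the form S -> NP "thinks/said that" S is enough
      # to report beliefs about beliefs to any depth. Purely illustrative.
      def embed(base_proposition, attitude_reports):
          """Wrap a proposition in a chain of attitude reports, innermost first."""
          sentence = base_proposition
          for subject, verb in attitude_reports:
              sentence = f"{subject} {verb} that {sentence}"
          return sentence

      s = "the fruit is ripe"
      print(embed(s, []))                                   # the fruit is ripe
      print(embed(s, [("he", "thinks")]))                   # he thinks that the fruit is ripe
      print(embed(s, [("he", "thinks"), ("she", "said")]))  # she said that he thinks that the fruit is ripe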

    ReplyDelete
  29. When reading this paper, I found Pinker/Bloom’s argument against UG (section 5) to lack strong evidence, as they tend to lose sight of the actual meaning of UG and sometimes fail to distinguish it properly from OG. However, in section 3.3, in response to the argument that language diversity challenges the natural-selection view, they say, “Indeed, though grammatical devices are put to different uses in different languages, the possible pairings are very circumscribed. … Such universal constraints on structure and function are abundantly documented in surveys of the languages of the world.” This quote seems in line with the idea of parametric variations, i.e. the learned differences between languages that are part of UG, since they are arguing that language has a universal pattern that all languages follow in only slightly different ways. Compared with their arguments against UG in section 5 (genetic variation, intermediate steps, etc.), this seemed to me to contradict their own position. Am I mistaken, and is it possible to believe that language follows Pinker/Bloom’s account of natural selection while retaining universal parametric variation across languages, or is there a contradiction between their ideas?

    ReplyDelete
  30. In their paper, Pinker and Bloom seem to be discussing the evolution of OG and ignoring the evolution of UG. Because OG is learned and can change, their arguments seem sound for OG. However, UG is unlearnable: we never formulate non-UG-compliant utterances nor do we ever hear examples of non-UG-compliant utterances. Therefore, it seems very likely that UG is innate, which poses a problem for evolution that the authors do not address.

    "An ancient animal with 5 per cent of an eye might indeed have used it for something other than sight, but it seems to me at least as likely that it used it for 5 per cent vision. ... Vision that is 5 percent as good as yours or mine is very much worth having in comparison with no vision at all. So is 1 per cent vision better than total blindness. And 6 per cent is better than 5, 7 per cent better than 6, and soon up the gradual, continuous series."

    Also, the authors’ eye analogy is problematic in that while an eye can develop from a primitive to a sophisticated seeing tool over the course of evolution, it is very hard to imagine a language that can only be used to describe 5, 6, or 7% of things. Additionally, while language as a whole confers evolutionary advantages like the eye does, is UG evolutionarily advantageous? I am not sure we would be more at risk if we spoke in UG-non-compliant ways.

    ReplyDelete
  31. Further thoughts: I agree with your last point, Myriam. We could imagine a language that has *names* for only a tiny percentage of things, say because interpersonal feedback is needed to confirm the arbitrary name given to each referent, or because creating names for *all* things takes a long time when they are being created for the first time. In that sense language, and OGs, are gradual and probably become more complex over time. It is, however, hard to imagine that a language with propositional statements (which goes beyond just naming things) could account for and describe only 5-7% of things, because propositions themselves are not gradual. There is no gradually evolving capacity to make propositions. Once one can make a proposition, one can do so with all names and categories, embedded propositions and so on. Once we are able to put a subject and a predicate together, we can apply that to all categories and kinds.

    Also, what do you mean by "I am not sure we would be more at risk if we spoke in UG-non-compliant ways"?

    This leads me to another reflection on UG, since I did not feel like I understood UG at all. Now I get a sense that I understand it conceptually, as you say: "UG is unlearnable: we never formulate non-UG-compliant utterances nor do we ever hear examples of non-UG-compliant utterances." From my understanding:
    In the absence of negative evidence, it seems that THERE IS a universal innate grammar which 'underlies' all languages, but how do you know this? Why a Universal Grammar rather than nothing, meaning just OGs? Is the center of the argument for UG propositionality? As in: once we have propositions, that is, a basic subject-predicate syntactic relationship, we have grammar, and the fact that this is found in all languages, despite their different OGs and the varying complexity of those OGs?

    As to your question of whether UG is evolutionarily advantageous: I ask, first, whether you think UG arose/evolved at the crucial transition point from naming to defining. Second, just a shot in the dark: maybe, if UG is the set of rules underlying all OGs, UG makes a sort of universal communication possible between speakers of different OGs. I'm thinking of thought experiments...


    There is no environmental evidence to explain how, if at all, UG evolved. Pinker and Bloom say, "Noam Chomsky, the world's best-known linguist, and Stephen Jay Gould, the world's best-known evolutionary theorist, have repeatedly suggested that language may not be the product of natural selection, but a side effect of other evolutionary forces such as an increase in overall brain size and constraints of as-yet unknown laws of structure and growth." If we take this non-adaptationist explanation, what are the implications for our understanding of UG? Does it really change anything?

    ReplyDelete
