Wednesday, September 22, 2010
It is often suggested that human languages could be context-sensitive, in the sense of the Chomsky hierarchy. Many frameworks of formal grammar have instead converged on the restricted subclass of "mildly" context-sensitive languages as a useful compromise, because the full context-sensitive class presents serious problems for parsing complexity. Mildly context-sensitive languages are, by definition, parsable in polynomial time. A referee just pointed out to me something I should have remembered: the universal recognition problem for the full context-sensitive class is PSPACE-complete (hugely intractable, in other words), and Garey and Johnson's book (1979) presented an example of a single context-sensitive language that is itself PSPACE-complete. From a cognitive perspective, this makes it less likely that a human language could be drawn from the context-sensitive class beyond the "mild" subclass.
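To make the complexity contrast concrete, here is a minimal sketch of my own (a toy illustration, not drawn from the cited sources): a recognizer for {a^n b^n c^n : n >= 1}, a classic language beyond the context-free class but within the mildly context-sensitive one, and recognizable here in linear time, comfortably under the polynomial bound that defines the mild subclass.

```python
import re

def is_anbncn(s: str) -> bool:
    """Recognize {a^n b^n c^n : n >= 1}, a standard non-context-free
    language that mildly context-sensitive formalisms (e.g. tree-adjoining
    grammars) can generate. Recognition here is linear time."""
    m = re.fullmatch(r"(a+)(b+)(c+)", s)
    return m is not None and len(m.group(1)) == len(m.group(2)) == len(m.group(3))

assert is_anbncn("aabbcc")
assert not is_anbncn("aabbbcc")
```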
On the other hand, I believe that the brain has quick solutions to intractable problems. More on that in a later post.
Saturday, September 18, 2010
Compositionality without type theory
The principle of compositionality is often held up as fundamental to natural language semantics. This principle has been discussed and fine-tuned in plenty of papers in the literature (including one of my own, see Fulop and Keenan 2002 in the book Semantics edited by Hamm and Zimmermann). In a rough form, it is usually stated something like "the meaning of an expression is a function of the meanings of its parts and the way they are syntactically combined." This gets technically messy when one has to specify what a "meaning" is supposed to be, but there is pretty much a consensus view that some form of this principle is behind the functioning and learnability of natural language. In a language completely devoid of compositionality, any given sentence could take on any arbitrary meaning, so that "I painted a picture" might mean "the moon is a green cheese."
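As a toy illustration of the principle (my own sketch, with made-up denotations), here is a fragment in which every complex meaning is computed purely from the words' denotations and a single rule of combination, which is exactly what compositionality demands:

```python
# Minimal compositional fragment: lexical denotations plus one
# combination rule (predication) fix the meaning of every complex expression.
entities = {"moon": "the-moon", "picture": "picture-1"}
properties = {"green": {"the-moon"}, "painted": {"picture-1"}}  # extensions as sets

def meaning(tree):
    """A tree is either a word (leaf) or a (predicate, argument) pair."""
    if isinstance(tree, str):
        return entities.get(tree, properties.get(tree))
    pred, arg = tree
    return meaning(arg) in meaning(pred)  # predication as set membership

print(meaning(("green", "moon")))  # True: fixed by the parts and their combination
```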
I would guess that most linguists working on formal semantics use some kind of type-theoretic system, but the importance of type theory itself is open to question, and some authors dislike it as a semantic model. An interesting paper by Erdelyi-Szabo, Kalman, and Kurucz, "Towards a natural language semantics without functors and operands," was published in the Journal of Logic, Language and Information (2008). It proposes a new scheme of formal semantics that is compositional but does not employ the typical type-theoretic setup wherein, for example, a verb is viewed as a "functor" requiring an argument. In their view, "compositionality presupposes that we are able to compare arbitrary meanings in terms of information content." They sketch a formal system in which the basic entities are "pieces of evidence" which might engender a belief in something. The formal language involves just a single binary non-associative construction operator used for creating structured complex terms representing pretty much anything. Maybe such a scheme should be seriously considered, given the many difficulties with the "idealization" of semantics as type-theoretic in nature.
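I have not worked through their formal details, but the bare skeleton of a term language built from atoms by one binary non-associative constructor is easy to sketch; the class names below are mine, not the authors':

```python
from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Atom:
    """An atomic term, e.g. a basic 'piece of evidence'."""
    name: str

@dataclass(frozen=True)
class Cons:
    """The single binary construction operator, building structured terms."""
    left: "Term"
    right: "Term"

Term = Union[Atom, Cons]

# Non-associativity falls out of the tree structure:
a, b, c = Atom("a"), Atom("b"), Atom("c")
assert Cons(a, Cons(b, c)) != Cons(Cons(a, b), c)
```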
But we have to keep it compositional.
Wednesday, September 8, 2010
Fixing intensional logic
One of my favorite hobbies is reading about intensional logic, meaning any form of logic that lets us refer somehow to the "sense" of a logical entity in addition to its extension. I have never contributed any research of my own to this topic, but I believe we can never have a useful computational theory of natural language semantics without some kind of decent intensional logic.
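For readers new to the topic, the core distinction is easy to illustrate. Here is a minimal sketch of Frege's sense/denotation split itself, with hypothetical names of my own choosing; it is not Church's actual typed logic. Two senses can differ while denoting the same thing under a fixed interpretation, and that extra structure is what an intensional logic must track:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Sense:
    """A mode of presentation, tagged here by a description (hypothetical)."""
    description: str

    def denotation(self, interpretation):
        # One fixed interpretation; no possible worlds involved.
        return interpretation[self.description]

interp = {"the morning star": "Venus", "the evening star": "Venus"}
morning, evening = Sense("the morning star"), Sense("the evening star")

assert morning != evening                                         # distinct senses
assert morning.denotation(interp) == evening.denotation(interp)   # same denotation
```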
It seems like every proposed form of intensional logic through the years has got something wrong with it. Quite a bit of recent literature is still devoted to the seemingly endless task of "fixing" some variety of intensional logic.
As far as I can see, two dichotomies help classify approaches to intensional logic: you can do things with type theory or without it, and with possible worlds or without them. I have already posted about my dislike for possible worlds. In fact, I doubt there is a possible world in which I like them! But kidding aside, for a working intensional logic I favor Church's formalization of Frege, the Logic of Sense and Denotation: type theory, but no worlds. This logic was published in several incarnations over many decades; Church's own publications on the subject span from 1946 to 1993! That might qualify as the longest period one scholar has spent publishing papers on something that remained unsatisfactory, even to the scholar himself. I certainly hope to spend 47 years publishing pretty well anything, if you want to know the truth.
But there is hope for Church's method, apparently. Quite a few scholars followed up on Church's flawed logic in the past decade (including some leading names like Terence Parsons), and the most recent entry in the game to fix Church is a paper by Kevin Klement in the June issue of the Bulletin of Symbolic Logic. I'll fill in the details in another post in the near future, but I direct your attention to Klement's fine paper in the meantime.