After reading the discussions about the supposed role of recursion in Chomskyan linguistics, both in journals (see previous post) and on Norbert Hornstein's blog, my first thought was that if I see one more linguist arguing about recursion I'm going to throw up. And yet, after thinking it over, I now see fit to add my own little tidbit to the mix.
Lobina argues, if I may paraphrase, that the stated or implied reasons for recursion in Chomsky's formalisms are vacuous, because supporters say things like "recursion is needed for a grammar to generate an infinite language." Lobina correctly points out that this is not in fact true, so many of these stated reasons for recursion in linguistic theory turn out to be moot.
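To make Lobina's point concrete with a toy sketch of my own (the example, and all the names in it, are mine, not his): here "infinite language" just means a procedure that never runs out of distinct strings, and a plain loop with no self-reference anywhere is enough to enumerate one.

```python
from itertools import islice

def iterative_language():
    """Enumerate the infinite language {a^n : n >= 1} by bare iteration.

    There is no recursive call and no self-referential rule here, yet the
    set of strings produced is infinite.
    """
    n = 1
    while True:  # plain iteration, no recursion
        yield "a" * n
        n += 1

# Peek at the first few sentences of the infinite language.
print(list(islice(iterative_language(), 4)))  # ['a', 'aa', 'aaa', 'aaaa']
```

So infinitude alone cannot be the argument for recursion; something else has to do that work.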
On thinking it over, I remembered that I myself had a need for recursion in past work. In my paper of 2010 (erratum published 2011), I demonstrated that a certain kind of recursion in the structural design of sentences was necessary for a class of infinite (tree-structured) languages to be learnable from finite data. Reflecting on it now, in the context of all this recursion talk, I believe this may actually capture something that was sort of meant by Chomsky and his successors over the years. Recursion in syntax is not needed to generate the infinite capacity of language; rather, recursion is needed to make the infinite learnable from only finite data. That, at last, is a property of recursive structures that cannot be replicated by iterative or other means.
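The flavor of the idea, in a toy sketch that is my own construction and emphatically not the formal result of the 2010 paper: a learner sees a finite sample of nested possessives and posits a single recursive rule, NP → Det N | NP 's N. Because the rule mentions NP on both sides, it licenses strings of every depth, including depths never observed in the sample. (The loop below merely unrolls the rule; the recursion lives in the grammar, not in the control flow.)

```python
# Finite sample the hypothetical learner observes (tokens space-separated).
sample = ["the cat", "the cat 's tail", "the cat 's tail 's tip"]

def generate(depth, nouns=("cat", "tail", "tip", "whisker")):
    """Expand NP -> Det N once, then apply the recursive rule
    NP -> NP 's N  `depth` times."""
    np = "the " + nouns[0]
    for i in range(1, depth + 1):  # each step rewrites NP in terms of NP
        np += " 's " + nouns[i % len(nouns)]
    return np

# The finite data are consistent with the recursive rule...
assert generate(0) == sample[0] and generate(2) == sample[2]
# ...and the rule then projects to depths never seen in the data.
print(generate(3))  # "the cat 's tail 's tip 's whisker"
```

The point of the sketch is only this: the finite sample pins down one compact recursive rule, and that rule, not any enumeration, is what carries the learner to the infinite language.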