Recursive nesting still stands as a universal. Pretty much every language has some form of "he said {she said { stuff }}". One guy claims to have found a language deep in the Amazon without this feature, but no one thinks he's credible.
Pretty much every language uses a verb to connect a subject and an object. In English the order is S-V-O. Treating addition as a verb, our sentences work like infix notation: X + Y. Other languages use something closer to reverse Polish notation.
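The notation analogy can be made concrete with a toy sketch (my own illustration, not anything from the linguistics literature): the same "sentence" written operator-between-operands, S-V-O style, versus operands-first, reverse-Polish style, where a small stack machine reads it off.

```python
def eval_postfix(tokens):
    """Evaluate a reverse-Polish ("verb-last") expression like ['3', '4', '+']."""
    stack = []
    for tok in tokens:
        if tok == '+':
            # The "verb" arrives after both its arguments.
            right = stack.pop()
            left = stack.pop()
            stack.append(left + right)
        else:
            stack.append(int(tok))
    return stack.pop()

# Infix "3 + 4" (S-V-O order) rendered verb-last (S-O-V order):
print(eval_postfix(['3', '4', '+']))  # 7
```

The point of the sketch is just that both orders carry the same meaning; only the position of the "verb" changes.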
The original Chomsky idea was that an underlying brain structure is reflected in language, and that only a small number of tunable parameters defines the whole space. This hypothesis was meant to explain how humans learn language so quickly. The idea was, much like horses that start galloping just after leaving the womb, maybe we're born already sort of knowing it.
The theory hasn't panned out so far. There's too much similarity between unrelated languages to pretend some universal mechanism isn't behind them, but for nearly any particular linguistic feature you can usually find at least one obscure example showing it isn't universal. The two features I've listed are the exceptions.
There are 6 orders of S, V, and O. All are attested, but the orders where O precedes S are vastly less common than those where S precedes O.
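The count of six falls out of basic combinatorics (3! arrangements of three elements), which a one-liner can enumerate:

```python
from itertools import permutations

# All possible orderings of subject (S), verb (V), and object (O): 3! = 6.
orders = [''.join(p) for p in permutations('SVO')]
print(orders)  # ['SVO', 'SOV', 'VSO', 'VOS', 'OSV', 'OVS']
```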
The counter-argument to Chomsky's Universal Grammar is that commonalities across language exist because they are all solving similar problems for which there are more or less optimal solutions. In essence, convergent evolution produces similarity. And one might argue that recursive structure isn't a feature but rather the absence of a feature: something to prevent recursion. If you have a rule that says you can make category X out of pieces which may also be of category X, then you have recursion. So using a noun as a modifier of another noun -- "lunch counter" -- produces recursion.
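The "category X built from pieces of category X" rule can be sketched directly: a hypothetical toy grammar (my own illustration) where a sentence is either a plain clause or someone "said" another sentence. Because the rule mentions its own category, arbitrary nesting falls out for free.

```python
def nest(speakers, clause):
    """Build 'A said {B said {clause}}' by applying the recursive rule once per speaker."""
    if not speakers:
        # Base case: a sentence can just be a plain clause.
        return clause
    # Recursive case: a sentence can contain another sentence.
    return f"{speakers[0]} said {{{nest(speakers[1:], clause)}}}"

print(nest(["he", "she"], "stuff"))  # he said {she said {stuff}}
```

Nothing in the code limits the depth; preventing recursion would take an extra rule, which is the point of the argument above.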