In The Language Instinct, Steven Pinker poses an interesting observation about English, which is very picky about word and phrase order. One can say, “He gave the girl that he met in New York while visiting his parents for ten days around Christmas and New Year’s the candy,” but this is a “top-heavy” sentence. It butts up against the upper limit of the natural human “memory space”. Even people with high IQs would presumably have a difficult time processing sentences that nest even just five elements recursively:
“I love the part [where he remembered [when we were attending the wedding [that was crashed by the people [who were late after the party [which was held by the steward of the ship]]]]] in the blog post.”
By the time the listener arrives at the end, he’s likely out of memory space, and “in the blog post” might begin to be associated with other stuff in the sentence, unless of course he’s taking notes.
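The strain of center-embedding can be made concrete with a toy model (my own illustration, not Pinker’s): treat each embedded clause as an unresolved frame that has to stay “open” until its clause closes, and the peak number of simultaneously open frames is the working-memory load. A minimal sketch in Python, using brackets to stand in for relative clauses:

```python
def max_open_clauses(sentence):
    """Return the deepest nesting of bracketed clauses in a sentence.

    Brackets stand in for relative clauses, e.g.
    "the part [where he remembered [when ...]]".
    Each "[" pushes an unresolved frame; each "]" resolves one.
    The peak depth is the most frames a listener must hold at once.
    """
    depth = 0
    deepest = 0
    for ch in sentence:
        if ch == "[":
            depth += 1
            deepest = max(deepest, depth)
        elif ch == "]":
            depth -= 1
    return deepest

blog_sentence = (
    "I love the part [where he remembered [when we were attending "
    "the wedding [that was crashed by the people [who were late after "
    "the party [which was held by the steward of the ship]]]]] "
    "in the blog post."
)

print(max_open_clauses(blog_sentence))  # prints 5
```

By the time the reader hits “in the blog post,” all five frames have just closed at once; a stack-based listener has spent the whole sentence holding them open, which is the overload described above.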
Languages that modify words with case markers, like Latin, can shift sections around in the sentence so the listener can digest these chunks more easily. English, however, has mostly eliminated its case markers; they survive only in vestigial forms like “I” vs. “me”, “he” vs. “him”, etc.
I don’t believe this is a defect of English. This seems to be a standard feature of economizing a language. The first language would have been entirely memory-based. There would technically have been no way to take notes or record anything, unless my hypothesis about the first Oldowan rock cuttings is true and these were “recorded words”. Regardless, stone age cultures relied almost entirely upon the spoken word, and so language would have had to bear this constraint in mind: a sentence couldn’t stretch the memory limits, or else the hearers would have forgotten or garbled the information. So oral-heavy languages, especially those with no written forms, tend to have fairly complex grammar that modifies nouns, verbs and more, so that the hearer knows where to assign each sentence element in memory space and isn’t forced to rely entirely on word order.
Modern English emerged alongside the printing press, which corresponds with the increased use of gunpowder and statebuilding. So we might imagine that as English grammar and spelling solidified in the late 1800s (look at mid-1800s legalese in America and note the variety of spellings for everyday words; you’ll be surprised at how recently it became even semi-standardized), there was a trend toward dropping declension and dative suffixes from words and relying more on word order, because we were able to “outsource” memory to the written form thanks to the printing press. Sentences in print could benefit from being more nested with less declension, but these necessarily conformed to spoken language and vice versa, producing the anomaly that is English grammar today. And that’s just the reality of technology: it allows us to outsource cognitive processes. We can make the same case for the emergence of the alphabet during the iron age: images are sublimated to text, and we get iconoclasm as in the Middle East, or a severe restriction on what is deemed “good” in art, as in Greece and Rome.
We can see a mechanism at work under all this: increases in the potential for apocalyptic violence incentivize greater deferral. In other words, when the potential for violence increases (as in the case of nukes), we will want to work harder at avoiding it. The shared moment of anticipation (what I call Unoptimized-Merge) becomes longer and longer, which stretches the memory limits of the antagonists. Nukes have pushed us to a new level of memory requirements. The only way to avoid using nukes is to outsource the memory limits to other means: magnetic tape, computers, etc. So this might mean that during stone technology, memory could be offloaded to the spoken word and drawings; iron weapons to the alphabet; gunpowder to the printing press; nukes to computers. What will AI demand, and where will we put it?
Why doesn’t our brain just “evolve” to add memory space? The human brain can only grow so big so quickly (and perhaps it’s only as big as it is now because it has needed such memory space… perhaps Oldowan man couldn’t have nested more than two recursive statements…). If the brain were to grow any more quickly, this might have forced speciation in humans, which might explain why birds and bats speciate so rapidly (10,000 species of birds, 1,250 species of bats last I checked). But in humans, language might be what co-opts this process and prevents speciation, which is why we have 70 races of people but not 70 species.
(As a side note, I have a tin-foil-hat theory that Autistics are beginning to outsource memory requirements totally differently than Manic-Schizoids, and this might be causing their respective brains to diverge too radically to allow for reproduction. Unfortunately, the two personality types are attracted to one another, for reasons I’ll explain in the distant future. So if Autistics and Manic-Schizoids find themselves less and less capable of producing offspring together, and if this is what’s behind the decreasing fertility in western nations, then *gulp* we might be speciating…… Anyway… I’ll take the tin foil hat off now.)
With this we might draw a new Unoptimized Logic axiom: increases in the aggression kernel force an increase in linguistic memory space requirements. Once the upper limit of the brain is reached, the memory has to be offloaded onto other media, or else violence will break out. Hence, we have to use language to outsource the memory requirements to technology.
What’s incredible is that all people in the world have generally the same brain size, and anomalies such as people with brains of only 1,000 cc are still generally capable of carrying on in modern society. Australian Aborigines were technically still in the “stone age” when they met Europeans, and they suffered a catastrophic defeat due to European rifles, but they didn’t have drastically smaller brains; they learned firearms quickly when aiding the European government, and I assume they didn’t have any drastically reduced recursive abilities. It might be safe to say that ~30–100K years ago everybody sort of leveled out. The only thing differentiating a Tierra del Fuegian from Darwin was that the Fuegian didn’t know how to offload memory space. As far as I know, they haven’t had a difficult time doing so after being introduced to new tech. Perhaps the Fuegian language is less compatible with this tech, but their brains are compatible with English or the Romance languages, which appear to fit them just fine.
How could it be that all humans have basically “achieved” the same organic memory space when they have diverged so drastically in offloading abilities? We don’t know, and I would venture to say that we have failed even to ask the question properly because we still don’t understand language: Darwinists say that it was a response to external conditions, Creationists say that it was incipient from the beginning, and everyone else sort of finds something in between. Nobody has floated the idea of language being a function of intra-specific object-based combat in humans, because Darwinists think that humans are mechanically no different from animals, and Creationists believe that humans are divinely different from animals. My goal is to show that there is a higher-quality theory for language: we have a mechanical difference which produces human language and has allowed us to carpenter the world. This might be destroying us, but if we don’t do it, we’ll destroy ourselves with ROBA.