Language is Literally Alive

People often describe languages using the vocabulary we reserve for living things. We call Latin a “dead language,” we say that certain linguistic developments are “healthy” or “natural,” and we describe languages on the verge of “extinction” as being “endangered.” To most people, this simply means life is a strong metaphor for language. To me, though, it’s something more. To me, language is (quite literally) a form of life.

Life is one of those words that everyone wants to define, but that resists being pinned down. Let’s look at the dictionary definition of the word:

life (noun)

  1. The condition that distinguishes animals and plants from inorganic matter, including the capacity for growth, reproduction, functional activity, and continual change preceding death.

It should be noted that this definition tells us a little about what life is, and a little about what it’s not. We get that life has certain processes: growth, reproduction, functional activity, continual change, and death. And we get that life is not inorganic matter.

But doesn’t that “not inorganic matter” thing seem a bit out of place? Artificially limiting? This sort of thing is a common discussion point in Artificial Intelligence circles. Much like “life,” “intelligence” has a definition that presents an ever-moving goalpost. The term “Strong AI,” for example, defines intelligence in direct relation to human intelligence. That is to say that artificial intelligence only becomes “true intelligence” when it is functionally identical to intelligence that we already acknowledge.

Personally, I reject that notion. It’s very possible that an artificial intelligence of equal power and utility to human intelligence could exist that bears no resemblance to anything we’ve seen before. Intelligence may be a broad concept, but as humans our experience is limited to the intelligence of other humans and animals. The nascent intelligence of computers seems an odd and alien thing to us, and as such we refuse to acknowledge it for what it is. To justify this, we build definitions that (by design) exclude that which we are uncertain about. It’s as if birds decided that planes can’t really fly because they don’t flap their wings.

As such, I submit another definition, drawn from the relationship between Entropy and Life:

life (noun)

  1. A self-sustaining localized reduction of entropy, characterized by its capacity for growth, reproduction, functional activity, and continual change preceding death.

This definition is one I’ve borrowed from researchers on artificial life, and it relies heavily on the concept of “entropy.” Loosely defined, entropy is the “chaos” of a system. A system with high entropy is one that can occupy many possible states, while a system with low entropy is confined to a smaller number of well-defined states.

If we look at things we normally think of as life, we can see how this definition applies. Living things tend to have highly ordered states (i.e. low entropy). Their molecules are arranged in specific structures like cells, and they have internal processes that sustain those states. A living animal is not going to spontaneously transform into a dissociated conglomeration of water and organic molecules…at least, not until it dies.

So, now that we’ve gone through the work of defining what life is, how does this apply to language? Language is not rearranging molecules into orderly states, like we might expect from organic life. But there’s more than just chemical entropy out there. In fact, pretty much any set of things with multiple states can express some form of entropy. Information itself is such a system.

Consider the human brain as being awash with disordered information. We have neurons firing from our eyes, ears, mouth, and every nerve fiber in our body. Within that flood of signals, there exist underlying associations. There are objects that we can see, colors that we can experience, sounds that we can hear, and more. Organizing all of this information is language. Through language, we classify that a Ford Mustang is a car, a Harley Davidson is a motorcycle, and that both of these are motor vehicles. Language, with all of its words and syntax, reduces the entropy of a set of information by defining concepts and forming connections between them. Moreover, all of those functional definitions of life still apply. Languages grow, reproduce, perform functional activities, and are constantly changing (much to the chagrin of language prescriptivists everywhere).
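This entropy-of-information idea can be made concrete with Shannon’s formula, which measures the uncertainty of a symbol stream in bits. A minimal sketch in Python (the sample strings below are my own illustrations, not data from any study): structured text, which reuses a small set of symbols heavily, scores lower than a stream where every symbol is distinct.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy H = -sum(p * log2(p)) over the symbol distribution, in bits."""
    counts = Counter(symbols)
    total = len(symbols)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Structured text reuses a few symbols heavily: low entropy.
english = "the cat sat on the mat and the cat ran"
# A string where every symbol appears exactly once is maximally uncertain.
uniform = "abcdefghijklmnopqrstuvwxyz"

print(shannon_entropy(english))  # lower: repetition imposes order
print(shannon_entropy(uniform))  # higher: log2(26), about 4.70 bits
```

The point isn’t the specific numbers; it’s that any system of symbols admits this kind of measurement, so “reducing entropy” is a meaningful claim about information, not just about molecules.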

Languages are powerful beings, and humans have a natural symbiosis with them. Humans can better manage the information they are exposed to, because they have language to help organize it. And languages use humans not only as the source of new information (which they need to grow), but also as a vector for reproduction. When children are born, we immediately try to teach them our language, and when we meet others who speak differently, we try to reach common ground and learn each other’s languages. (A note on terminology: here I’m treating each individual’s interpretation of a language as a separate but related “language.” That is, my version of English is different from your English. They are members of the same “species,” but separate individuals. Kind of like how I could have a beagle, and you could have a beagle, but that doesn’t mean we have the same dog.)

Moreover, languages crop up wherever there’s information that needs organizing, and they are specialized to their information environments much like how animals are adapted to their physical environments. For example, computer languages are languages (they organize information within a computer), but where humans and human languages are naturally fuzzy and implicit, the lowest-level computer languages (like assembly) are rigid and explicit. This is a direct result of the structure that computers have, and thus of the environment that those languages must adapt to. Higher-level languages, like C++ or Java, have to let humans and computers communicate, and thus they strike a balance between hyper-explicit assembly and wildly interpretive human languages like English.

There’s a whole lot that we can learn from treating language as a form of life. One of the most important lessons, in my mind, is that it can aid us in our search for life throughout the universe. In our search for “life as we know it,” we may be blind to life as we don’t. If we can live in the constant presence of a life form that we use every day, and not even recognize it, how can we be expected to find new forms of life that test our boundaries? Recognizing that language is alive gives us a concrete example with which to test our new and broader definition of the term. It can spark our imaginations to look for other systems that can be organized, other instances where a self-sustaining region of low entropy can grow and multiply. And maybe in the process, we can discover new forms of life that exist not in the space beyond us, but right here with us.

So what do you think? Is language alive, or is this just a metaphor taken a step too far? What can we learn from examining language as a living thing?

Why I’m not a prescriptivist

Last night I almost got into a heated debate with a friend over the use of the “singular they” as a gender-neutral pronoun. I’ll leave my personal opinion on that topic until the end, but suffice it to say, the argument got broken up before we could spend the entire evening fighting between descriptivist and prescriptivist philosophies on language.

Now, I love language. My obsession with language is deep and abiding. I sometimes refer to myself as a hobbyist linguist, and I know at least enough linguistic terminology to follow the posters at the Acoustical Society of America conferences (linguistics and acoustics have a lot in common). I can still remember the computational linguistics poster hung in the applied mathematics department at my undergrad, in which an epidemiological model was applied to language boundaries in a multilingual society.

And as much as I like to consider the behaviors and evolution of languages, I’m even more interested in the philosophy of linguistics. What is language? Is it invented or discovered? Do we use language or does language use us?

Personally, I subscribe to the belief that language is alive as a sort of “thought organism” that exists within and exerts order over information systems. In the same way that organic life organizes and assembles chemicals such that it can replicate itself in exact (or near-exact) form, words and grammar organize information in such a way that it can be propagated and maintained. I would argue that language is older than humanity, with signs of regional language and dialects existing in other intelligent animals, and that our relationship with language is strongly symbiotic. The fact that our brains are organized for the acquisition of language reminds me of animals with glands and organs specifically designed to host helpful microorganisms, without which the animal could not survive.

Such a view of language puts me squarely in the linguistic descriptivist camp. I approach language much like how a zoologist approaches animals, with an eye for development, evolution, and taxonomy. To say that a certain language construction is “wrong,” assuming it is still understood and unambiguous, is like saying that a subspecies of animal shouldn’t exist because it doesn’t look like its neighbors. I would never suggest that the Pyrrhuloxia shouldn’t exist because it’s different from the Northern Cardinal, but the arguments of linguistic prescriptivists sound just like that to me.

That said, I understand that language prescriptivism has its place. Many people use language as a tool, including writers, publishers, and journalists. If languages are animals, then the languages wielded by these groups are like purpose-bred work animals. And much like how working breeds often have breed standards that the animals are expected to conform to, there are standards for language codified by groups like the MLA.

Carrying the animal breeding analogy further, breed standards are usually a combination of necessary and arbitrary rules which differentiate breeds. “Necessary” in this context refers to traits important to the animal’s task (like how pointer dogs should exhibit the “pointing” behavior while hunting), while “arbitrary” covers traits like coloration, which are not relevant to the animal’s purpose. Likewise, some linguistic rules help to reduce ambiguity in specific situations, and others establish style. Thinking of purposeful rules, I’m reminded of the precision/accuracy distinction, where colloquial English uses the words interchangeably, but scientific jargon enforces a clear distinction. Considering stylistic rules, the “split infinitive” construction possesses no syntactic ambiguity but was considered incorrect for a long time simply because of tradition.

This brings me back to the topic at hand, the “singular they.” Keeping in mind that prescriptivism has a purpose, I still feel that the singular they should be adopted by style guides for one simple reason: keeping up with the times. Perhaps previous generations could tolerate a “generic he,” because the assumption of masculinity as the default went unchallenged. But I feel that in a modern world that is more inclusive of everyone in our society, adopting a gender-neutral pronoun should be an imperative for advancing social justice. And considering how difficult it is to adopt new pronouns in a non-pro-drop language, it’s probably only realistic to use the gender-neutral pronoun that’s been in use for 700 years already: the singular they.

Moreover, from a personal standpoint, I find both the “generic he” and the “generic she” to be confusing, because they seem to point to a specific, determinate person. Whenever I encounter either construction in writing, I inevitably scan back through the paragraph to find what subject is being referred to, only to discover that it’s being used generically.

One observed problem with linguistic prescriptivism is that it can be too conservative and ultimately obstruct a language’s natural evolution. I think the wider use of the singular they is such an evolution, and one that’s predicated on society’s pursuit of greater social justice. With that in mind, we should allow such progress to continue and relax the rules that obstruct it.