Language is Literally Alive

People often describe languages using the vocabulary of living things. We call Latin a “dead language,” we say that certain linguistic developments are “healthy” or “natural,” and we describe languages on the verge of “extinction” as “endangered.” To most people, this simply means that life is a strong metaphor for language. To me, though, it’s something more. To me, language is (quite literally) a form of life.

Life is one of those words that everyone wants to define but that resists being pinned down. Let’s look at the dictionary definition of the word:

life (noun)

  1. The condition that distinguishes animals and plants from inorganic matter, including the capacity for growth, reproduction, functional activity, and continual change preceding death.

It should be noted that this definition tells us a little about what life is, and a little about what it’s not. We get that life has certain processes: growth, reproduction, functional activity, continual change, and death. And we get that life is not inorganic matter.

But doesn’t that “not inorganic matter” clause seem a bit out of place? Artificially limiting? This sort of thing is a common discussion point in artificial intelligence circles. Much like “life,” “intelligence” has a definition that presents an ever-moving goalpost. The term “Strong AI,” for example, defines intelligence in direct relation to human intelligence. That is to say, artificial intelligence only becomes “true intelligence” when it is functionally identical to intelligence that we already acknowledge.

Personally, I reject that notion. It’s very possible that an artificial intelligence of equal power and utility to human intelligence could exist while bearing no resemblance to anything we’ve seen before. Intelligence may be a broad concept, but as humans our experience is limited to the intelligence of other humans and animals. The nascent intelligence of computers seems an odd and alien thing to us, and as such we refuse to acknowledge it for what it is. To justify this, we build definitions that (by design) exclude what we are uncertain about. This is like birds deciding that planes can’t really fly because they don’t flap their wings.

As such, I submit another definition, drawn from the relationship between Entropy and Life:

life (noun)

  1. A self-sustaining localized reduction of entropy, characterized by its capacity for growth, reproduction, functional activity, and continual change preceding death.

This definition is one I’ve borrowed from researchers on artificial life, and it relies heavily on the concept of “entropy.” Loosely defined, entropy is the “chaos” of a system. A system with high entropy is one with many possible states, while a system with low entropy is bound to a smaller number of well-defined states.
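That loose definition can be made exact with Shannon’s formula from information theory, H = −Σ p·log₂(p), which measures how many possible states a system effectively occupies. A minimal sketch (the two distributions below are my own invented illustration, not from any cited source):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A "chaotic" system: eight equally likely states -> maximum entropy.
high = shannon_entropy([1/8] * 8)   # 3.0 bits

# An "ordered" system: one state dominates -> low entropy.
low = shannon_entropy([0.93] + [0.01] * 7)

print(high, low)
```

Eight equally likely states carry the maximum three bits of uncertainty, while a system pinned mostly to one well-defined state carries well under one bit, which is exactly the “small number of well-defined states” described above.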

If we look at things we normally think of as life, we can see how this definition applies. Living things tend to have highly ordered states (i.e., low entropy). Their molecules are arranged in specific structures like cells, and they have internal processes that sustain those states. A living animal is not going to spontaneously transform into a dissociated conglomeration of water and organic molecules… at least, not until it dies.

So, now that we’ve gone through the work of defining what life is, how does this apply to language? Language is not rearranging molecules into orderly states, like we might expect from organic life. But there’s more than just chemical entropy out there. In fact, pretty much any set of things with multiple states can express some form of entropy. Information itself is such a system.

Consider the human brain as being awash with disordered information. We have neurons firing from our eyes, ears, mouth, and every nerve fiber in our body. Within that flood of signals, there exist underlying associations. There are objects that we can see, colors that we can experience, sounds that we can hear, and more. Organizing all of this information is language. Through language, we classify a Ford Mustang as a car and a Harley-Davidson as a motorcycle, and both of these as motor vehicles. Language, with all of its words and syntax, reduces the entropy of a set of information by defining concepts and forming connections between them. Moreover, all of those functional definitions of life still apply. Languages grow, reproduce, perform functional activities, and are constantly changing (much to the chagrin of language prescriptivists everywhere).
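The entropy-reduction claim can even be sketched numerically. In this toy model (the vehicle names and category mapping are my own invented example), collapsing specific sightings onto a small set of categories lowers the Shannon entropy of the sequence:

```python
import math
from collections import Counter

def entropy(seq):
    """Shannon entropy (in bits) of the empirical distribution of seq."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

# Invented observations: specific vehicles someone might notice on a drive.
sightings = ["Mustang", "Civic", "Harley", "Corolla", "Vespa", "F-150"]

# A toy "language": categories that group the specifics together.
category = {"Mustang": "car", "Civic": "car", "Corolla": "car",
            "F-150": "car", "Harley": "motorcycle", "Vespa": "motorcycle"}

raw = entropy(sightings)                               # six distinct states
organized = entropy([category[s] for s in sightings])  # only two states
print(raw, organized)  # organized < raw: categorization lowered the entropy
```

Six distinct sightings are maximally disordered; once the language maps them onto “car” and “motorcycle,” the same observations occupy far fewer states.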

Languages are powerful beings, and humans have a natural symbiosis with them. Humans can better manage the information they are exposed to, because they have language to help organize it. And languages use humans not only as the source of new information (which they need to grow), but also as a vector for reproduction. When children are born, we immediately try to teach them our language, and when we meet others who speak differently, we try to reach common ground and learn each other’s languages. Note: In this instance, I’m referring to each individual’s interpretation of a language as a separate (but related) “language.” That is, my version of English is different from your English. They are members of the same “species,” but separate. Kind of like how I could have a beagle, and you could have a beagle, but that doesn’t mean we have the same dog.

Moreover, languages crop up wherever there’s information that needs organizing, and they are specialized to their information environments much like animals are adapted to their physical environments. For example, computer languages are languages (as they organize information within a computer), but where humans and human languages are naturally fuzzy and implicit, the lowest-level computer languages (like assembly) are rigid and explicit. This is a direct result of the structure that computers have, and thus of the environment those languages must be adapted to. Higher-level languages, like C++ or Java, have to let humans and computers communicate, and thus they strike a balance between super-explicit assembly and wildly interpretive human languages like English.

There’s a whole lot that we can learn from treating language as a form of life. One of the most important lessons, in my mind, is that it can aid us in our search for life throughout the universe. In our search for “life as we know it,” we may be blind to life as we don’t. If we can live in the constant presence of a life form that we use every day and not even recognize it, how can we be expected to find new forms of life that test our boundaries? Recognizing that language is alive gives us a concrete example with which to test our new and broader definition of the term. It can spark our imaginations to look for other systems that can be organized, other instances where a self-sustaining region of low entropy can grow and multiply. And maybe in the process, we can discover new forms of life that exist not in the space beyond us, but right here with us.

So what do you think? Is language alive, or is this just a metaphor taken a step too far? What can we learn from examining language as a living thing?

Why I’m not a prescriptivist

Last night I almost got into a heated debate with a friend over the use of the “singular they” as a gender-neutral pronoun. I’ll leave my personal opinion on that topic until the end, but suffice it to say, the argument got broken up before we could spend the entire evening fighting between descriptivist and prescriptivist philosophies on language.

Now, I love language. My obsession with language is deep and abiding. I sometimes refer to myself as a hobbyist linguist, and I know at least enough linguistic terminology to follow the posters at the Acoustical Society of America conferences (linguistics and acoustics have a lot in common). I can still remember the computational linguistics poster hanging in the applied mathematics department at my undergrad, where an epidemiological model was used to describe language boundaries in a multilingual society.

And as much as I like to consider the behaviors and evolution of languages, I’m even more interested in the philosophy of linguistics. What is language? Is it invented or discovered? Do we use language or does language use us?

Personally, I subscribe to the belief that language is alive as a sort of “thought organism” that exists within and exerts order over information systems. In the same way that organic life organizes and assembles chemicals such that it can replicate itself in exact (or near-exact) form, words and grammar organize information in such a way that it can be propagated and maintained. I would argue that language is older than humanity, with signs of regional language and dialects existing in other intelligent animals, and that our relationship with language is strongly symbiotic. The fact that our brains are organized for the acquisition of language reminds me of animals with glands and organs specifically designed to host helpful microorganisms, without which the animal could not survive.

Such a view of language puts me squarely in the linguistic descriptivist camp. I approach language much like a zoologist approaches animals, with an eye for development, evolution, and taxonomy. To say that a certain language construction is “wrong,” assuming it is still understood and unambiguous, is like saying that a subspecies of animal shouldn’t exist because it doesn’t look like its neighbors. I would never suggest that the Pyrrhuloxia shouldn’t exist because it’s different from the Northern Cardinal, but the arguments of linguistic prescriptivists sound just like that to me.

That said, I understand that language prescriptivism has its place. Many people use language as a tool, including writers, publishers, and journalists. If languages are animals, then the languages wielded by these groups are like purpose-bred work animals. And much like working breeds often have breed standards the animals are expected to conform to, there are standards for language codified by groups like the MLA.

Carrying the animal-breeding analogy further, breed standards are usually a combination of necessary and arbitrary rules that differentiate breeds. “Necessary” here refers to traits important to the animal’s task (like how pointer dogs should exhibit the “pointing” behavior while hunting), while “arbitrary” covers traits like coloration, which are irrelevant to the animal’s purpose. Likewise, some linguistic rules help to reduce ambiguity in specific situations, and others establish style. Thinking of purposeful rules, I’m reminded of the precision/accuracy distinction: colloquial English uses the two words interchangeably, but scientific jargon enforces a clear distinction. Considering stylistic rules, the “split infinitive” construction poses no syntactic ambiguity but was considered incorrect for a long time simply because of tradition.

This brings me back to the topic at hand, the “singular they.” Keeping in mind that prescriptivism has a purpose, I still feel that the singular they should be adopted by style guides for one simple reason: keeping up with the times. Perhaps previous generations could tolerate a “generic he,” because the assumption of masculinity as the default went unchallenged. But in a modern world that strives to be more inclusive of everyone in our society, adopting a gender-neutral pronoun should be an imperative for advancing social justice. And considering how difficult it is to introduce brand-new pronouns in a non-pro-drop language, it’s probably only realistic to use the gender-neutral pronoun that’s already been in use for 700 years: the singular they.

Moreover, from a personal standpoint, I find both the “generic he” and the “generic she” to be syntactically confusing, because they seem to suggest a specific, determinate person. Whenever I encounter either construction in writing, I inevitably scan back through the paragraph to find the subject being referred to, only to discover that it’s being used generically.

A problem that’s been observed with linguistic prescriptivism is that it can often be too conservative, ultimately obstructing natural evolution in a language. I think the wider use of the singular they is such an evolution, and one that’s predicated on society’s pursuit of greater social justice. With that in mind, we should allow such progress to continue and relax the rules that obstruct it.

Misconceptions about science: The Hierarchy

In recent years, the definition of science has become a jumbled mess. Scientists and non-scientists alike maintain that science needs a more central role in political decision-making. Meanwhile, scientists themselves argue over what constitutes science, whether to secure grant money for their own projects or to reform science education in a way they feel is more conducive to deep scientific understanding. On top of that, historic events like the building of the atomic bomb, the moon landing, and the sequencing of the human genome have progressively reshaped the public’s understanding of what constitutes science and scientific progress. In the process, however, I think two major misconceptions about science have crept into the public consciousness. The first is that all good physics happens either at the very tiny scale (quantum and particle physics) or at the extremely massive one (astrophysics). The second is that the scientific method is little more than simple hypothesis testing. Both of these misconceptions can have major repercussions for science as a whole, and it’s important that scientists and non-scientists alike come to appreciate the truth.

Ever since the first atomic bomb fell in World War II, science (and especially physics) has been inextricably linked to the atom. Immediately after the war, governments all over the world stepped up their funding of research, focusing in particular on learning more and more about the fundamental particles that make up atoms. Since then, the “indivisible” atom has been found to consist of smaller and smaller particles, from quarks and gluons to the Higgs boson of recent fame. At the other end of the spectrum, ever since Sputnik first went into orbit, there’s been an obsession with space and the universe beyond our tiny little Earth. Recent advances in optics and telescopes have allowed us to observe things much farther away and in much finer detail than we ever could before, and the sales of books like “A Brief History of Time” show that the public is very interested in what might be out there. Given their visibility, one might assume that the frontiers of science lie either in the very big or the very small, and that everything in between has been thoroughly explored by every scientist since Newton.

A sort of hegemony has arisen in science, based on an assumed hierarchy of the sciences running from small to large. One might say that biology is just applied chemistry, which is itself just applied many-body physics, which is little more than an extension of particle physics. By this logic, those who work in particle physics, discovering ever-smaller particles, are the ones learning the real fundamental laws of the universe; everything else trickles down from the understanding of particles.

But as a scientist whose research in acoustics sits decidedly on the scale of human observation, I must say that’s not the case. In fact, there are many phenomena we experience on a day-to-day basis that science can’t explain directly from particle physics. For example, turbulence isn’t well understood and is widely considered one of the great open problems in physics. Yet we hear turbulence when we drive a car with the windows open, and we feel it when we’re flying in a plane. Despite seeing and hearing turbulence all the time, we have no way to describe the underlying statistics of this phenomenon in fluids, aside from using computational models to simulate it. Richard Feynman, one of the darlings of science and a man of colossal physical and mathematical intuition, tried to tackle this problem multiple times throughout his career, to no avail. This is a man who, at the time, probably understood the inner workings of the atom better than anyone else on Earth.

So, is this a failing of particle physics? Not really. While the hierarchy of science does exist, there’s a hidden step between the levels that comes in the form of “complexity.” Anderson’s famous paper, “More Is Different,” does a better job explaining this than I ever could, but the gist is this: if you have a bunch of objects, each working according to simple rules, the behavior of those objects as a population will follow rules that don’t necessarily resemble those governing the objects as individuals. As an example, the simple behaviors of a single ant (following chemical trails to food, picking up dirt and moving it, etc.) don’t resemble the collective behavior of the resulting ant colony, despite the former leading directly to the latter. An individual ant doesn’t have a rule that says “build a chamber in the colony for maturing larvae to grow in,” but the behaviors ants do have still lead to this result.
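Emergence of this kind is easy to demonstrate with a toy model of my own (an illustration, not something from Anderson’s paper): a one-dimensional cellular automaton in which every cell obeys one trivial local rule, yet the population as a whole traces out a Sierpinski triangle that no individual cell’s rule mentions.

```python
# Rule 90 cellular automaton: each cell's next state is simply the XOR of
# its two neighbors. No cell "knows" about triangles, yet the grid as a
# whole draws a Sierpinski triangle -- a population-level rule that doesn't
# resemble the individual-level rule.

def step(cells):
    """Apply Rule 90: each cell becomes the XOR of its two neighbors."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ cells[(i + 1) % n] for i in range(n)]

def run(width=31, generations=16):
    cells = [0] * width
    cells[width // 2] = 1  # start from a single live cell
    rows = [cells]
    for _ in range(generations - 1):
        cells = step(cells)
        rows.append(cells)
    return rows

if __name__ == "__main__":
    for row in run():
        print("".join("#" if c else " " for c in row))
```

Running it prints the familiar nested-triangle pattern (generation n is Pascal’s triangle mod 2), the same flavor of surprise as an ant colony’s nursery chambers arising from trail-following and dirt-moving.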

With that in mind, it’s easy to see why particle physicists aren’t the ultimate scientists. By only understanding the lowest level of science, they can only really describe their level and perhaps the level directly above them. This is the case throughout the hierarchy. Molecular biologists need to understand chemistry to innovate within molecular biology, and they can offer insight into cell biology by understanding the level that sits directly below that. As an acoustician, I rely on solid state physics, fluid dynamics, chemistry, and thermodynamics to understand sound. Yet my understanding of acoustics does not give me mastery over linguistics, which relies on a combination of acoustics, physiology, psychology, and information theory.

Each sub-discipline of science has its own rules, derived from other disciplines yet still unique. Understanding these rules is the true goal of science, and every bit we learn can vastly improve the quality of life in ways both large and small. Hopefully I have shown you that there are still frontiers beyond particle physics and astrophysics. Next time, I’ll be talking about how biology and the human genome have changed the face of scientific understanding and convinced people that all the scientific method can give us is yes/no answers.