Word of the Week: ‘Folxiness’

Almost every fan of George Orwell’s nonfiction thinks his essay “Politics and the English Language” is one of the best he produced. Along with plenty of sound writing advice, the essay makes one central point: using stock phrases, lazy metaphors, and prefabricated sequences of words leads to sloppy thinking. If we use cliches we don’t interrogate, we can smuggle bad attitudes and ideas into our writing.

I have learned that this sensible idea of Orwell’s has taken on a postmodern academic form under the name “Conceptual Metaphor Theory,” and there it has transformed into something that isn’t sensible at all. CMT supposes, for example, that using warlike language to describe argument (“win” and “lose” and “defend a point” supposedly constitute language that puts us in mind of the battlefield) makes us debate with a view to the kill rather than with an open mind and a collaborative truth-seeking attitude. And since this is the realm of academic theory, it doesn’t matter that there’s no actual empirical evidence for this kind of claim.

Take a recent preprint paper by neuroscientist Alexis T. Baria and Keith Cross, an education scholar, about “the social significance of the Computational Metaphor.” Considering things “according to conceptual metaphor theory,” the authors tell us that talking about the brain as a computer and vice versa may “cause harm.” So they call for more investigation into the social effects of language comparing brains and computers, jumping off from “recently publicized concerns over AI’s role in perpetuating racism, genderism, and ableism.” They set out to determine whether this metaphor is “further de-centering marginalized folk who are already de-centered by tech-defined norms.” Our academic heroes find that it probably is. The metaphor, it turns out, is everywhere: in lots of idiomatic speech, the brain is cast as computer-like and the computer as brain-like. We see the first in phrases such as “I can’t process all that information” and “let me crunch the numbers,” and the second in ones such as computer “memory” or “my computer is sleeping.” And then we get to the theory of how this metaphorical language causes “harm” to groups of special concern.

The primary worry is the term “artificial intelligence.” Apparently, “multiple scientists cited ‘intelligence’ as a problematic term.” The evidence offered: Artificial intelligence used in criminology tends to predict that areas with high recorded crime are likely to keep having high crime, suggesting such areas may require a greater police presence. Thus, such AI perpetuates racism. Worse, calling computers “intelligent” risks furthering the notion that “people who are ‘good’ at math are of superior intellect compared to those who are not ‘good’ at math.” The theory here, to recap, is that people will combine noticing the word “intelligence” in “artificial intelligence” with the observation that computers are good at math, then jump to the assumption that anybody who lacks math skills lacks intelligence, and from there to some unspecified racial prejudice. This baffling and unintuitive chain of causality is, again, what our authors fear risks “further de-centering marginalized folk who are already de-centered by tech-defined norms.” Conceptual metaphor theory sure is theoretical!

If I want to sound folksy, I might write “yer darn tootin’.” If I want to appear morally vigilant (and hireable) in academic spaces, I might write “problematic” and “further de-centered” and “folk,” or for good measure, “folx.” As Orwell observes, it’s good to look closely at the baked-in ideas found in ready-made metaphorical phrases. But it’s possible to look so closely that we start seeing things that aren’t there.
