Category: Language

NSFW: Swear Like a Victorian

I recently finished reading Melissa Mohr's delightful book Holy Sh*t: A Brief History of Swearing. Among other things, it's a fun, light illustration of some of the drivers behind what makes certain words taboo—granted, without discussing much in the way of formal linguistics—and should make apparent, in my view, how arbitrary many of those distinctions are.

To summarize her thesis, she divides curses into the holy and the shit: respectively, words and phrases that invoke religious ideas and bodily/sexual language. In the Middle Ages, religious oaths were thought to be much more offensive because they either forced God to bear witness to a lie, hence deceiving Him, or committed actual physical violence against Jesus, while words like fuck, shit, and cunt were simply the most direct words for their referents. They most likely would not have raised an eyebrow.

During the Renaissance, however, sensibilities began to change. Religious curses, still reviled by some modern believers, nevertheless began to lose some of their shock value, making way for shit-words to claim a measure of notoriety for themselves, eventually culminating in the Victorian Era's extreme animus toward foul language. According to Mohr, this transition continues today, with one notable exception: racial epithets have now basically supplanted both the holy and the shit as the most shocking language.

Anyway, I highly recommend the book, and because I am both a lover of language and an overgrown man-child who's maintained his juvenile sense of humor despite being buffeted by the mentally erosive forces of time, experience, and a joyless commitment to pessimism, I'd like to reproduce a smattering of glorious Victorian sexual euphemisms taken from a chapter in Holy Shit titled "Gamahuche, Godemiche, and the Huffle".



You'll forgive me the absurd, gleeful vulgarity, I hope.
File: Wikimedia Commons


Mouse over the references to see the truly hilarious original wording of selected definitions. Also, some of these words predate the Victorian era, but the chapter seems to imply they would have been in regular, if vulgar, use during Queen Victoria's reign.

Huffle, Bagpipe: blowjob

Gamahuche: fellatio, cunnilingus

Larking: some sources claim fellatio, but Mohr seems to favor Gordon Williams's argument that it means, to put it in words she does not, titty fucking. She also references an engraving called "The Larking Cull" (1800) which shows a man doing just such a thing. You can view it at the British Museum's website!

To Tip the Velvet: either French-kissing or cunnilingus

Covent Garden Ague: venereal disease

Covent Garden Abbess: bawd

Covent Garden Nun: prostitute

Godemiche: dildo

Lobcock: a big, rubbery one

Rantallion: I must leave it to Grose here again: "one whose scrotum is so relaxed as to be longer than his penis, i.e. whose shot pouch is longer than the barrel of his piece"

Fartleberries: dingleberries

Burning Shame: Grose defines this as "a lighted candle stuck into the parts of a woman, certainly not intended by nature for a candlestick"

The following were all slang words for penis. As with many euphemisms, some of these are more innocuous than others and probably depend largely on context, which might render them either as the uncomfortable, ill-willed curses I referred to in an earlier footnote or simply as crude and goofy:

Man Thomas

Euphemisms for vagina:

the Monosyllable
a woman's Commodity

Euphemisms for sexual intercourse:

Have Your Greens

Bubbies, Diddeys: breasts

Bushelbubby: a woman with large breasts




Julio and I Down by the Schoolyard: Stepping Down from the Grammatical High Horse

First and foremost, I'd like to thank Callum Hackett and Shannon Wittwer for their comments on this piece. You will find a couple of Callum's remarks, which he'd sent to me via email, reproduced in the footnotes. He was kind enough to proffer a few corrections and refinements.

Unless you're Dikembe Mutombo, avoid overwagging that finger.

A couple of weeks ago I wrote a Facebook status update that I want to address, if only because it allows for entry into a discussion about grammar, a topic that is widely misunderstood, and often most egregiously by those who fancy themselves grammarians. The post read as follows:

Sometimes! I want, to, read–back someone's? Statement to them, just. The way they've... Punctuated. It.

I want to expound a bit on why I wrote it, because it's really rather easy to be ruthless about language, to harbor an erroneous self-righteousness fueled by one's own presumed linguistic competence. As such, I feel genuinely bad that I may have contributed to an unfortunate tendency many English-major types exhibit: language elitism.

I’m no expert on language, English, or linguistics, first of all, so I am perhaps a bit out of my depth here. However, one of the most beneficial byproducts of having studied writing—an experience which, if you know me, is not one I often recount with dewy-eyed fondness, for a number of reasons of varying validity—was an increased appreciation for the inherent ambiguity in language, an appreciation that rendered obvious just how unjustified I had been in acting the grammar scold for so many years prior. I was wrong, plain and simple.

(How did I mean that, by the way? Is “wrong, plain and simple” a three-item list? Or is “plain and simple” an authorial aside, signaled by the comma? If so, perhaps that last sentence would have been better written as “I was wrong—plain and simple,” or, “I was wrong. Plain and simple.” None of those is correct or incorrect, but you can see quite easily that a couple of options seem to work better than the rest. Why?)

To start, you have to realize that much of the grammar you learned in elementary/high school is not "correct." It's not necessarily wrong either, but what was taught to us as a set of hard-and-fast rules of language was anything but: linguistic "rules" sprout from customs, and customs are subject to change. (In fact, there is no stopping this change.) Language is not math; it is not a logical system. You understand the sentences I am writing simply because thousands of years of ever-changing linguistic tradition (read: common usage) have evolved into the system of word usage and syntax we use today, one system among many hundreds of such systems, and what can be considered grammatical is more properly imagined as a gradient, rather than a set hemmed in by hard boundaries.

One way to look at grammaticality as a concept is to say that something grammatical can be understood with relative ease, without violating common practice in a way that produces friction for the listener. Here is an example:

Zoot and me run the Castle Anthrax.

You probably noticed the problem immediately: me is used in the subject where I would normally be indicated. If you remove “Zoot and,” you’re left with “Me run the Castle Anthrax.” Most likely, if you’re a native English speaker, this formation doesn’t sound very good, and might even conjure an image of a non-native speaker making a rather simple mistake. Therefore, it is ungrammatical. But it’s important to realize that using me in this position is not objectively wrong in any way. Rather, the preference for I in the subject is based on a custom in which the subjective and objective forms are differentiated, in this case, to signal the nature of the pronoun. Another common couplet of confusion is who/whom, where who is used in the subjective (nominative) case and whom in the objective (dative or accusative). Because this differentiation doesn't do much real work anymore, it is being sloughed off, and you'll hear things like, "Who should Sir Robin bravely run away from?" all the time.

Old English (OE), in fact, exhibited a rather complex system of conjugations and declensions—think of the latter as conjugations (inflections) for nouns, pronouns, and adjectives—many of which might seem alien to us. Yet it is out of this deep tradition that some of our current linguistic customs sprouted, not to mention about 30% of our contemporary words. (The other 70% are drawn from French and Latin, for the most part.) In OE, the form of a pronoun would vary based on grammatical gender, a custom languages like French, Spanish, and German retain; and the form of a noun or adjective would change based on its role in a sentence, rather than its position, in addition to grammatical gender. We still retain certain aspects of grammatical case from these traditions (e.g., children’s is the genitive inflection of children; he/him/his and she/her/hers are all different inflections signaling different information about the pronoun).

Take, also, our basic sentence structure in its simplest form: subject, verb, object. SVO. “King Arthur addressed the old woman.” But, in OE, if you were to tack an introductory clause onto the beginning of that sentence, it may have looked something like, “When he the crone approached, King Arthur addressed the old woman.” SOV. Subject, object, verb. But why? I don’t believe anyone really knows, but that was permissible. In many clauses, dependent or independent, word order could take on either SOV or SVO. Over time, as English changed, SVO and SVC became more prevalent, and Modern English is less flexible with regard to word order. Now, we likely equate SOV structure with archaic language, probably using it occasionally when attempting to affect the speech of a bygone era.

I have digressed somewhat from my main point, but I wanted to demonstrate, in a very brief and simple manner, that the strictures and structures of our language are, in many non-trivial ways, arbitrary. What has produced the emergent phenomenon of intelligible communication via language is practice, not logic or mathematics. And when a person is not careful, they may venture into some unsavory waters, for rooted in much grammar scoldery are a number of potentially injurious biases, among them class-based and racial prejudices against certain writing styles or "errors" and, in oral communication, dialects, accents, or speech patterns. The faulty assumption, of course, is that someone who doesn’t speak or write “properly” must be stupid, but one need think only for a moment to realize that linguistic practice is overwhelmingly predicated upon the linguistic conventions one grew up with during childhood and access (or lack thereof) to institutions that prescribed or taught the standard language. (In the United States the standard language is something close to a Middle American accent—clearly enunciated and adherent to the syntactic traditions in English currently viewed as “proper”—though you could certainly speak in standard grammar using a regional accent. The distinctions between writing and speech, or accent and dialect, are important, but for the purposes of this post, I'm more concerned with linguistic practices viewed as unsanctioned by presumed authorities and majority groups.)

So why does punctuation work at all?

"If it's not done by sunrise, I'll cut your balls off."

"If it's not done by sunrise, I'll cut your balls off."

Consider the following two sentences and how they are different:

The centurion saw the young man who was painting the side of Pontius Pilate's palace and thought it best to teach the lad a lesson in Roman grammar.

The centurion saw the young man, who was painting the side of Pontius Pilate's palace, and thought it best to teach the lad a lesson in Roman grammar.

Obviously, these are very similar sentences, but they are not the same. They both contain relative clauses, yes. The first sentence, however, contains a restrictive relative clause (RRC): "who was painting the side of Pontius Pilate's palace." The second contains a nonrestrictive relative clause (NRC) composed of the exact same words, with the notable difference that they are offset by commas.

In the first sentence, the clause is restrictive because it provides indispensable information about the young man; it actually acts to identify him explicitly, as if to suggest that out of all the young men in the plaza, the young man we are concerned with is the one painting the side of Pilate's palace. You cannot remove this clause without altering the meaning of the sentence.

In the second sentence, the clause is nonrestrictive because it provides nonessential descriptive information. Implicitly, we already know which young man we're dealing with, but we want to point out that he was painting the side of Pilate's palace, because it lends both extra detail and vividness to the scene as well as context within which we can understand the impetus for the centurion's interference. However, if you removed the clause from this sentence, its basic meaning would not change.

Now, assume you meant to write something more akin to the second example but left out the commas so that your sentence looked like the first. There is, all of a sudden, a disconnect between the message you were attempting to impart as an author and what the reader is likely to interpret as your message. In this case, the risk of being misunderstood is rather trivial: your scene is still mostly intact, and will permit the slight shift in meaning while remaining intelligible.

With this in mind, look back at my original post and consider again why it is so jarring, prompting, as one commenter said, an unpleasant soundspace inside their head. One reason is that, contrary to what we were taught in high school, commas and other punctuation marks do signify pauses. Punctuation is, after all, rhetorical, or should contain a considered rhetorical element if you wish to write well. The standard line we were taught—that punctuation exists primarily to separate sentence constituents—is not the gilded-gold rule it was presented as. It is, however, useful as a general guide, and serves an important purpose, because, yes, knowing how sentence constituents fit together will help you decide how to construct phrases and clauses, how to punctuate them to stress, de-stress, or identify certain elements, and how to decipher what other writers you trust to be thoughtful about their punctuation might mean to say when they decide to punctuate a sentence in a certain way.

Bad punctuation leads to ambiguity, and our customs of punctuating—the common, widespread usage of those customs, that is—allow us to partially focus vague and oblique thoughts into smaller, less variable gradients of meaning. The punctuation in my original post not only violates the customs we are familiar with, it possesses no internal logic of its own. Had the word order not been picked carefully, to remain clear despite the strange punctuation, the very meaning of the sentence would have been even more difficult for the reader to decipher.

This post was not meant to provide a comprehensive discussion of punctuation, word order, or grammar. Not at all. The examples I've used are extremely simple ones, and in the wild you will find variations on these basic structures far more complex and numerous than the illustrations I've used here.

I am not above the wave of self-satisfaction that swells as a result of correcting someone's "errors" (which, really, is more akin to identifying inefficiencies or ambiguities, and sometimes, just deviations from the norm), nor am I a model of ego restraint. But I think it is very important to understand what is constructive when it comes to talking about how to use language and what is not, and you need neither to be a linguist nor to understand much more than the basic difference between descriptivism and prescriptivism to appreciate why being a grammar scold is unbecoming for an intelligent, literate person.

Controlling for exposure, do stupid people use poorer grammar than smart people? Maybe, but many smart people use poor grammar too. Practically, it's impossible to generalize. The fact of the matter is that if you grew up and received an education that placed a premium on teaching grammar, you will most likely have a better grasp of the standard language than someone who did not receive such an education or who grew up in a region exhibiting different linguistic practices. You need look no further than American Black English or Appalachian accents for examples of language traditions that exhibit oft-ridiculed customs and rules of their own, and, hopefully, you can see that grammar scolding becomes quite illogical in light of this evidence.

For an excellent discussion of why, I highly recommend the episode of Slate's podcast Lexicon Valley that features John McWhorter, author of Our Magnificent Bastard Tongue, speaking about why people 1) like to jump all over grammatical mistakes, and 2) should not be considered savory or helpful for doing so.

The temptation to be a dick about language is strong, and yes, many people probably ought to learn to write more clearly, to acquire a command and an understanding of how language works within the larger system (in the standard language, as well as the push-and-pull between and across dialects). Ambiguity can certainly be detrimental to civil discussion. So too, though, can the militant scold.

Mammoth Reads: Attraction, Death, Medicine, and Punctuation

Baby, You Can Drive My Car

Here's an interesting one:

The present study experimentally manipulated status by seating the same target model (male and female matched for attractiveness) expressing identical facial expressions and posture in either a ‘high status’ (Silver Bentley Continental GT) or a ‘neutral status’ (Red Ford Fiesta ST) motor-car… Results showed that the male target model was rated as significantly more attractive on a rating scale of 1-10 when presented to female participants in the high compared to the neutral status context. Males were not influenced by status manipulation, as there was no significant difference between attractiveness ratings for the female seated in the high compared to the neutral condition.

At first glance, this doesn't seem all that surprising.  The evolutionary conjecture probably goes something like this:  Traditionally, males of the species are responsible for wooing their female counterparts by way of impressive feats, activities that showcase the male's ability to build a home, hunt prey, or exhibit brute strength; females therefore instinctively pick up on these sorts of success cues from men.  Males, on the other hand, choose their female targets based on the perception of fertility, normally showcased by the female via purely physical traits; males are therefore cued into females' physical characteristics rather than their status possessions.  So a nice car wouldn't affect a male's perception of attractiveness, whereas it would a female's.

I can't promise my parsing is correct, mind you.  I'm sure an evolutionary biologist would have a thing or two to say about my assessment, which is based on a non-trivial number of hours spent watching nature documentaries and some fairly light reading on the subject.

Putting 9/11 on the Backburner

Robin Hanson is pulling me in two directions again.

Here's the part of his recent post entitled "Forget 9/11" that I agree with:

Yet, to show solidarity with these three thousand victims, we have pissed away three trillion dollars ($1 billion per victim), and trashed long-standing legal principles...

Here's the part I don't:

...And now we’ll waste a day remembering them, instead of thinking seriously about how to save billions of others. I would rather we just forgot 9/11.

Do I sound insensitive? If so, good — 9/11 deaths were less than one part in a hundred thousand of deaths since then, and don’t deserve to be sensed much more than that fraction. If your feelings say otherwise, that just shows how full fricking far your mind has gone.

(Click the link in that excerpt, as you might not get the gist of that sentence unless you read Hanson's previous post on near vs. far thought modes.)

Hanson's anger regarding the disproportionate weight we put on native deaths is well taken.  That the World Trade Center attacks provided such potent imagery, seared into our brains by nearly constant coverage, does not help us look past all of the impotent memorializing ten years later.  I am still haunted by the image of people throwing themselves from the towers as flames devoured the upper floors, not because these people were Americans but because they were human beings.  Human loss is difficult to swallow, and it's worse when we see it close to home.  But we do disproportionately weigh these sufferings:  Consider the 12 million people threatened by famine and sickness in Africa right now (4 million in Somalia alone), or the ongoing cholera outbreak in Haiti, or any of the other billions of people who live in relative poverty, faced with the prospect of dying every single day.

What Hanson doesn't seem to appreciate in the context of his post, though I doubt the distinction is entirely lost on him, is the potential for 9/11 to remind us of our responsibilities to remember this greater scope of human suffering.  September 11 ended the naive dream many of us were living (my high-school self included) that saw us safe and secure and indefinitely prosperous, that acknowledged specters of violence like those we saw on the news in places like Lebanon and Israel as mere theoretical risks.  To forget the slide the WTC attacks precipitated, however, would be foolish.  Hanson himself mentions the wasted treasure and degradation of legal principles we witnessed; what makes him think forgetting all of this would somehow benefit us, I don't know.

Why forget and pretend that 9/11 was a blip on the radar?  Why not simply remember it as the complex story of pain, strength, political malfeasance, paranoia, and cultural shift that it is?  I can't imagine that Hanson fathoms death as the only salient metric by which to judge history.

The caveat, of course, is that 9/11 should not be used as an excuse to stagnate; unfortunately, these events, and the dead, have been used as political capital by pretty much everyone in Washington.  So, in that sense, we are not remembering 9/11 correctly or constructively; I'll give Hanson that, wholeheartedly.

Screening for Breast Cancer

The NEJM has an interesting piece on clinical guidelines for breast-cancer screening.  You may remember that the U.S. Preventive Services Task Force recently changed the mammography recommendations for women in their forties, supporting a reduction in the number of scans, even while finding that regular scans reduced mortality in this demographic by 15%.

The author of the NEJM piece, Dr. Ellen Warner, has this to say:

How should one approach the question of screening mammography in a patient in her 40s, such as the woman described in the vignette? The decision should be individualized, with the recognition that the probability of a benefit is greater for women at higher risk. This patient has no major risk factors, such as a family history of breast cancer or a history of a premalignant lesion on biopsy, that would put her at even moderately increased risk. Her chance of having invasive breast cancer over the next 8 years is about 1 in 80, and her chance of dying from it is about 1 in 400. Mammographic screening every 2 years will detect two out of three cancers in women her age and will reduce her risk of death from breast cancer by 15%. However, there is about a 40% chance that she will be called back for further imaging tests and a 3% chance that she will undergo biopsy, with a benign finding. Lifestyle modifications (e.g., weight control and avoidance of excessive alcohol consumption) that might lower her risk should also be discussed.
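To make the quoted numbers concrete, here's a quick back-of-the-envelope calculation (the arithmetic is mine, not Dr. Warner's): a 15% relative reduction applied to a 1-in-400 baseline chance of death works out to a very small absolute reduction, which is part of why the guidelines are so contested.

```python
# Converting the quoted relative risk reduction into absolute terms.
# Input figures come from the quoted NEJM passage; the arithmetic is my own.

baseline_death_risk = 1 / 400        # chance of dying of breast cancer over the next ~8 years
relative_risk_reduction = 0.15       # mortality reduction attributed to biennial screening

# Absolute risk reduction: how much the individual's risk actually drops.
absolute_risk_reduction = baseline_death_risk * relative_risk_reduction

# Number needed to screen: roughly how many such women must be screened
# to avert one breast-cancer death.
number_needed_to_screen = 1 / absolute_risk_reduction

print(f"Absolute risk reduction: {absolute_risk_reduction:.4%}")      # ~0.0375%
print(f"Women screened per death averted: {number_needed_to_screen:.0f}")  # ~2667
```

That is, a 15% relative reduction sounds substantial, but for a low-risk woman it means moving from a 0.250% to a 0.213% chance of death, at the cost of the 40% callback and 3% benign-biopsy rates quoted above.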

Read the whole article for a discussion on the evidence regarding breast-cancer screening; it's a complex issue, and considering the backlash in response to the new recommendations, it's worth reading about the sorts of observations and evidence that go into producing clinical guidelines.

Oxford Comma Blues

I'm going to tell you this once:  use the Oxford comma.

Why?  Well, imagine we have a list of things which includes:

  • Merle Haggard's ex-wives
  • Kris Kristofferson
  • Robert Duvall

That list is taken from a newspaper picture caption showing Merle Haggard, which the linked-to Language Log post cites as having incorrectly, or at least ambiguously, punctuated its listed items.  In the context of that caption, the sentence can be written one of two ways, depending on which theory of serial commas you prefer:

  • Among those interviewed were his two ex-wives, Kris Kristofferson and Robert Duvall.
  • Among those interviewed were his two ex-wives, Kris Kristofferson, and Robert Duvall.

Which do you think is correct?  If you guessed the second sentence, congratulations!  The first sentence clearly reads as if "Kris Kristofferson and Robert Duvall" is an appositive renaming "ex-wives", when clearly those two are separate items in the list.  The only way to punctuate this sentence without ambiguity is to include this final serial comma, the Oxford comma.

Some publications traditionally omit the final serial comma, and while most lists don't lend themselves to the sort of ambiguity seen above, there will be instances in which an Oxford comma is necessary to preserve the writer's intended meaning.  But because publishers like to be consistent, a house style that shuns the serial comma will occasionally force an ambiguous list into print, and editors will likely ask for a rewrite, wasting a bit of everyone's time. The simple answer:  just use Oxford commas all the time.  You will never be wrong, and if you ever need to equate the final two items in a series for any reason—omitting the comma insinuates that the final two items are more closely related to one another than they are to the other items—you can always leave it out.  Stylistically, you'll have more leeway.
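For what it's worth, the rule is mechanical enough to encode. A toy Python sketch (the function name is my own invention, not anyone's published style API):

```python
def oxford_join(items):
    """Join a list of strings with commas, always including the
    serial (Oxford) comma before the final 'and'."""
    if len(items) <= 1:
        return "".join(items)          # zero or one item: nothing to join
    if len(items) == 2:
        return f"{items[0]} and {items[1]}"  # two items: no comma at all
    # Three or more: comma-separate everything, with ", and" before the last.
    return ", ".join(items[:-1]) + f", and {items[-1]}"

people = ["his two ex-wives", "Kris Kristofferson", "Robert Duvall"]
print("Among those interviewed were " + oxford_join(people) + ".")
# → Among those interviewed were his two ex-wives, Kris Kristofferson, and Robert Duvall.
```

Note that the two-item case takes no comma at all, which is exactly why the comma-free three-item version reads as an appositive: "his two ex-wives, Kris Kristofferson and Robert Duvall" has the shape of a name followed by its elaboration.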

Click through to the Language Log post for another funny and improperly punctuated list.  I'll give you two of three items from that one:  "Nelson Mandela" and "dildo collector".