
Language Myths

As skeptics, we are dedicated to fighting woo and misinformation. However, many educated, kind, intelligent people overlook one of the worst kinds of woo and use it to discriminate against others. What am I talking about? Language woo: misperceptions and misunderstandings of what language is and how it works. I’m going to lay out a few myths about language that I’ve heard otherwise wonderful skeptics professing, and try to explain, from the perspective of a linguist, what’s wrong with their underlying premises. As skeptics, we should be wary of all kinds of misinformation, not just Bigfoot and anti-vaxxers, so let’s start with the most basic of human traits: language.

Myth #1: African-American speech is uneducated and sloppy. If people from bad neighborhoods want to succeed, they should learn how to speak properly.

Fact: This misperception gets to the heart of language woo. Most linguists (the people who study language and how it functions for a living) are descriptivists, not prescriptivists. That means they treat language as a science, seeking to describe how actual users function with actual language, instead of constructing an ideal system of rules and imposing it on users. There is some variation here, but nearly all linguists agree that African American Vernacular English (like many other accents and dialects burdened with negative stereotypes) is 100% grammatical: it simply has a DIFFERENT grammar from standard English. The kicker here is that these speakers are not breaking rules or being lazy; they’re following a different set of rules, one that is perfectly consistent, intelligible, and functional. Asserting that these speakers are lazy or stupid misunderstands how they are using language and how that language varies from the standard. In fact, standard usages are relatively arbitrary, usually based on nothing more than the variety of language used by the wealthy or educated. Shaming those who don’t use the standard simply tells them that their rules and behaviors are unacceptable because they are not the same as those of another person who has more prestige or wealth.

Myth #2: There are certain rules that one needs to follow when writing, such as “no split infinitives” or “don’t end a sentence with a preposition”

Fact: These rules are 100% arbitrary! They were not originally part of English at all. Around the time dictionaries began to be compiled, some language enthusiasts decided that English should be more like Latin (because let’s face it, Latin is pretty fantastic). These rules were invented because those constructions are impossible in Latin, and these thinkers wanted to make English more Latin-like. In terms of intelligibility, economy, functionality, or the ability of your sentence to communicate a meaning, these rules do absolutely nothing. Again, this misconception comes from the idea that certain languages or dialects are better than others. However, there is absolutely no evidence that any language is better than another. People may have personal preferences, and in certain circumstances one dialect or language may be more appropriate, but this doesn’t make one language inherently better. Just as having a favorite cookie doesn’t make that cookie inherently better, so it is with languages.

Myth #3: Language today is sloppy, uses words improperly, and introduces all these new-fangled words that are bad for the language. We’re going downhill!

Fact: Language changes ALL THE TIME! And it’s FANTASTIC! While some people may frown at the fact that users change the meaning of words or coin new ones, this is actually how a language grows, adapts, and stays alive. Shakespeare coined new words all the time, and many of them are in common usage and considered acceptable today. Many words we use today have taken on a definition completely opposite from what they used to mean, and yet we still understand each other and accept those words. Peter Trudgill points out, “none of us can unilaterally decide what a word means. Meanings of words are shared between people–they are a kind of social contract we all agree to–otherwise communication would not be possible.” As long as a meaning is intelligible, it’s a perfectly acceptable usage of language. These changes happen all the time; otherwise we’d still be using kine instead of cows.

There are many other variations on these same kinds of myths that pop up all the time. However, the overall response to any myth suggesting there’s a proper way to speak, or that you should judge someone based upon their language, is that language is a tool for communication. It varies with its users. If language communicates, then it is functioning just as it should. In order to function, users may be lazy and create new ways of saying things that cut out some of the syllables. Or they may be creative and introduce new ways of saying things. Different communities may follow different rules. And as time passes, language changes in order to adapt to new people, new technologies, and new situations. All of these things are appropriate and helpful for our communication. There are many examples of negative stereotypes toward language being linked with racism, classism, and discrimination. As skeptics, we should take the time to educate ourselves on the science of language so that we are not making unfounded judgments about the “correctness” of other people’s language. If you’re interested in language tolerance, check out lesserjoke, who does a number of posts about language myths. If you have the resources, I’d also suggest taking a basic linguistics or sociolinguistics class, which can give you all the basic tools to understand language variation and usage, and how it might affect your perception of other people.





Olivia is a giant pile of nerd who tends to freak out about linguistic prescriptivism, gender roles, and discrimination against the mentally ill. By day she writes things for the Autism Society of Minnesota, and by night she writes things everywhere else. Check out her ongoing screeds against jerkbrains at www.taikonenfea.wordpress.com


  1. May 10, 2012 at 3:31 pm —

    Another claim that I have heard called a myth–Is it true or not that there is a “window of opportunity” for first language acquisition in childhood, and if that is missed, a person’s ability to communicate in any form will be seriously impaired? Jacob Bronowski mentions this in one of the old _Ascent of Man_ episodes.

    • May 10, 2012 at 4:04 pm —

The language acquisition period is not a myth according to most linguists, though there is a great deal of debate about it. I know primarily about second language acquisition, not first, but I assume there would be some crossover. In terms of second language acquisition, if you start learning a second language after about the age of 12 (although there is a huge amount of debate about when the window closes, and some linguists think it’s as early as 2), it’s highly unlikely that you will ever be as fluent as a native speaker. It also means you have to learn the language in a different way. There aren’t really any studies of people who have had no exposure to language, because it would be inhumane to do that to anyone, so it’s pretty uncertain whether such a person would be able to communicate at all.

      That question gets into debates about universal grammar and the language acquisition device. If you want to know more about that, look into Noam Chomsky or language acquisition. Most linguists believe that some amount of language is hard-wired into the brain, but that the ability of the brain to naturally pick up a grammar goes away after a while. Cool stuff 🙂

  2. May 10, 2012 at 7:59 pm —

    I give you the spectacular Stephen Fry on language rules:

  3. May 12, 2012 at 9:56 am —

    The problem with the descriptivist argument is the people who use words that are just flat out wrong, then go ‘language evolves!’ when you call them out on their utter lack of understanding of the words they’re using. Then, inevitably, language proceeds to evolve anyway, and they end up being right.

    That language evolves does not in any way mean you can make up whatever preposterous sounds appeal to you and be ‘right’.

    Further, language evolving is not necessarily a good thing. ‘Irregardless’ simply should not be a word. We have two words for that already, and ‘irregardless’ serves only to allow people who lack the language skills to indulge in grandiloquence the capacity to do so, and is a magnificent way of detecting those who are simply trying to appear more intelligent than they feel their vocabulary will allow from those who understand what they’re saying. This isn’t an example of a word developing to fill a gap, this is an example of enough people being stupid that various dictionaries had to acquiesce to their ignorance.

    Not to mention, the shift over time of the word ‘titular’ to mean ‘eponymous’, which is indeed worse than the initial example, as ‘irregardless’ was not, in the first place, a word at all and as such its original meaning cannot be lost. This is, however, the case with ‘titular’, a sentiment that cannot be conveyed as easily with any other words currently in existence, leading to the need to invent a word to replace titular, merely because people who didn’t know what they were talking about decided it ought to replace ‘eponymous’.

    Language can, and should, evolve, but we ought retain at least enough control that we not end up with the linguistic equivalent of the platypus.

    • May 12, 2012 at 12:36 pm —

      There are TONS of words that evolved without there being an empty space for them to fill. We have doubles for a number of things, and so do other languages. Why should that be a problem? It gives us a variety of ways to express ourselves.

      In terms of whether a word is “correct” or not, linguistically speaking if you can figure out what a person means, then it’s correct. And you know what “irregardless” means. It’s not laziness, it’s not grandiloquence, it’s how language evolves. Speakers use words in new ways, speakers add different prefixes and suffixes to things. Yeah, that word will become common usage, and in 50 years no one will care that it sounded bad to people today because it is entirely natural for speakers to innovate and create new words, even words for things that we already have a word for.

      Your second example doesn’t prove that we needed to retain the original definition of “titular” at all. There are many concepts that we don’t have single words for in the English language, and we all get by just fine. The idea that there is something inherently good about the way language used to be or is now that is going to be lost or tainted is simply not true. Language was no more perfect at expressing our sentiments in the past than it is now. We make it fit our needs by innovating these words, and sometimes we lose a word that might have been interesting or helpful, but things spring up to replace them. It happens ALL THE TIME.

If we were to take your argument, we should still be using the word “kine” instead of “cows,” because who would coin a word for something that’s already got a name? There’s nothing wrong with the word cows, though, and none of us mourn kine. People have been lamenting the downfall of language pretty much since language started existing, but here we are, still able to communicate with relative ease. I also think it’s important to note that English is already a linguistic platypus. Some scholars believe that English evolved from a creole, a blend of two languages that developed native speakers. We borrow from other languages all the time, both grammatically and in terms of vocabulary. English is a complete mish-mash of phonetics, syntax, semantics, and morphology. But you can speak it just fine, you can understand others, and it functions with absolutely no trouble. Why should it be a problem that it came from a variety of sources?
