By Caroline M. Cole
A National Council of Teachers of English (NCTE) discussion on LinkedIn asked members to list their “BIGGEST grammar pet peeve… the one you would seriously consider giving real years of your life just to get students to understand, and do correctly.” Generating over 150 comments, it has been one of the most popular NCTE discussions on the site. The most common complaint seems to focus on confused homonyms, such as it’s and its; your and you’re; and there, their, and they’re. Other criticisms, however, include inaccurate use of commas and semicolons; split infinitives; comma splices; inappropriate pronoun/antecedent and subject/verb agreement; run-on sentences; double negatives; incorrect descriptive and definitive adjectives; misused -ly adverbs; absent articles; starting sentences with coordinating conjunctions; missing or misused apostrophes; and the incorrect use of transitive and intransitive verbs, such as lay/lie, raise/rise, or set/sit.
Such criticisms, especially when made by teachers, usually evoke a backlash. Arguing that communication is about conveying ideas, many retort that, if an audience understands the writer’s or speaker’s intentions, there is no need to maintain archaic or elitist rules. Others claim language evolves and that it’s just a matter of time before various grammar rules and conventions outlive their usefulness and are forced into retirement, as exemplified in Megan Garber’s argument about “America’s least favorite pronoun” in The Atlantic article “For Whom the Bell Tolls.” Then, of course, there are those who say that, as long as technology continues to create and shape how we transmit information, we will see new ways of using, and dismissing, conventional practices of representing language.
Provocative as the debates on both sides may be, they remain grounded in problematic assumptions: that there is a “correct” version of English (there’s not); that linguists, language aficionados, and teachers have been and remain the guardians of the language (they haven’t been, and still aren’t); that decisions about which language “rules” to uphold or condemn should be based on the way we speak (like, um… they shouldn’t be, y’know); and that it’s the masses who ultimately decide which rules will stand the test of time (they don’t, at least not always).
Assumptions aside, the topic of grammar merits attention if only because debates about its alleged uses and misuses conflate social, political, and economic arguments about intelligence, competence, and prestige in ways that affect how individuals are perceived. But there are other reasons for “doing grammar,” even if they get little recognition in current debates about language.
Ask people for their thoughts about grammar and the comments will run the gamut. On one side are those who view grammar as a manifestation of “clear” thinking and writing. On the other are those who see grammar as an irritant forced upon language learners of all ages via memorization and other “drill-and-kill” lessons. In between are those who suggest that, while grammar is good to know and use, it is now just a quaint reminder of things that were once necessary to communicate—like quill pens, writing slates, and typewriters—but that we have since moved beyond. Yet the origin of and reasons for grammar are much deeper and richer than these comments suggest, and it is within this history that we see how grammar remains invaluable today.
The term “grammar” can be traced to Greece during the fifth century B.C., where it was used to describe the skill, expertise, or knowledge of someone who knew the “letters”; that is, an individual who could not only read and write, but who could distinguish vowels, consonants, semi-consonants, syllables, and other language units—a formidable task in grammar’s infancy, when people had words to convey ideas but no way to explain how or why particular sentence structures, word combinations, morphemes, and so on were more effective than others. Such distinctions may seem insignificant today, but this knowledge was invaluable to a civilization that invested heavily in a person’s ability to engage, inform, debate, and refute others.
In Ancient Greece, the agora, the central gathering place, was the classroom, and philosophers like Protagoras, Gorgias, Socrates, Plato, and Aristotle guided those who assembled in testing the merit and truth of ideas concerning governance, law, philosophy, art, science, religion, and warfare. Orators who understood how the language worked could sway audiences in these matters—sometimes in spite of their argument’s value or credibility—shaping not only the mind, body, and imagination of audience members, but the future of Athens as well. Grammar gave speakers that understanding.
In “The Birth of Grammar in Greece,” Andreas U. Schmidhauser traces the ways the Ancient Greeks developed systems for identifying and working with various components of language. According to Schmidhauser, the Sophists (fl. 450 B.C.) and those who followed began dividing words into categories with specific functions, grouping words into sentences, finding ways to connect sentences to build arguments, developing a science of sound, and using language to test the truth and falsehood of statements, ultimately laying the foundation for our current notions of grammar. Their efforts to develop a proficiency in grammar, however, were not motivated by a desire to define “correct” uses of the language. They were philosophers—seekers of wisdom—interested in addressing the problems in their society, and they were keenly aware that the most successful orators were those who could capture the attention of audience members and motivate them to action. They likewise understood that subtle variations in how they presented information could affect the audience’s perception and, hence, reception of their message. As such, the Ancient Greeks approached grammar rhetorically, not prescriptively.
Identifying discrete blocks of language and their functions, the Ancient Greeks acquired a body of facts on the various constructions, forms, and uses of words in a sentence, and the resulting repertoire allowed them to select the combinations that were most effective for the purpose, audience, and context of their message. Descriptive grammar thus became one more tool in the Ancient Greeks’ toolbox for building and presenting successful arguments. Unfortunately, rhetorical approaches to grammar were increasingly downplayed, subordinated, and, at times, lost altogether as other cultures adopted the Greeks’ approach to language classification.
Consider, for example, Latin. Scholars developed a grammar for Latin in part to ensure that government and military treaties, scholarly and scientific work, religious doctrines, and other official documents written during the Roman occupation could be read, understood, and disseminated as the Empire began to crumble and people returned to their native dialects. The resulting grammar provided a Rosetta Stone of sorts, making it possible for later generations to learn and work with materials written in a “dead” language; it likewise gave people a common language for times when different dialects or languages might have hindered communication. But documenting the structures of a language that was no longer spoken and, hence, no longer evolving shifted descriptions of how people used Latin to prescriptions of its “correct” use.
The English language also saw shifts from descriptive to prescriptive uses of grammar, albeit for different reasons.
In 1066, when William the Conqueror became the first Norman king of England and established his seat of government in London, he replaced the Anglo-Saxon chiefs and clergy with French-speaking Norman warriors, infusing the English language with French vocabulary and idiomatic expressions. When the English reclaimed the throne in the thirteenth century, they continued to use London as the center of government, but they transcribed old documents and began writing new ones in the dialect of English used at court and by the ruling classes in the area. The newly established universities at Oxford and Cambridge (both less than 60 miles from London) helped reinforce the norms of English that were concentrated in this region; language variations, however, continued to flourish as writers, transcribers, and translators incorporated their own preferences into handwritten manuscripts—at least until the advent of the printing press in the late 1400s.
Increasing the speed with which information could be produced and disseminated, printing presses were instrumental in stabilizing the language. In addition to minimizing discrepancies across multiple copies of a single work, the presses made it possible to put the same version into more people’s hands, indirectly promoting more consistent uses of the language. Further reinforcing consistency in English were the printers themselves, who pushed for more uniform spelling and language structures to make manuscripts easier to typeset. But it was the sheer amount of information printing presses were able to generate that seems to have had some of the most profound effects on the origins of English grammar.
In a discussion of the history of English, Susanne Kemmer, Associate Professor of Linguistics at Rice University, explains that it was the growth in written language that generated a need for resources that could help non-native and native speakers learn more about how the language worked. While some countries established academies to codify and normalize all aspects of their language (for example, the Accademia della Crusca and L’Académie française)—a practice of language regulating that continues today—Kemmer notes that such efforts did not take hold in English-speaking nations, even though there were (and remain) “language purification” movements that sought to minimize foreign influences on the language. Instead, dictionaries were compiled, printed, and distributed, helping individuals learning the vocabulary of a new language, as well as English speakers who wanted assistance with “hard words,” which, according to Kemmer, were primarily loan words from Classical and other foreign languages. But as much as these materials provided opportunities to fix the language, there were those who cautioned against definitive, prescriptive views of English. Samuel Johnson, who assembled A Dictionary of the English Language, was one such individual.
In the Preface to the 1755 edition of his work, Johnson explains that, despite the years he spent trying to bring order to, among other things, the orthography, etymology, signification, and function of the language in use, it was futile to believe we could “fix our language, and put a stop to… alterations,” for numerous factors would continue to change the language. For example, there was commerce. Johnson acknowledges that business would bring together individuals with different backgrounds and languages and, as the participants “learn a mingled dialect,” English would change. Johnson notes that education would also change the language, as “those who have much leisure to think will always be enlarging the stock of ideas, and every increase of knowledge, whether real or fancied, will produce new words, or combinations of words.” Learning other languages would also affect English by encouraging unique ways of viewing the world and, thus, of communicating those observations. Johnson concludes that “no dictionary of a living tongue ever can be perfect,” even if someone were to spend a lifetime addressing syntax and etymology. And yet, more than 250 years later, we still see debates about “correct” language use.
Using commonly known language patterns can help streamline the learning and use of language, but spending “real years of…life” to help someone understand any single grammar rule suggests a misunderstanding of the real value of grammar, especially when we consider that communication is still possible in the midst of unconventional, or even “wrong,” uses of grammar. This is not to suggest that grammar has outlived its function. On the contrary, grammar is necessary if we are to have a system of communication that people can recognize, understand, and use to interact with one another. But there are better reasons for learning grammar than simply avoiding the proverbial red pen.
Rather than pushing grammar as a set of rules to uphold at any cost, we would be better off promoting grammar as a knowledge and understanding of where and how words enter into particular patterns and where alternate patterns exist. After all, it is with this information that we can, among other things, make more strategic choices in conveying intention, probability, permission, conditions, and suppositions; we can find ways to attribute or deflect responsibility (compare the agentless “mistakes were made” with “I made mistakes”); we can adjust the distance between ourselves and our audiences and, by extension, the level of intimacy we create; we can identify purpose and result, cause and effect, and concessions; we can emphasize and subordinate information more effectively; and we can create stronger cohesion between and among our ideas.
The Ancient Greeks were committed to engaging with and motivating audiences about topics and concerns that affected their society, and a rhetorical approach to grammar gave them another tool to do so. The fact that we continue to read, cite, and draw upon their arguments centuries later is a testament to the power of using grammar in this way.
We can tap into this power by moving beyond the prescriptive views of grammar that dominate today’s conversations about language and, instead, working to develop a greater awareness of and appreciation for the various ways information can be presented. In doing so, we add to our communication toolboxes, thereby helping us construct messages that can impact the audiences we aim to reach.
Working toward Areté…
What are your experiences in learning or working with grammar? Share your thoughts below.