“…he remembered his uncle saying once how little vocabulary man really needed to get comfortably and even efficiently through his life, how not only in the individual but within his whole type and race and kind a few simple clichés served his few simple passions and needs and lusts.”
—William Faulkner, Intruder in the Dust, 1948
Avoid clichés like the plague.
This sentence appeared on a poster in my elementary school classroom. When I first read it, my initial impression was one of confusion: was the plague—as in, the Black Plague—a cliché? As a novice to language (and a Millennial to boot), I failed to understand that the ‘like’ in the sentence modified the verb ‘avoid’ and wasn’t a synonym for ‘such as.’ Moreover, the notion that one should avoid something as one would the plague was no longer a recognizable cliché, so even once I figured out the meaning I remained ignorant of its supposed cleverness.
The point is that clichés change with eras, with grammatical shifts, with political climates, with over- and underuse, with cultural relevance—so while it may be good advice to avoid clichés like the plague, the advice is useless if we don’t have a strong grasp of what makes a cliché a cliché and what effect they have on what we’re expressing. Some clichés persist for so long, and their constructions seem so astute, that they graduate from tired platitudes to near-truisms.
There is more danger in certain clichés than the risk of confusion, or the laziness of pat phrases—some of them perpetuate really lousy ideas that, if you stop to think about their implications for a few minutes, don’t hold up to scrutiny at all, and seem in fact to be effective only because they’re clichés, so common that people forget to question the inherent philosophies underneath them. Here are a few dangerous clichés that I hope we stop using—or at least cease employing them so reductively.
Don’t bite the hand that feeds you.
Ostensibly, this is an aphorism about gratitude, but it’s really more about compliance and deference and power. It’s a powerful metaphor, no doubt, but it seems to forgo any consideration of the hand and places all the onus and agency on the one being fed. How many people are abused and disempowered by the very ones on whom they depend for sustenance and survival? How many people in lowly positions (in a family, a relationship, a community) have had this cliché thrown at them as soon as they object to their treatment? How many legitimate claims of abuse and corruption have been dismissed in the platitudinous chasm of such a phrase? Sometimes biting the hand that feeds you—standing up for yourself against even those forces who “provide” for you—is the bravest and wisest choice a person can make. Bite those damn hands, rip them savagely from their wrists, if they don’t allow you use of your own.
What does not kill you makes you stronger.
This is usually attributed to Nietzsche, who did, in fact, write something along these lines in Twilight of the Idols, but its ubiquity has come from the profusion of its paraphrasing. From G. Gordon Liddy to Conan the Barbarian to Kelly Clarkson, Nietzsche’s aphorism has become one of the most pervasive ideas in the world—and one of the most accepted. To be sure, there is a positive element to this, a kind of consolation for having gone through something traumatic or unfair, a way of saying, ‘Hey, I’m sure that really sucked, but at least you learned something from it.’ But the sad truth is that what doesn’t kill you now will, eventually, kill you. Besides, there are people whose lives have been nothing but forces that almost—but didn’t—kill them, and to toss such an empty sentiment at such a life would be, in my mind, offensive. After all, what point is there to strength in a life of constant pain and suffering? The strength gained from these experiences would only be valuable if the sufferer were able to then use their newfound might for their own purposes. Otherwise, this cliché comes across as a dismissal of the severity of the trauma, a highfalutin euphemism for ‘get over it.’
Curiosity killed the cat.
Now this one is just stupid. Curiosity is one of the great virtues of humanity, and to warn against its ramifications sounds like the terms of a dictatorship: don’t ask too many questions or you’ll draw unwanted attention. Moreover, a cat’s curiosity is, in this context, indiscriminate and ignorant—it is rather like a blind man’s stumble through an unfamiliar room. In life, our inquisitiveness stems from uncertainty, progress, development, ambition, intelligence, and hope. It is the base of all learning and the engine of every breakthrough. Curiosity hasn’t killed us—it has made us stronger.
Like comparing apples and oranges.
As Chuck Klosterman pointed out in Sex, Drugs, and Cocoa Puffs: “Apples and oranges aren’t that different, really. I mean, they’re both fruit. Their weight is extremely similar. They both contain acidic elements. They’re both roughly spherical. They serve the same social purpose. With the possible exception of a tangerine, I can’t think of anything more similar to an orange than an apple.”
So, yes, not only is the phrase literally unjustified, but there’s something else, too, something more sinister. Our brains’ ability to create analogies is, as Douglas Hofstadter and Emmanuel Sander convincingly argue in Surfaces and Essences, the foundational component of cognition. When confronted with something new, one’s mind will immediately and unconsciously compare it to anything similar from the archive of experience—and this is the case with the most infinitesimal details, too, not merely big issues. The suggestion, then, that any comparison, no matter how seemingly strained, is somehow faulty—because, as the cliché intimates, one should only compare things that already share qualities—is not only stupid in its literal content but fundamentally anti-human. To state that someone shouldn’t compare apples to oranges is basically to tell them never to learn anything, and, worse, to distrust the part of their mind that makes these analogies instinctively, and for good reason.
Don’t cry over spilled milk.
This is another one of those just-accept-your-lot-in-life kinds of idioms. The ostensible meaning here is that complaining is useless in unchangeable situations. First of all, in order to say this with a straight face and a clean conscience, one must understand which situations can’t be changed. Otherwise, you’re just telling someone to shut up. What, for instance, would early civil rights leaders and LGBTQI activists say to such condescension? Spilled milk is supposedly a trivial occurrence, but would it be so minor if it were a child’s only milk? What if the family couldn’t afford to replace it that day and the kid had to go without? The child’s tears, in this case, would not be brattish or privileged but tragic, and instead of going around telling people what they should or should not complain about, maybe it would be more useful to listen to the cries and try to help.
Every dark cloud has a silver lining.
Another seemingly positive sentiment with chilly undercurrents. It implies that all tragedy has some beneficial component that should be focused on. Though sometimes it is undoubtedly useful to turn away from a horrific or unfair situation and place your attention on some unlikely but positive outcome, it is just as often right to directly confront the ‘dark cloud,’ even if it means standing outside in the rain. Many tragic events benefit no one in this way, only the perpetrators of the initial calamity, or not even them: what, again, is the silver lining of genocide? It is a privilege to assume survival, to suggest goodness from badness, but it’s not always that way, and to dismiss the original pain like this just seems unnecessarily mean-spirited. Sometimes dark clouds have no silver lining whatsoever, no light behind them highlighting their edges—sometimes it just storms, and all we can do is take whatever shelter is available, and suffer the torrents.
Live by the sword, die by the sword.
This phrase, to me, appears most often when a person is killed in a way that befits their lifestyle, as when a drug dealer is shot, or a daredevil fails to complete a dangerous jump. What seems to be suggested is not the notion that the way we live may relate to the way we die but something dubiously close yet very different: that these victims, if they did not quite deserve their fates, were at least aware of the risks involved—which is somehow meant to make us less sad over the loss. As if that means anything other than, to invoke another cliché, if you play with fire, you will get burned. In less tactful usage, this cliché seems to make people feel better about the order of things, as if, by employing it, they feel safe knowing that they won’t die a similar death because they avoid such lifestyles. It is a way of distancing one’s self from everyday tragedies, and of placing retroactive blame, to borrow poet Susan Eisenberg’s phrase, on those failures, the dead.
A chain is only as strong as its weakest link.
As a mere descriptor, this phrase isn’t wholly off the mark. But in use it is typically a criticism of the so-called “weakest link,” even though the very idea of a chain—that is, of any interlinked system—undercuts such criticism. After all, every link in a chain is responsible for every other link, if a functional chain is the end goal. And since all links are equal, a “weak” one can’t be inherently or irrevocably feeble through its own faults; its weakness is, rather, the result of the chain’s effect on its capacity. In other words, how can we know “weakness” without defining “strength”? And if we’re defining strength in terms of the standard capacity of a given link, then the “weak” link is only judged so in comparison—a comparison based on an original definition about which it had no input, and a comparison it did not choose.