The Need for Factual Fiction

Editor’s Note: Last week I wrote a blog post that touched on the relationship between fact and fiction in Sony’s controversial film The Interview and in another film from more than 70 years ago, Orson Welles’ Citizen Kane. This week I am honored to present this guest post by Alton Gansky, an accomplished author of more than 40 books and the director of the Blue Ridge Mountains Christian Writers Conference. Gansky also examines the relationship between fact and fiction, this time in the classic movie Inherit the Wind. I think you will enjoy his perspective.

By Alton Gansky

In January of 2015, Baker Books will release my latest nonfiction work, 30 Events That Shaped the Church. It comes on the heels of the 2014 release, 60 People Who Shaped the Church. Some are surprised to learn that I write book-length nonfiction. True, most of my books are novels, but I also enjoy and see great value in producing nonfiction as well.

While preparing 30 Events I went through a long list of possible topics. In the end, one chapter caught my attention and so infiltrated my mind that I’m still researching it long after I turned the manuscript in. As I worked through the centuries I came upon a weeklong event that most of us have heard of but few of us know much about: the Scopes “Monkey” Trial of 1925. When I research I try to keep my mind free of bias, which is a difficult thing to do. Still, I thought I knew a fair amount about the “Trial of the Century.” I didn’t.

Part of my preparation was to watch the old 1960 movie Inherit the Wind, based on the earlier stage play by Jerome Lawrence and Robert E. Lee. I remembered it as one of the best movies ever made, made all the more memorable by actors like Spencer Tracy and Fredric March, as well as Gene Kelly (no dancing in this movie) and Dick York (later of Bewitched fame), who portrayed John T. Scopes (Bertram T. Cates in the movie). This time, I watched the movie with a critical eye and was surprised at how far it had strayed from the truth.

To be fair, Lawrence and Lee, as well as director Stanley Kramer, went out of their way to alert viewers that they were watching a movie, not a documentary. The movie begins with these words:

Inherit the Wind is not history. The events which took place in Dayton, Tennessee, during the scorching July of 1925 are clearly the genesis of this play. It has, however, an exodus entirely its own. [. . .] So Inherit the Wind does not pretend to be journalism. It is theatre.

I appreciate the honesty of the writers. Still—and this is the problem with some types of fiction—many took the events as historical fact. To this day, people who have seen the movie think:

William Jennings Bryan was a glutton. (He was a diabetic on a very strict diet at the time of the trial.)

Clarence Darrow crushed Bryan’s beliefs as the latter sat on the witness stand. (Darrow ridiculed people of faith, but it had no impact on Bryan.)

Bryan was a buffoon. (He ran for president three times, was a great orator, served as Secretary of State, and was a gifted writer.)

The townspeople of Dayton wanted to hang Scopes from a tree. (Nothing of the sort happened.)

And Bryan died in the courtroom, the victim of Darrow’s grueling examination and ridicule of biblical stories. (Bryan died five days later from complications of diabetes. He remained active in the days following the trial.)

When I was in college, my psychology professor told the class that the human mind has trouble distinguishing between reality and fiction. It is the reason we jump in scary movies or tear up reading a sad scene.

All of this is to say that we as authors of fiction need to take care how we represent figures and events in history. Lawrence and Lee went so far as to change the names of the characters (although they also went out of their way to make the actors look like William Jennings Bryan and Clarence Darrow). Despite their efforts, fifty-four years after the movie (longer for the play that preceded it) people still think the movie is trustworthy history.

This realization puts a burden of responsibility on the shoulders of novelists. While the novelist’s goal is to entertain, we in the Christian market also want to edify and to do so we need to be as accurate as we can be when portraying real people.

William Jennings Bryan’s reputation and work were sullied by the play and later the movie, despite the authors’ and director’s efforts to make clear their story was only loosely drawn from the real 1925 court case. Nonetheless, many have taken the fiction as fact.

We novelists take some needed liberties in our creations, but when they involve real people from the past (or worse, vaguely disguised characterizations of living people), we run the risk of doing harm.

Alton Gansky has written over 40 books of fiction and nonfiction. His latest work, 30 Events That Shaped the Church: Learning from Scandal, Intrigue, War, and Revival, releases in mid-January 2015. He is also the director of the Blue Ridge Mountains Christian Writers Conference. www.altongansky.com

Sony’s The Interview, Citizen Kane, and the Power of Story

The controversy over Sony’s film The Interview, and the hacking attack the company endured in response to it, illustrates a principle I teach every day as a literature professor—the Power of Story. It shows how a fictional narrative that on the surface does no harm to anyone can still be perceived as such a threat that people will go to extreme lengths to stop it. The incident also reminds me of another instance when powerful people tried to squelch a movie they saw as a threat—Orson Welles’ Citizen Kane in 1941.

First, a few thoughts about Sony. Consider what triggered the devastating attack that cost the company tens of millions of dollars and brought turmoil to the lives of studio executives, actors, theater owners, and many others.

Was it a dangerous new weapons system pointed at North Korea?

Was it a new round of economic sanctions that caused suffering for that nation or its leaders?

No. It was a story. Not even a true story, but a silly, unremarkable comedy that, without the attacks that accompanied it, probably would have been quickly forgotten.

Why Not Simply Ignore the Film?

Some have asked why a dictator or anyone else would care about such a frivolous piece of entertainment as The Interview. Why not just ignore it?

Maybe those who hacked Sony fear the movie because they know, as I teach in my college literature courses every semester, that stories—whether novels, films, plays, or other genres—are far more than “entertainment.” They often shape our perceptions, and they shape us even more than “reality” does. Stories may inspire, thrill, challenge, and teach, but they also may threaten.

Think of the most powerful films or novels you have seen and read. Aren’t those stories as moving and life-changing as any “real” incident you have experienced or heard about? Think of how stories have shaped your perceptions of places you’ll never visit, historical periods that otherwise would be only hazy in your mind, and aspects of human experience into which you otherwise would never have delved. For example, when I think of what I know about the World War II era and where that knowledge and those perceptions came from, I have to acknowledge that far more of it came from fictional stories about the era than from my direct reading of history.

Stories move us and shape our view of reality. So it makes sense that a dictator would believe that the world’s perception of him might be shaped by this film, even if it’s an over-the-top comedy.

Citizen Kane as an Example of a Film that Defined a Real Person

Controversy over another film more than half a century ago shows just how powerful a movie can be in shaping the public’s perception of a real person. In 1941 RKO released Orson Welles’ movie, Citizen Kane, which some scholars have labeled the greatest film ever. The movie is loosely based on the life of William Randolph Hearst, whose vast publishing empire made him one of the most powerful cultural figures in America in the first half of the twentieth century.

The movie gives an unflattering (though, in my view, not entirely unsympathetic) portrait of Hearst and his mistress. Hearst knew the power of story, since that is how he made his living, so he and his allies fought the release of the film by pressuring distributors not to make it available to theaters, by ignoring it in the pages of their newspapers, and by other threatening tactics.

As with the hacking campaign against The Interview, the campaign against Citizen Kane was only partially successful in suppressing it. The controversy itself made people want to see the movie, and it did reach the public in a limited way. It got good reviews, but then it essentially disappeared from public view in 1942 and did not emerge into the public consciousness again until the mid-1950s, when RKO sold it to television.

Is Kane Hearst? Is Hearst Kane?

That’s when something interesting happened. As the film gained popularity and exposure, the memory of the actual life of Hearst himself, who died in 1951, faded from public perception. David Nasaw, who published an excellent biography of Hearst in 2000 and showed what a fascinating and complex man he was, calls Citizen Kane a “cartoon-like caricature” of a man who was actually very different from the Kane we see on screen.

However, over time, Nasaw writes, “the lines between the fictional and the real have become so blurred that today, almost sixty years after the film was made and a half-century since Hearst’s death, it is difficult to disentangle the intermingled portraits of Charles Foster Kane and William Randolph Hearst.”

In other words, the good story won out over reality. Even though people may be told that Hearst and Kane are different, and even though talented biographers like Nasaw might try to set the record straight, when you mention Hearst to most people, they’ll think of Citizen Kane.

The hacking attack against Sony is reprehensible, but the North Korean dictator may be correct about one thing: the fictional story may be how most people remember him.

Will Novels, Movies and Video Games All Blend Into One?

Is the day soon coming, or has it already arrived, when consumers won’t see much difference between reading a novel, watching a movie, and playing a video game?

Over the past year, I have seen lots of evidence that the boundaries that used to separate these and other categories are breaking down.

For example, until recently, if you planned to read a celebrity’s autobiography, that meant you went out and bought a book, which you would read page-by-page as the author reflected on his or her life.

Now, however, that is the old-fashioned way to do it. Today I saw an article about the actor Neil Patrick Harris’s autobiography, which takes a much different approach. It is an interactive autobiography, which shares similarities with a video game. The description of the “book” on Amazon.com asks, “Sick of deeply personal accounts written in the first person? Seeking an exciting, interactive read that puts the ‘u’ back in ‘aUtobiography’?” The reader of Harris’s book doesn’t simply read about the actor’s life, but lives it: “You will be born to New Mexico. You will get your big break at an acting camp….Even better, at each critical juncture of your life you will choose how to proceed. You will decide whether to try out for Doogie Howser, M.D. You will decide whether to spend years struggling with your sexuality. You will decide what kind of caviar you want to eat on board Elton John’s yacht.”

All these choices have consequences for the reader: “Choose correctly and you’ll find fame, fortune, and true love. Choose incorrectly and you’ll find misery, heartbreak, and a hideous death by piranhas.” As if that were not enough, the book also contains recipes, a song, and magic tricks!

The Hobbit: Book, Movie, or Video Game?

Another example of category-blending that stands out to me is the most recent Hobbit movie, The Desolation of Smaug. The category-blending I’m referring to is not the fact that I first experienced The Hobbit as a book and now it is a series of films. Books and films are still separate categories. I am talking about the blending of categories within the film itself.

As I watched the movie, there were times when I couldn’t help but think I was actually experiencing a video game, especially in battle scenes that felt entirely different from anything I remember from the book. In one part, for example, dwarves rush down a raging river in barrels as orcs (many orcs) attack them and as elves attack the orcs. I half expected the elves to get 100 points per orc or the dwarves to get bonus points for making it past certain barriers. It was an exciting scene, but in those moments it didn’t feel like a movie.

Many action movies have that video game feel now, as bad guys (or creatures, or robots, or other villains) get wiped out in large numbers in battle sequences that seem to go on for a very long time. Think of the Transformers movies or Dawn of the Planet of the Apes or many others. Many scenes could be transferred almost directly into a video game.

As Movies Become Games, Games Become…Movies? Books?

Of course, as films become more like video games, many video games, with their more elaborate plots, complex characters, and lush and realistic visuals, now feel more like films. Or maybe it would be more accurate to say they have begun to resemble television series, like Breaking Bad or The Sopranos, with storylines that extend over longer periods and characters that can become as familiar as the real people in our lives.

That depth of character and plot sophistication, found in recent TV series such as Mad Men and Downton Abbey, reminds many viewers and readers of yet another category of storytelling: the novel.

“Reality” Now Merely Another Story Category

Now, even the category known as “reality” is breaking down. I don’t mean reality television, which is its own category-blending genre; I am talking about real life itself. It used to be that video games copied reality. You played a game to feel what it was like to fight in a battle, or race a car around a track, or ski down a slope. Now experiences are being created that reverse the process, bringing the thrill of video games into real life.

The New York Times reported this summer on an experience called Escape Rooms, in which people are trapped together in a room and are given clues and puzzles and codes to solve in order to escape. It’s a video-game-like experience, but without the video. You’re in a real room with real people, and you’re really trapped (although you’re eventually set free even if you don’t solve the clues).

Not everybody likes these trends. When some people go to a movie, for instance, they don’t want a video game stuck in the middle of it. They want their categories pure. On the other hand, there has never been a time when people had more ways to enjoy storytelling in every imaginable form. My prediction is that as time goes on, the categories will break down even further, and more and more viewers/readers/players will come to expect these inventive techniques.

Why the Cell Phone May Save the Novel

I like to watch people’s reading habits when I’m at airports and on airplanes. During several recent flights, which included some lengthy layovers and delays, I noticed that not very many people were reading novels, at least not ones in the form of books made of paper. I didn’t see all that many people reading on tablets, e-readers, or laptop computers either. What I saw, more than anything else, were people reading on cell phones.

The people reading on their cell phones were not necessarily reading novels, of course. I personally would not want to read a novel on a cell phone, nor would I want to watch a movie on one. The screen is too small and uncomfortable. But not everyone sees it that way. Readership studies show that many people do like to read books on cell phones, and the numbers are increasing.

How Americans Read

According to a Pew Research study of Americans’ reading habits, last year 32 percent of e-book readers 18 and older read books on their cell phones. That is higher than the percentage who read e-books on a computer (29 percent). The largest shares of e-book readers still read on e-readers (57 percent) and tablets (55 percent), and many people use multiple platforms.

As a novelist and professor of literature, I am very interested in the future of the novel and have written elsewhere (including in this article in APU Life magazine) about my concerns that its popularity may wane in this era in which readers are used to being entertained by shorter chunks of information such as tweets, Facebook posts, blogs, and YouTube videos. Will readers who have grown used to skipping from post to post still have the desire, focus, and stamina to work their way through a 350-page novel that contains nothing but words? Have we grown too distracted?

I still worry about that, but this trend of reading books on cell phones, more than any other trend, gives me hope for the novel. Why does the cell phone make so much difference? One reason is that, unlike an e-reader or even a tablet, the phone is something most people almost always have with them. If they get caught up in a novel, they might find themselves dipping into it at times when they otherwise would not be reading anything—in a doctor’s waiting room, in a line at the store, in an airport terminal. Potential readers who might never think to bother with a novel in other circumstances—such as going to a bookstore to buy one or messing with an e-reader—might be more likely to read one if they could easily access it from their phone.

How the Cell Phone Increases Reading Around the World

Another reason I think the cell phone reading trend is good news for the novel is that the practice is even more prevalent in other countries, especially developing countries in places such as Africa, than in the United States. A study by UNESCO, the United Nations Educational, Scientific and Cultural Organization, showed that 62 percent of people surveyed in developing countries now read more because they are able to do so on cell phones. In many countries covered in the study, such as Ethiopia, Ghana, India, and Nigeria, physical books are prohibitively expensive, while open-access books cost as little as 2 or 3 cents each.

The UNESCO study showed that about a third of readers in these countries use their phones for reading books, and about 80 percent of the population has access to cell phones. What kinds of books are they reading? The study showed that the most popular genre was romance novels.

In addition to the countries covered by the UNESCO study, other nations also have a large number of people reading books on cell phones. One report indicates that more than 25 million people in China read books only on their cell phones.

I believe this trend in reading habits worldwide will not only help the popularity of the novel, but will also lead to changes in the novel itself. The novel of the future may look significantly different from novels written in earlier eras. I plan to comment on that issue in a future blog post.

OJ Simpson? Never Heard of Him, Or Johnny Carson Either

When I started seeing the headlines and news segments marking the 20th anniversary of the OJ Simpson murder trial, my first thought was that the whole tawdry saga still felt too recent to be wrapped in nostalgia. My next thought was that, as a college professor, I have seen a big shift over those twenty years in how students perceive the OJ Simpson case.

In the first few years immediately following our culture’s fascination with the Bronco chase, bloody gloves, Johnnie Cochran, Marcia Clark, Judge Ito, and all the rest of it, I could refer to the Simpson case any time I needed an example of an event that captured the attention of an entire culture, an event that you couldn’t get away from even if you wanted to, and that everyone seemed to have an opinion on.

I teach literature, and in one course we read some literary works that sprang from another “trial of the century” about a hundred years earlier. That was when Lizzie Borden either did or did not take an axe and murder her father and stepmother in their home in Fall River, Massachusetts. Borden, like Simpson, was acquitted, even though many people thought her guilty. The Lizzie Borden case still has a big following (the home where the murders happened is now a hotel that caters to fans), and many movies, short stories, articles, and other works have been devoted to it. Why?

It’s just like the OJ Simpson case, I used to say. Why did everybody want to watch it? Why was the trial carried on so many TV stations? Why was it the talk of the nation? Students could immediately see the connection.

Recently, however, teaching the same literature course at the same university, I tried to use the Simpson case as an example, and all I got were blank stares. OJ Simpson? Some students had a vague idea who he was, but not one knew anything about the case.

The 20th Century as Ancient History

The Simpson case is only one of many twentieth-century references I have had to drop. A 20-year anniversary of anything means that it happened before most of today’s college students were old enough to remember it.

The Best Five Answers: If You Could Improve Your Life in One Way, What Would It Be?

What one thing would most improve your life? More money? Better health? More sleep? A dog? Two brains? More motivation? More exercise? More time? Better relationships?

Those were among the answers I received to this week’s question. Other people focused on things they would like to delete from their lives rather than on anything they might add. They wanted to get rid of stress, eliminate self-criticism, or abolish peach-flavored yogurt (see below).

I invite you to read the Best Five Answers to this week’s question, and please take a moment to answer the new question at the end.

If you could improve your life in one way, what would it be?

5. “If I could improve my life in one way it would be the elimination of the peach flavor from the Yoplait Yogurt variety pack at Costco. I have a stack of that damn peach flavor in the back of my fridge. I can’t get a new box until I finish them, and I just can’t stand the sight of them.”

–Joey Smith

4. “You know, this is a question I find myself pondering every now and then, particularly when I’m having a bad day for whatever reason. And every time I do I put together a mental list of things that I think might somehow make my life better. And yet I always end up reaching the same ultimate conclusion: I’ve got a family that loves me, I grew up to have the career I wanted ever since I was a kid, I’ve got friends who accept me for who I am, I’ve got a roof over my head and I’m able to pay the bills (late sometimes, sure, but they do get paid). So really, what is there to improve?”

–John Small

3. “To re-engineer this 53-year-old body to an 18-years-of-age body and take it back through the fun times I had to get it to the shape it’s in today.”

–Terry J. Robichaud

2. “Only do the things where I add the most value, and delegate the rest!”

–Bruce W. Martin

1. “Let go, stop worrying, and finally accept that this is only a journey to our final home.”

–Jerry Vachon

Please answer next week’s question in the comments section:

Would Jesus use Facebook?

The Best Five Answers: What Time Period Do You Wish You Had Been Born In, and Why?

I have always felt that I was born in the wrong era. Like the main character in the film Midnight in Paris, I suspect I would have fit in better in the era of some of the literary geniuses I admire from the 1920s and ’30s—writers like Thomas Wolfe, Ernest Hemingway, F. Scott Fitzgerald, and Gertrude Stein. This week’s question was:

What time period do you wish you had been born in, and why?

I received some fascinating responses to this, and it was hard to choose the Best Five Answers. A surprising number of respondents said they are content with the era they were born in. The 1800s was a popular era, and so was the future. Will Cook would have been happy to have been born in 2125, when he imagines he would have a “half-synthetic, half-flesh body,” and Kay Smith would like to have been born in an unspecified time in the future “when women are treated as fully human in the church.”

With many great responses to choose from, here are the Best Five Answers, followed by next week’s question:

5. “16,000 BC in current day location of Cyprus. Pretty sure I could have walked to Atlantis and witnessed the true precursor to the pyramid civilizations before global meltdown and flooding covered them. (Yes, I am serious and, no, I’m not a nut!).”

–Robert Green

4. “The same period I was born in, 1983. The reason being is I’ve gotten to see so much technological changes/innovations as I’ve grown up. One of the first things I learned to do was operate a record player. CD’s took off and now digital downloads are the norm. Same goes with music videos, used to VH1 or MTV was the go to place for that, now you can pull up YouTube and see just about any video you want. The Internet also has made it easier to stay in touch, reconnect or make new friends.”

–Nathan Webb

3. “Before watching Midnight in Paris, I probably would have answered the question with the 1840s because of the music, fashion, and historical events. After watching Midnight in Paris I realized, like Owen Wilson’s character, that I wouldn’t want to be born at any other time than I was. He rightly says something along the lines that the present is unsatisfying because life is unsatisfying. This, right now, is my Golden Age.”

–Sara Flores

2. “I wish I had been born in the 2200s because then I would probably be able to teleport. But hopefully the world’s resources wouldn’t be decimated by then….”

–Abbi Mleziva

1. “I would be born in 1935, I would be but a child as the war swelled and then ebbed, just old enough to have been able to look up over London into a rumbling sky filled complete by thousands of USAAF B17 bombers, each guided by diesel propellers leaving four elegant streams of blue trailing behind the formation. Fighter escorts aligned like geese surround the bombers top, bottom, and side. The entire earth would rumble as countless thousands of steel bombers and fighters ripped through the grey London skies on towards Germany–the might of American economies of scale and mass production all slowly growling out over the English Channel to break the back of the Axis. Minutes pass and finally the sky would be empty again for hours before evening when the steel birds would come limping back overhead, bombless and bleeding black smoke. These thunderous fleets of aircraft will never again be witnessed—technology raced ahead so quickly that war waged in the skies is now invisible and supersonic and remote. Men don’t take to the sky by the countless thousands lined as far as they eye can see now. And this I lament because of its ephemeralness. It must have been a strangely harrowing sight to peer up, nine years old, bright blue eyed, and see no sky but only smoke and steel.”

–Brian Kraft

Now I invite you to respond to the question for next week:

If you could improve your life in one way, what would it be?

The Best Five Answers: What is the Best Way You Have Found to Handle Disappointment?

Let’s get one thing settled from the start. According to the answers I received this week, can you guess which treat was mentioned most often as disappointment comfort food?

Ice cream.

I am happy to launch the first post in a new blog feature called “The Best Five Answers.” Each week (for now) I plan to ask a question and seek answers from friends, students, blog readers, strangers, and anyone else who is interested in chiming in. Then I will choose the best five answers and publish them on the blog.

If you would like to participate, please see next week’s question at the end of this post.

The first question in the series was, “What is the best way you have found to handle disappointment?”

I loved the answers I received, and many of them made me rethink my own behavior in the face of disappointment. Here are the Best Five Answers:

5. “I remember that I don’t ‘deserve’ anything. Everything in life is a gift, even just the fleeting chances.”

–Charles Crowley

4. “The best way I have found of handling disappointment is to journal. When I’m really disappointed I don’t like to talk about my feelings, but journaling seems to help me make sense of things and process what happens. It’s like a silent prayer and helps bring healing.”

–Alyssa Martin

3. “I handle disappointment by praying about it and talking it out with those around me that I consider my support team. Disappointments are always difficult to take in the moment and I need to express that to someone to avoid inner build-up, but in the long run what I often find is that God had some other plan for me. That plan is usually considerably better, so when faced with situations where things go awry I try to remind myself of all my good ‘disappointments.’”

–Anna Christensen

2. “The best way I have found of handling disappointment is not to dwell on it and to accept that there are events in this world that I can’t control. Letting disappointment engulf your thoughts ruins your ability to enjoy the next thing life throws at you. I’ve endured the most extreme disappointments and I’m probably the happiest person you’ll ever meet.”

–Karly Adair

1. “Throw a short pity party for myself, eat something sinful, and then give thanks for the blessings that I have. Even with big disappointments, my glass is still way over 50% full.”

–Kenneth Litwak

Thanks to all who responded and to my Facebook friend Dennis Skarvan, who alerted me to the photo of Cape Disappointment, in Washington.

Now I invite you to respond to the question for next week:

What time period do you wish you had been born in, and why?

Quit Griping that “Everybody Gets a Trophy”

I’m tired of hearing about the “Everybody Gets a Trophy” generation. When I recently heard someone use that phrase again, I wondered, was it just my imagination, or were people constantly using that cliché to describe today’s generation in their teens and twenties?

I Googled “everybody gets a trophy” and came up with nearly a million articles, blogs, news stories and other items that use the phrase, so I guess I’m not imagining it. I read through some of the endless news commentaries and blog posts about this “syndrome,” as some of them call it, and I ended up even less convinced about it than I was before.

The basic concept is this: These young people are the spoiled products of a self-esteem culture in which they (or at least their parents) are afraid of failure. In order to shield these delicate egos from any hint of mediocrity or failure, their parents sign them up for soccer teams, baseball teams, and other activities in which everybody gets a trophy regardless of talent, achievement, or actual victory over the other team.

Because of this, the kids grow up thinking they’re far more talented than they really are, and they expect unqualified approval in every area, from academics to sports to the world of employment. Coddled and arrogant, they fail to learn how tough life really is. Their weakness of character erodes our entire culture, and someday they’re in for a rude awakening.

What utter nonsense.

As a college professor, I have spent years interacting with people in this generation. As a parent of teenagers who play sports, I have spent years watching what happens with the trophies. I simply don’t buy the usual “everybody gets a trophy” analysis.

Let’s start with the trophies themselves. My son and daughter have played on many teams in several different leagues over the years. They have played competitive softball, baseball, basketball, football, soccer, and probably a few other sports I’m forgetting. They do get trophies in most of those, or medals, which amount to the same thing. On most of the teams it is true that everybody gets one of these, win or lose. The trophy or medal is bigger or better if the team wins the championship, but everybody gets something regardless of the outcome.

The thing is, kids are smart about these things. I have never seen my kids or any of the other athletes interpret these medals or trophies as signs that they are all winners or that the loss of a game or championship is somehow not a failure. They understand failure. They know who the good players are and who the bad players are. If they’re not as good as the other athletes, it certainly doesn’t take the withholding of a trophy to make that clear to them. They know. Their teammates will make it clear to them in many ways, and so will the coaches, and so will the spectators. It’s absurd to think that a trophy or lack of a trophy changes that.

What, then, is the purpose of giving a medal or trophy to everyone? My own kids have their medals hanging in their rooms, and the trophies are displayed on shelves around the house. These are not signs of egotistical triumph. They are mementoes of being on the team. My daughter also has the caps from her various softball teams displayed on the wall on one side of her room. They’re a way of remembering the experience.

I once worked at a magazine devoted to the sport of trapshooting. At a national tournament, I worked at the booth where we gave out metal pins that commemorated the event. Most tournaments gave out such pins, and competitors would attach them to their caps or shooting jackets as a way of showing how many tournaments they had competed in. They were eager to get those pins, and when we ran out of them one day, they were angry until we got some more. The pins were meaningful, but they did not signify success. They were simply souvenirs. That’s the way it is with the trophies.

The men and women who coached my kids were certainly not interested only in the athletes’ self-esteem. They worked them hard. They taught them. They punished them. They encouraged them. They wanted them to win. My kids have plenty of trophies, but they have also tasted plenty of failure.

Regardless of what generation we’re in, life offers abundant lessons on how to handle failure. I wouldn’t be too worried about a few extra trophies being handed out.

What I Wish Someone Had Taught Me About Writing

What is the best way to approach a writing task, whether as a professional writer or a student? Do you procrastinate until the last minute and then start writing on page one and hope for the best? Or is there a better approach? My friend and APU colleague, Tom Allbaugh, confronts that problem in a very helpful guest post this week. Dr. Allbaugh is an accomplished writer who is celebrating the release of the second edition of his excellent writing textbook, Pretexts for Writing. I think you will enjoy what he has to say.  

What I Wish Someone Had Taught Me About Writing

By Tom Allbaugh

In the first chapter of Pretexts for Writing, I tell a story about when I was a student in a freshman writing class. I tell of how I waited, like most students I see today, until one or two nights before the deadline to get started on my research paper. Even though this was 1974 and I had to write on an old typewriter, I pretty much started by sitting down to write what I hoped would be the final draft.

Teachers call this “top-down writing.” We see it all the time in the movies. The writer starts typing without planning, hoping that inspiration will show up. In the movies, of course, the writer becomes rich and famous. In real life—in my life, for that first college assignment—I struggled to complete six pages. I didn’t even think about my main point until well into that “final” draft.

Many of my students have told me that they like that I tell this story. They say that it helps them connect with my ideas. I’m glad that my plan to demonstrate an idea also serves the second function of connecting with my audience.

Why Didn’t Someone Tell Me?

Today, I do often wish that someone had taught me that writing needs to be planned. A plan can be simple and personal, but it will usually involve us in generating ideas, thinking about genre, and making audience considerations.

The writers I know or have read about in interviews sometimes discuss their composing process, and their approaches can be idiosyncratic. We know, for example, that C.S. Lewis took long walks. Beethoven did this also, planning his works as he went. Looking at the fragmented writing in his notebooks, with his scratched-out notes and revised ideas, anyone can see the years of work it took him to sketch out his symphonies. Some have suggested that it took this composer a lot of digging to connect with his unconscious. Getting the unconscious into the writing act is perhaps what prompted Ernest Hemingway to stand at a typewriter set at chest level and Mark Twain and Truman Capote both to write lying down.

Especially among creative thinkers, planning usually has this “mental” element to it, but it also draws writers into the more conscious work of considering the kind of writing being attempted and who their audience is.

When I started out in college, I wouldn’t have thought like this. At eighteen, I worked from the belief that writing an argument or a research paper or a novel required only inspiration and self-expression. This is also probably why the research paper task always seemed so daunting to me. None of my teachers ever told me that I should probably plan what to write about. As early as the fifth grade, I was told about revision and that I should write an outline. But outlining is an organizing strategy and, suspiciously, does not always allow for other kinds of planning.

What I Know Now

Today, even in those rare instances when I get inspiration, I still know enough to allow myself time to generate more thoughts before I start. The planning can vary—brainstorming, free-writing, or conversation will work—depending on what I am writing. There’s much room for variation. Probably the only exception to this rule is when I write a journal entry.

But this is what I wish I’d been taught from the very start. So I have organized Pretexts for Writing to begin with planning, with what writing and speech teachers since Aristotle have called “invention.” This opening, I hope, will encourage thinking about different aspects of planning.

Thomas Edison is supposed to have said that his work involved about 5% inspiration and 95% perspiration. I may be off on his numbers just a little, but his point is clear. Inspiration is overrated. But by just getting to work and making some plans, I can usually generate some inspiration.