December 2, 2016
Remember the good old days when strict gatekeepers had total control over what got published? It wasn’t all that long ago, when you think about it. Some of us still have scars from trying to crash those gates. Getting slapped down by relentless rejection was bad enough, but what about all the rules that these guardians of literature enforced? If we were fortunate enough to get replies from them at all, we’d receive a dressing down about all the strictures we had violated in our pitiful attempts to write. We would not merit a second glance, much less true consideration, until we mastered the various commandments they had set up to keep the barriers in place.
There were various telltale signs, the gatekeepers said, that pegged you as an amateur. One of these was overuse of adjectives and adverbs. Well, maybe the occasional odd adjective could be allowed, they conceded, but adverbs were strictly verboten. Likewise, if a character said something, he or she could only say it. No declaring or exclaiming or expostulating. And no exclamation points, ever! Above all, we must avoid overwriting and exposition, two deadly sins that went hand in hand. But even the masters do that, I might protest. What about John Updike with his multiple metaphors, and Pat Conroy with his lush descriptions? But those are famous guys, I was told. They get to play by their own rules, or none at all. Someone without a name had to grab the reader’s attention on the first page, if not the first sentence, since even in those pre-social media days, there were distractions at every turn to keep people from reading. In fact, the average attention span was so minuscule that if a prospective reader didn’t get instantly hooked, he couldn’t be blamed for turning on the TV rather than proceeding to page two.
Prior to the advent of self-publishing, when it seemed next to impossible for a beginner to publish a novel, I tried my hand at a few short stories. It proved equally difficult to penetrate any but the most amateur story markets, the type that paid in copies of the magazine. I decided the only way to get beyond the endless “not suitable for us, but good luck” responses was to pay for advice from a reputable source. I sent several stories to a critique service run by the editor of a well-regarded literary magazine. She tore them to shreds, although not without offering tidbits of encouragement here and there. She offered to go on working with me, since I showed a hint of promise. I didn’t take her up on that offer, since I soon gave up trying to write stories and instead decided to work on novels, which would at least offer a greater reward if successful. My critique contact likewise gave up criticizing stories, saying she was inundated with too many bad ones. She, too, decided to concentrate on longer manuscripts.
I recently reread one of the stories I submitted to this service. It was called “Cheryl’s Lunch Hour,” and was based on what I thought was a clever, if somewhat implausible, plot twist. It centers on a federal government secretary in her mid-twenties, who is mentoring her sister, twenty-year-old Rosie, a gifted dancer. Both girls live with their parents, who are fairly unimaginative about the sisters’ goals. Cheryl frequently uses her lunch hour to take an arduous trip from her Washington office to the Maryland suburbs to watch Rosie rehearse for an upcoming small-theater production. She believes in her sister, although she is embittered by her comparatively boring job and a jealous streak she can’t quite suppress, since she once had theatrical aspirations herself.
One day, Cheryl runs into a talent scout outside the theater who mistakes her for her sister. For a fleeting minute or two, Cheryl wonders if she can turn this case of mistaken identity to her advantage. Maybe if she dieted and practiced, she could dance again herself, with perhaps less skill but with a depth of maturity that her sister does not yet possess. Her head explodes with dreams. Could she possibly carry off this deception? By the time she gets back to her job (which she had dreamed of quitting on the spot), she’s shaken off the fantasy and resumed her main chore of typing spreadsheets.
No doubt I put both the story and the critique aside for many years because the criticism was harsh. Was my critic too enamored of little rules? Perhaps, as she jumped on every instance of closely spaced sentences that repeated common words and phrases such as “she,” “the,” “it is,” and “there was.” She pointed out the redundancy of sentences such as “She scolded loudly.” She denounced as a cliché my opening device of the main character waking up in the morning with yesterday’s problems swirling through her head.
I violated plenty of bigger rules as well. I was scolded for having no hint of the central story problem on page one. In my critic’s judgment, the story lacked a sympathetic viewpoint character, since “Cheryl is jealous of her sister, yet wants to use her to feel good about herself.” But she’s not a total bitch, I protested, just discontent with her life and understandably envious of her talented, younger sister. Is she any less sympathetic than the pedestrian parents or the arrogant, oblivious Rosie? It seemed the critic found all of them pretty despicable, except perhaps the superior at Cheryl’s office who has agreed numerous times to cover for her during her prolonged lunch hours. Even he has an ulterior motive … to start a relationship with her. “Central characters and villains shouldn’t be all good or all bad,” my critic lectured, rather obviously. Additionally, what I considered the clever trick of the tale, the case of mistaken identity, was judged to be unrealistic. Maybe so, I thought, but for crying out loud, it’s fiction.
It got worse. Most painful of all was the critic’s judgment that the prose tended to be “unnatural in both narration and dialogue.” My heroine was in the habit of delivering long monologues. For example, she sarcastically describes her State Department job to her father, a higher-up in the department: “You may think all I do is type operating budgets for the Weapons Evaluation Division. But I have the secret knowledge that keeps the operation going. Nobody else in that office knows how to set up charts on the computer. You think it’s easy getting those huge numbers to fall into neat columns? If I were to quit today, I don’t know what you administrators at Main State would do without your nifty charts. And if you didn’t have those numbers always at hand to feed to the negotiators, what would they bargain with? That might be the end of any hopes for world peace in our time!”
Have I gotten better at this stuff over the years? I knew even then that my critic made excellent points. It was kind of her, after all that, to find a shred of hope for me as a writer. Nowadays, I choose to look on the bright side. If I cringe at my earlier efforts, it must mean I’ve improved, at least a little.
November 2, 2016
I’m a lifelong East Coast girl who finally got around to visiting California in June 2016. My previous travels took me as far east as central Europe, but I had somehow neglected to take the westward trek in my own country until a full two years after retirement. Los Angeles was an important goal on my bucket list, mainly because of my love for movies and my interest in the business aspects of movie-making. Also, I’ve been making a fairly desperate and pathetic effort to buy my way into the industry by paying professional screenwriters to convert my four novels into scripts. Having waited so long to see the city of my dreams, I went there with stars in my eyes, determined to soak up as much glamour and creative energy as I could.
Warner Bros. and Paramount were major sites on my wish list, since they advertise themselves as working studios rather than mere theme parks. What struck me immediately was that they are, indeed, workplaces. You can tell that sound stages, when they’re not in use, are the province of the crews. Highly skilled technicians are required to work all those overhead lights and wires and microphones. Besides the stages, there are rooms full of props being collected for possible use in upcoming films. Those already earmarked for a project are tagged and, for copyright reasons, off-limits to photography. Someone has to oversee these cavernous rooms, which were not well air-conditioned on a hot day. Overall, you get a feel not for glamour, but for the real labor behind the scenes. It hardly seems fair that the actors get to memorize their lines in the comfort of their palatial homes, and then swoop in at filming time to scoop up all the accolades and applause.
This feeling that LA is a hard-working city, and not just a partying hub, was enhanced by the fact that it was hovering around 100 degrees the day I hit the studios, easily the hottest day of the year there. Much of the tour is necessarily outdoors, since an open-air trolley transports visitors between lots. You’re not allowed to enter places where the “filming in progress” lights are on, which limits your options for relief. Luckily, the tour directors had the foresight to set up free water at several stops.
It was not only a hot city that day, but a smoky one, with fire billowing out of the nearby hills. A little smoke doesn’t bother the residents until it threatens to get out of hand, which tends to happen later in the summer. Likewise, the earthquake that hit San Diego a few days before I visited there didn’t cause much concern, although it was almost as strong as the one that set off major panic on the East Coast about five years ago. It wasn’t the Big One, so it wasn’t that big of a deal. As for driving in LA, there are memorable songs about its roadways. I can’t vouch for everything in Sheryl Crow’s description of all-night partying in LA, which she tops off with the chorus, “All I wanna do is have some fun till the sun comes up over Santa Monica Boulevard.” But no driver in LA can deny that Burt Bacharach and Hal David spoke the truth in their song “Do You Know the Way to San Jose” when they proclaimed, “LA is a great big freeway.”
My trip also featured a tour of movie star homes, although most of them are hidden behind extremely tall hedges. Once in a while you can peek through the foliage and catch a glimpse of a landscaper or gardener. There’s no question Beverly Hills is one of the wealthiest neighborhoods I’ll ever see, yet it’s not all that different from the nicest parts of Bethesda, Maryland, or McLean, Virginia. Somehow the East Coast seems more modest, since the residents don’t go out of their way to hide from prying eyes, and can even on occasion be seen doing their own lawn work. To be fair, it must be much more difficult to keep up a huge lawn in that dry southern California climate at the height of summer, when the grass is practically tumbleweed.
I guess it all goes to prove that Hollywood is a vibrant place, but hardly magical. We idealize the people who work there without always considering how workaday their lives can be. For example, our young tour guide at Paramount Pictures, whom you might expect to be starstruck, is working multiple jobs in order to pay off his humongous student loans. His long-range plan is to get involved in the business rather than the performing side of the industry. In the meantime, he conducts tours by day and reads screenplays for the studio by night. He doesn’t get to know many stars on the job, since they rarely have time to chat, so his stories about them are mostly hearsay.
Did I manage to glimpse any stars myself that day? Maybe future ones. Our tour guide pointed out the back door of a lot where a kid was being admitted to audition for a youth-oriented show. I could only imagine the striving that lies ahead for that ambitious youngster. If she manages to pass this first hurdle, there are so many more to come. All in all, I figure showbiz is a lot like writing, considering all the sweat it takes to make the end result look easy and fun.
October 1, 2016
I’m a music fan of the baby boomer generation, so how could I possibly resist writing a novel about a rock band? Handmaidens of Rock (2014) centers on a musical outfit that forms at a suburban Maryland high school like the one I graduated from in 1970. Before they can legitimately call themselves a band, the three members—lead guitarist Preston, keyboardist Neal, drummer Brad—must first prove they can hang together long enough to play a gig at a school dance. Once onstage, they must come up with a name on the spot, so they call themselves Homegrown. They amuse their classmates by mocking the local singing star they’re supposed to be backing up, mutilating the cheesy songs he attempts, such as “Love Potion Number Nine” and “Leaving on a Jet Plane.”
To that point, the story is perfectly recognizable and plausible. No doubt there were bands forming all around me at my high school, but since I wasn’t intimate with any of them, I had to make up one of my own. The late 1960s-early 1970s era was a time of improbable rock dreams. The music we were hearing on the radio provided plenty of inspiration to push the envelope of our placid suburban lives. Musically, at least, we could revel in free love, dream in psychedelic colors, and march the streets to demand an end to the Vietnam War and all forms of civil strife. Those songs became closer to true life as many of us moved on to college, the military, and other real-life experiences.
Startup bands have always been lucky even to get a taste of local fame. To make my imaginary band compelling, I had to portray it as more talented than most, or at least extraordinarily lucky. One way Homegrown distinguishes itself from the musical dregs is to pick up some classy groupies, the “handmaidens” of the title. Candy, Hope, and Theda have more going for them than a strong determination to ride the band’s coattails. They’re “handmaidens,” but with ambitions of their own. They aspire to be a journalist, a fashion designer, and an actress-musician respectively. One of them, conveniently, has a powerful attorney father with connections to the music industry.
Any band that aspires to long-term success must write its own songs. How could I get my musicians to do that realistically, when I’m not enough of a musician myself to hear original songs in my mind? One technique was to keep classic rock stations playing on my computer for inspiration. Listening to songs that were popular back in my day, I’d imagine my band trying to write similar tunes. For example, “Time of the Season,” the Zombies’ seductive tribute to the Summer of Love, turned into a piece by Homegrown called “Grooving under the Desk.” Status Quo’s “Pictures of Matchstick Men” used to pound in my head all the time, since I heard it daily on the cafeteria jukebox in high school. My band’s take on this was a psychedelic sex dream called “Hot Teacher in Tights.” I always loved the Doors tune “Tell All the People,” a catchy but vague call to arms with shout-outs to youth that could mean almost anything (Set them free! Follow me down! See the wonder at your feet! Your life’s complete!). My take on that was “Revolution for Amateurs,” which might or might not be an actual call to revolution.
Sad songs were part of the band’s repertoire. My lead guitarist Preston, having lost his mother at an early age, mostly hides his feelings behind a hard exterior but occasionally exposes them in song. His heartbreaking “Signals from the Clouds” bears a resemblance to King Crimson’s “I Talk to the Wind.” Idealism is also part of the musicians’ mindset. In “Peace Conquers All,” they envision a new era of free love in the streets, irresistible to the public and cops alike, as in the Animals’ “San Franciscan Nights.”
Fresh out of high school, my band makes an amateur mock-detective movie with a witchy theme song called “Hex” (something like a popular Cream song, “Strange Brew”). With that in the can, they start writing songs with feverish speed and come up with an eclectic album inspired by that same band’s classic, “Disraeli Gears.” Further adventures follow, including trips to England, Scotland, and California. Scotland proves the most fruitful in terms of new musical directions. They spend time in a commune run by a defrocked priest known to have harbored draft resisters. Their near-worship of him inspires a spate of religious-themed songs, like the one called “Peace Warrior,” inspired partly by Jethro Tull’s “Hymn 43” (with the same refrain, “Oh, Jesus, save me!”) and partly by the Animals’ “Sky Pilot.”
The band changes its name to AMO, which sounds more grownup, and tries to find itself. While attending UCLA, the musicians become involved in a rock festival that ends tragically. Ironically, this is the event that propels them to national fame. Despite their newfound notoriety, the effects of the violence are devastating enough to send them flying off in different directions. The girls break up with their respective musicians and move on to presumably more adult relationships. Still, the wildly creative and romantic ride they took as “handmaidens of rock” can’t be forgotten. A five-year reunion concert takes place in the same high school gym where they first made a jubilant mess of backing up a semi-famous singer. Preston, emerging from a turbulent and fallow period, experiences enough of a creative resurgence to come up with two new songs: one about his inner turmoil called “The Stranger Within” (a take-off on Traffic’s “Stranger to Himself”), and one that celebrates his new marriage to a free spirit, called “Free Spirit of the Road” (which somewhat resembles the Doors’ “Queen of the Highway”).
Assigning a genre to Handmaidens of Rock has been somewhat challenging. No doubt it can be called “chick lit” or “women’s fiction,” but how about “contemporary women’s fiction”? That is one of the more popular classifications these days, yet it doesn’t quite fit an early 1970s story. Some reviewers and advertisers have called the book “historical fiction.” That makes me feel ancient, since I remember the era so well. Still, maybe it’s the best way to describe a story with a classic rock soundtrack.
August 31, 2016
Baseball is the most romantic of all sports, for many reasons. What other game unfolds in a space that fans and players refer to as their “field of dreams”? Unfortunately, those dreams are often shattered. Here in the Washington DC area, baseball has a long, melodramatic history, interrupted by 33 years of non-existence from 1972 to 2004. We’ve lost three franchises, a record few cities can match. Our most recent pennant was won in 1933. My late father used to recall that team nostalgically from time to time, but those memories are now lost to me. Decades of futility followed, not limited to incompetence on the field. Throughout its history and non-history, Washington baseball has been continually betrayed by bad owners and bad faith from the sport’s authorities, and some of those grievances have lingered into the present. Then again, there is something poetic about suffering. Early in the 2016 season, when young slugger Bryce Harper struck out with the bases loaded, it was something of a shock. He had hit two grand slams recently, and his success had started to seem automatic. For the moment, pitchers had “figured him out.” Mighty Casey struck out that time, as he has many times before and will again.
Baseball is more up close and personal than other sports. Except for catchers, the players go unmasked, which makes it easy to imagine you know them. You’d recognize them if you saw them on the street. When I was growing up in Silver Spring, Maryland, there were quite a few sightings of Senators star Frank Howard. Once he signed a dollar bill in the local grocery store for the mom of a friend. My parents saw him once in the Anchor Inn Restaurant, a huge man dining with his comparatively tiny wife. My mom, who was prone to exaggerate on occasion, claimed she almost tripped over his leg.
Baseball harkens back to childhood. Even the bad teams, then and now, exhibited exquisite, often breathtaking skills just to make what are considered routine plays. I used to fantasize about living at the ballpark. How cool would that be? I associated the game with warm summer evenings, rain delays, and rainbows arching over the stadium after the rain delays. At DC Stadium (later RFK Stadium), an accordion-like contraption called a Cordovox used to play “You Gotta Have Heart” and “I Know A Place.” Every time I hear those songs now, I’m back there.
Even the heartbreak was romantic. The Senators had such a unique way of grabbing defeat from the jaws of victory. One afternoon I overheard one of our neighbors, known to be an ardent Senators fan, screaming at her husband. They were a nice couple, never known to raise their voices to each other. It turned out that the wife was yelling, “How could they have lost?” Her poor husband had just broken the news he’d heard over the radio, that the Senators had blown what she had assumed to be an insurmountable lead. That dear lady died far too soon, of complications from diabetes. My mom speculated that the horrendous game hastened her demise.
It’s easy to take the sport too seriously, especially in the current era of rising expectations. I’ll admit that as I get older, I’m less patient with failure. Do I really love baseball, or do I only love winning baseball? I can’t seem to rediscover the pure enjoyment of the game I used to have, win or lose. I’m ashamed to admit defeats can seriously cast me down. I almost care more about the Nats holding on to their divisional lead during these difficult final weeks of the 2016 season than I care about the upcoming election, although I truly believe that one of the candidates poses an existential threat to the nation, if not to the planet. I can tell myself I’m being ridiculous, but I’m not a psychologist, so I haven’t really figured out why I’m like this. Maybe it’s just that time is growing shorter for a championship through which I could live vicariously.
Losing streaks feel like curses, and sometimes it feels like they’re my fault. Back in June I took a trip to California with my brother and a friend. Our main objective was to soak up some Hollywood vibes, but we also had a chance to catch the Nationals on a West Coast swing. They were playing the Padres and the Dodgers, two teams they should theoretically have been able to beat. Petco Park in San Diego is a particularly fun place, featuring an amusement park outside and a beach-like plot of grass beyond center field. Since so many good seats were unoccupied, we never made it to the cheap seats we had bought, instead grabbing a prime spot in the lower deck. I was afraid we’d be busted for cheating, but there were no ushers around to enforce the rules. Did we really get away with it? I later feared the baseball gods had taken note, because that game helped to set off our team’s seven-game losing streak.
We went on to Dodger Stadium (sometimes more romantically called Chavez Ravine), which provided a totally different experience, as ballparks usually do. It’s the biggest ballpark in the major leagues and one of the oldest. On that first day of summer, the hottest day of the year in Los Angeles, fire billowed from the nearby hills, but once the sun went down the conditions turned surprisingly pleasant. The contest was slated to be an epic pitching matchup, Clayton Kershaw vs. Stephen Strasburg, two legitimate Cy Young contenders earlier in the season who have both since been bitten by injuries. Strasburg was scratched that night for unknown reasons, replaced by a journeyman who did his best but was no match for the star.
The fun aspect of baseball is jeopardized when we take it too seriously. The season is brutally long, and sometimes the games are, too. Not much can be done about shortening the games without altering basic strategies, but how about shortening the season? It’ll never happen, because it would cost the owners immediate revenue, but I feel sure the quality of the day-to-day product would improve. Maybe we’d endure fewer gut-wrenching, mistake-filled losses if the players weren’t dead on their feet after a long summer of exertion. The drama tends to return in the autumn, and the level of play sharpens with the cooler temperatures and the greater excitement of pennant races and playoffs. For Nats fans, though, playoff appearances in 2012 and 2014 brought more agony than ecstasy. What if one of these years, our team actually does win the World Series and gets to march down Pennsylvania Avenue in a never-before-seen baseball parade? It’ll outshine any Inaugural Parade. On the other hand, maybe it’s better to have something left to yearn for.
August 4, 2016
Science fiction has never been my favorite genre. Not that I didn’t get a kick out of reading or watching fanciful space travel stuff and freaking out over invading aliens when I was a kid, but my main rap against it was that it didn’t try hard enough to be real. By definition, all fiction is unreal, but for me a novel or movie works best when it comes close enough to recognizable real life to allow for the willing suspension of disbelief. I’m not denying that the Star Wars and Star Trek movies are fun to watch and sometimes even insightful about the human condition. But if space travel is ever to be real, we must seriously question if and how these things can be done. We can’t build spaceships on current models that can accelerate to “warp speed,” many times faster than light. Never will we venture way out yonder to find human-like creatures populating galaxy after galaxy, speaking perfect English. When it comes to the “war” part of Star Wars, we will probably never develop an arsenal fearsome enough to make whole worlds explode at the push of a button. Special effects don’t add up to reality.
Lately, however, the genre has been expanding with the growing popularity of what we might call scientific science fiction. Movies like Interstellar (2014) and The Martian (2015) push the edges of what is theoretically possible in space travel. The ordinary mind boggles at the concepts of theoretical physics and engineering feats that must be mastered to make these space forays possible. By “ordinary mind” I mean one like my own, lacking in scientific skills but greatly respectful of science. Yet our long-term survival as a species may depend on coming to grips with it all.
Carl Sagan’s 1985 novel, Contact, posited the idea of wormholes as a means of feasible space travel. A wormhole, if real, might enable us to defy the limitations imposed on us by relativity and gravity. These blips in time and space might allow for deep-space travel that could be completed within reasonable timeframes. One of Sagan’s scientific pursuits throughout his career was the search for extraterrestrial intelligence (SETI). The scientist-heroine of his story, Eleanor Arroway, receives the signal from space that Sagan dreamed of finding himself. It’s a repeating message composed of prime numbers, presumed to be the universal language of mathematics. The message, once fully decoded, reveals a blueprint for building a spaceship that can make use of a wormhole. Plenty of Earthlings are skeptical of the discovery. Who’s to say these aliens aren’t evil and diabolical, intent on luring us to our doom? Sagan’s theory was that a highly technical society capable of sending such a message must have passed some threshold of survival. The aliens had evidently developed nuclear and every other form of energy without destroying themselves in the process. That could only mean that they had learned, somehow, to live peaceably among themselves. They sensed that we were on the precipice, flirting with self-destruction, so they reached out.
Interstellar borrows the wormhole concept. According to side notes in the Kindle version of the movie, these ideas were developed by theoretical physicist Kip Thorne, who describes a scenario “based on warped space-time,” with “the most exotic events in the universe suddenly becoming accessible to humans.” By contrast, The Martian seems tantalizingly close to present-day reality. We have already sent numerous unmanned probes to the red planet, and manned missions are on the drawing board. A proposed one-way trip is drawing plenty of applicants, despite the prospect of never returning home. NASA has a working prototype of the Mars Launcher Habitat used in the movie. The buried Pathfinder lander that figures in the story is an actual spacecraft built by NASA in the 1990s. One of the launches shown in the movie is actual footage of the Mars Science Laboratory launch.
The Martian isn’t about a one-way trip. On the contrary, its travelers hope to return to Earth to be welcomed back and lauded by their fellow citizens. The astronauts aboard the Hermes spacecraft mourn a crew member, Mark Watney, whom they had to leave behind after a catastrophic explosion apparently killed him. When they are informed that against all odds, he survived the incident, they return to rescue him, adding over 500 days to their mission.
The Martian is about adventure and exploration, however perilous. Interstellar, by contrast, is about desperation. The opening scenes show Earth in peril, with the crops dying, the air polluted, and water in short supply. It’s all too easy to believe this is an accurate snapshot of Earth in coming decades. The need to escape this hellhole is urgent. Schools in these end days are training more farmers than engineers, but agriculture is still failing. They’re also teaching hopelessness, not only denying kids a future but rewriting history to deny that men ever walked on the moon.
A young girl, Murphy, feels haunted by a ghost in her bedroom. Her father, Cooper, an engineer who is also a pilot, shares her feeling that this apparition, whatever it is, has the answers. By taking a scientific rather than a mystical view of it, they discover a message leading to a facility where NASA scientists are working on the problem of human survival. There are all sorts of theories, problems, and equations that must be worked out. Gravity anomalies have been detected for the past 50 years, they say, but can an anomaly actually defy gravity? A possible wormhole has been detected near Saturn, a disturbance in space-time that might allow escape to another galaxy. Whoever is sending these messages must live in five dimensions, with a different conception of time than humans have. Can those of us limited to three dimensions ever understand them? Will humans ever be able to see into a black hole and uncover its mysteries? A space station has been built, based partly on the current International Space Station, but dependent on gravitational anomalies and a “mishmash of technologies.” A stash of fertilized eggs on board represents the only glimmer of hope.
Both movies have a lot to say about human emotions, which both help and hinder the fight for survival. “We must begin to think as a species, not as individuals,” declares the physicist who sends Cooper on his outer space mission, lying to him in the process. He believes he’s sending Cooper to colonize another planet, never to return. He can’t allow his true opinion of the mission to be known, because Cooper’s instincts as a parent would never allow him to undertake it if he believed it meant leaving his children and all the children of earth behind to starve. During the mission, one of Cooper’s colleagues risks everything by advocating colonization of a planet that is obviously an inferior choice. She’s in search of the man who explored it previously, who might still be alive, although it seems unlikely. Maybe, she declares, her love for him is a powerful force beyond human understanding, one she shouldn’t be expected to resist.
The Martian is more optimistic about present-day humanity than Interstellar. When Mark Watney is rescued and brought back to Earth, he lectures astronauts in training about the perilous life they have chosen. The fear and possibility of death will be close and constant, since outer space is cold and stark and does not cooperate with humanity. Only if they solve all the problems that come their way will they get to go home. That’s how Watney finally gets home: by solving one practical problem after another, against terrible odds, aided by a sense of humor. He knows his food rations won’t last long enough, so he must grow his own potatoes. “Luckily, I’m a botanist.” When this crop is destroyed in an unexpected mishap, he needs to create a source of water to prepare more soil. He knows water can be made by igniting hydrogen, a prohibitively dangerous process but the only one available. Equally dangerous is digging up a radioactive energy source that was buried on a previous mission and remains deadly. At every step, he says he must “science the shit out of this.” A hundred different hazards could kill him, including the boring disco music collection that the commander of the mission left behind.
When the problems seem insurmountable, and death seems all but certain, Mark passes on a message to his parents: if he perishes, he did it for something big and beautiful. He didn’t do it because it was a choice between exploring new worlds and dying out as a species, as in Interstellar. He did it for curiosity, a love of learning, and the spirit of exploration, human traits that are probably just as essential to survival as food, water, and oxygen.
July 3, 2016
I first became acquainted with George Orwell’s 1984 when I was in high school during the LBJ and Nixon eras. Although first published in 1949, the book resonated with us baby boomers because of our generational grievances and distrust of authority. It resonated all the more because it was not a mere political treatise. At the center of it was an illicit, passionate love affair, something that our adolescent hormonal selves could relate to.
The question that each succeeding generation has to ask itself anew is whether the horrors of 1984 could happen here and now. Orwell’s world was divided into three regions, and Oceania, which included the former England, had become a marriage of high technology and totalitarianism. Now that we’re living in a high tech world that few could have foreseen a generation ago, does it make us more or less likely to succumb to dictators?
LBJ and Nixon engendered plenty of mistrust, but we now have a presidential candidate who leaves them in the dust. Not only is he impervious to facts and reason, a trait which many ideological politicians share, but he gets many of his “facts” from the least reliable and most easily inflamed social media outlets. Furthermore, he insists that whatever he proclaims to be a fact is irrefutable, even if our own observations tell us otherwise. On his say-so, he expects us to deny the evidence of our own senses, a concept called “denial of objective reality” in Orwell’s world. As Winston Smith wailed to his overseer O’Brien while in prison, “How can I help seeing what is in front of my eyes? Two and two are four.” Not always, O’Brien replies. Sometimes they are three and sometimes they are five. Sometimes they are all of them at once.
In the end, Nixon couldn’t get beyond the evidence that was preserved on his Oval Office tapes. What we heard couldn’t be unheard. That doesn’t seem to matter to Donald Trump. He has left a long, irrefutable record of unbelievably stupid statements and provocations, but if any of them become inconvenient to his election chances, he simply denies them. If he says something didn’t happen, it literally didn’t happen. He claims to have evidence that Obama wasn’t born in the United States, and that thousands of Muslims celebrated on 9/11. So where is this evidence? We just have to trust that his investigators found some fantastic stuff along these lines. If he denies that one of his goons assaulted a reporter, the video of that event must be lying. Just throw it down the memory hole. He’s also mastered the art of taking contradictory positions at the same time: classic doublethink. He whips his followers into a Two Minutes Hate, so that they never have the time or inclination to think for themselves.
It’s worth considering what happened to art and literature in Orwell’s 1984. As in all totalitarian societies, it still has its uses, but only for purposes that serve the Party. Winston Smith is an intellectual, buttoned-down type who can’t wait to get his hands on a forbidden book that will explain how and why the society he’s living in came about, and how it might be destroyed. His lover, Julia, is a much younger, more sensuous person who only cares about sleeping with him, not probing his mind or considering the political ramifications of their lovemaking. Winston’s job involves rewriting and falsifying the public record when necessary to make the Party look good. Julia works in the Fiction Department, but her skills are best suited to the non-literary part of the job, servicing the machinery used to mass-produce books. She describes the process of composing a novel: “from the general directive issued by the Planning Committee down to the final touching-up by the Rewrite Squad. But she was not interested in the final product … books were just a commodity that had to be produced, like jam or bootlaces.” There is also a subsection of the Fiction Department that produces pornography, based on six recurring plots. The sealed booklets are targeted to proletarian youths (the “poorly educated” in Trump’s endearing terminology), to give them the illusion that they are doing something slightly illegal.
Our country is in danger of casting aside its precious, hard-won democracy and embracing a real-life Big Brother. Donald Trump has already demonstrated his dictatorial bent. He responds to any hint of criticism with threats, insults, and tantrums. In fact, he expects nothing short of nonstop adulation. Could such a president seriously compromise our freedom to read and write what we choose? His threats to “loosen up” existing libel laws, so that he can sue media outlets that are mean to him, are already having a chilling effect. We can only hope it will prove more difficult than he expects to remake America into a place he can rule with an iron fist, but there is no doubt he intends to try. It’s the responsibility of rational people to do everything legally possible to stop him. America is not a place where the Thought Police should hold sway.
June 2, 2016
Traditional publishers will probably never embrace independent authors as equals. They will be loath to admit that the terms of engagement in this ongoing battle are changing, that the combatants are becoming more equal, and that some authors even find a way to go “hybrid.” It’s becoming increasingly clear that the trads are losing the high ground they once held in the area of editorial standards.
Examples of bad editing crop up more and more in the traditional world. For example, there are few authors more successful at traditional publishing than Anne Rice. She also specializes in the hottest subjects in fiction, vampires and werewolves. Yet Floyd Orr, editor of the long-running review site PODBRAM, and a rabid Rice fan, reports: “Anne Rice’s 34th book contains more errors than I have ever seen in a top-selling, traditionally published hardback! There are errors of every kind: repeated common words, misused spellings of words that are real words that actually mean something else, misuse of tense, and various other types of boo-boos. What do these errors all have in common? They are the sort that appear in books because human eyes did not read and reread and proofread the text before publishing it. There was an obvious reliance on computer programs to find the errors. Was this by Ms. Rice, her editor, or Knopf in general? Who knows?” Floyd kindly goes on to point out that the error count of Rice’s book easily surpasses those of several of the self-published books he has reviewed, including my own Handmaidens of Rock.
Trads were guilty from the start of not fighting this war honestly, but things have progressed to the point that self-published authors don’t have to suffer the same nonsense anymore. They can take or leave “friendly advice” from self-appointed arbiters of what deserves to be published. No doubt these experts will persist in warning us against “vanity” publishers, a term that should have been deep-sixed years ago. We can now call out websites that masquerade as help for the self-published, but are actually designed to discourage us. Certainly there are bad self-published books, but the argument that we’re all equally bad doesn’t hold water, any more than the argument that traditional publishing guarantees quality.
Several years ago, I sent my 2007 novel, The Rock Star’s Homecoming, to a site called “The Self-Publishing Review,” a blog run by an author who’d had a fair amount of success in publishing non-fiction. Some speculated that her generic-sounding name might be a pseudonym to protect herself from backlash. Certainly the name of her blog was misleading. Once I had read a sampling of her “reviews,” it became clear to me that these were something else altogether. By any fair standard, a reader who purports to provide a review must, at the very least, read the book. Her object was to throw cold water on authors by subjecting them to the kind of treatment they would receive if they sent their manuscripts to a “legitimate” publisher. Admittedly, that might be a useful service, but it was not what she advertised.
To be fair, she warned us: “I’m an editor, and expect published books to be polished. I’m going to count all the errors I find in spelling, punctuation and grammar and when I reach fifteen I’m going to stop reading. I’ll work my way through up to five pages of boring prose or bad writing before I give up.” Despite that stern warning, I felt okay about sending her my novel, although it had to be shipped overseas at some expense. I’ve been something of an editor myself during many years of technical writing for the Federal government. I knew I had gone over my novel carefully and that it had been edited by professionals.
My book, like almost every other that this hot-shot editor “reviewed,” was discarded after about seven pages because of alleged mistakes. I was sure there were not fifteen errors of the type she warned against in the whole book, much less in the first seven pages. When I asked for an explanation, she admitted that there was nothing wrong with my “spelling, punctuation and grammar” per se. My sin was “exposition,” apparently a common complaint against self-published authors, and a handy one if the arbiters can’t find more obvious mistakes.
What does this sin consist of, exactly? Wikipedia defines exposition as “the insertion of important background information within a story; for example, information about the setting, characters’ backstories, prior plot events, historical context, etc.” The article quotes fantasy and science fiction author Jo Walton on the importance of “scattering information seamlessly through the text, as opposed to stopping the story to impart the information.”
My problem with this criticism, legitimate though it might be, is that famous authors do it with impunity. I pointed out that two of my favorites, Pat Conroy and Gail Godwin, tend to not even start their stories until the scene is thoroughly set. If any arbiter tried to impose rules on them, about exposition or anything else, they’d laugh in that person’s face. Ah, the arbiters say, but there’s a right way and a wrong way to do it. All I conclude from this is that it’s always wrong when self-published authors do it.
What about the credentials of these arbiters? Despite their successes in the non-fiction realm, they tend to be sitting on piles of unpublished novels like everyone else. Ironically, that’s where they’re offering their harshest criticism. Since self-publishing is for losers, they disdain that route—although they might admit to putting excerpts of their novels on the Internet, as if that were not a form of self-publishing.
We’ve all heard plenty of those traditional “success stories,” touting the efforts of authors who kept writing and rewriting the same story for fifteen or twenty years, submitting it to numerous agents and publishers, revising and starting over to suit each new critic, perhaps even trying to re-envision their stories as plays or screenplays. Sometimes two decades of effort and perseverance are indeed “rewarded,” but that’s not my idea of success. How many other stories could these authors have been writing during those endless years spent twisting their original vision a hundred different ways to suit one critic after another? Was the original inspiration even recognizable by then? Fortunately, no one has to settle for this kind of treatment any more. The fight rages on, with one of the combatants, in my opinion, looking increasingly desperate.
May 2, 2016
I’ve spent my entire life living in a suburban cocoon, sheltered from the world’s harshest realities. I always knew that the famines, decades-long civil wars, and military coups that regularly decimate foreign countries can’t possibly happen here. Lucky me, I was born in the United States in a time of relative prosperity, although the political landscape has never been what you could call tranquil. I’m a baby boomer, by definition the child of a World War Two veteran. The Greatest Generation, my parents’ generation, fought the most virulent forms of Fascism in Germany and Japan to ensure that those scourges couldn’t invade our lives. True, we lived our entire lives under a nuclear cloud, practicing futile remedies like duck-and-cover when we were kids, but we could count on Mutually Assured Destruction to keep us safe. It seemed the Soviets, like us, weren’t totally crazy.
There have been many books and movies that plausibly envision all sorts of nightmare scenarios. Some of the “what-ifs” that have made the greatest impression on me include 1984 (what a high-tech totalitarian society would look like), It Happened Here (if England had lost the Battle of Britain and been conquered by Hitler), Seven Days in May (if the US military attempted to overthrow the president), and most chilling of all, Level Seven (total nuclear annihilation). After imagining the worst, I feel relieved that it hasn’t happened yet. I have an urge to step outside and take in the sights and sounds of my own lawn, where life persists, unaware of any existential threat.
At no time could anyone in the US claim that Fascism had been totally defeated. It has always been present, at least beneath the surface, in our national political life. Lately, it has become alarmingly obvious. It’s not necessarily a good sign that we see more and more fictional presidents who are either totalitarian wannabes or buffoons. I was criticized for portraying an over-the-top president in my own novel, Let’s Play Ball, but he was small potatoes. All he did was have adulterous sex in the Oval Office (real sex, not just oral) and hatch a plot to kidnap a ballplayer. What I wrote can’t hold a candle to appalling but undeniably entertaining shows like “House of Cards.”
How believable is Frank Underwood, the fictional president of this series? As of this date, he’s already murdered two people by his own hand, and has an equally thuggish chief of staff doing dirty work for him on the side. In one recent scene with his own Secretary of Defense, a potential political rival, he seems to confess to his previous murders and threaten her with the same fate, only to back off and say he’s kidding. As he obviously intends, she is left unsure whether she’s really in danger or just paranoid. Yet if Underwood were real, I would vote for him over several of the current presidential candidates. I’d even vote for his wife Claire, who has maneuvered herself into a spot on his ticket. The Underwoods at least take some reasonable positions, if only for expediency’s sake.
When I studied Political Science in graduate school over thirty years ago, I thought I had acquired a decent grasp of “what can’t happen here” in the political realm. But in 2016, we might as well shred those rules. There are practically no limits now to what certain candidates can say or do and remain beloved by their fans. I suspect that deep down, many of these politicians know how unreasonable their positions are, but they have gotten into the habit of pandering to an uninformed electorate instead of trying to educate their followers. Since there’s no point in agonizing over what I can’t control, I amuse myself these days by pretending that the ongoing election is a novel, and the candidates are colorful if implausible characters. Only what self-respecting editor wouldn’t red-ink characters like these? Honestly, I don’t think I have enough imagination to make up Donald Trump.
When has such an ignorant buffoon come so close to the presidency, even in fiction? Charlie Chaplin’s 1940 film “The Great Dictator” satirized characters such as Adenoid Hynkel of Tomainia and Benzino Napaloni of Bacteria, who tragically had real-life counterparts. Is the present situation so different? Here is a man who accepts Nazi-like salutes from his followers, has expressed admiration for Mussolini, has reportedly studied Hitler’s speeches, and encourages his goons to beat up any detractors. He refuses to repudiate the support of white supremacists, and threatens media outlets that criticize him. But let’s jump ahead and contemplate what President Trump would be like. Since candidate Trump has yet to show the slightest grasp of how the US government works, we must assume he would expect to enter office wielding dictatorial powers. How would he react when he discovers the concept of checks and balances? Michael Hayden, former NSA and CIA director, has said that some of the orders Trump intends to issue as commander-in-chief would be illegal, and could well trigger a coup. His signature policy initiative would start at least a trade war with Mexico, our largest trading partner, if not a hot war. Meanwhile, if he follows through on his promise to deport 11 million immigrants, he would create all the necessary ingredients for a civil war.
Would Trump last even a year in office? I can see him quitting in frustration when he finds the job more difficult than he imagined. Certainly some of his actions, if he tried to carry them out, would be impeachable. Like Chaplin’s dictators, he’s extremely childish, given to temper tantrums if he doesn’t receive the adulation he thinks he deserves or otherwise fails to get his way. Imagine putting someone with the temperament of a five-year-old in charge of the military and the nuclear codes. He might blow us off the map before we had a chance to impeach him.
This is really nothing new for Republican candidates, many of whom have demonstrated an appalling ignorance of our national history. Trump himself was recently asked in an in-depth interview what he thought of Lincoln’s accomplishments. After he had rambled for several minutes, it was clear to the interviewers that he had no idea what Lincoln did. He touts an “America First” foreign policy, showing no awareness of what that phrase meant in the late 1930s. Sarah Palin, former Republican VP candidate, didn’t know that World Wars One and Two were fought in the twentieth century. And then there’s my personal favorite, former Congresswoman and presidential candidate Michele Bachmann congratulating the founding fathers for their great courage and fortitude in abolishing slavery.
American history is a great story in itself, and really needs no exaggeration or enhancement. It has all the drama, vivid characters, crises, and triumphs that anyone could want. Even some of the duller personalities that populate our history were interesting in their own plodding way. Maybe that’s why we crave over-the-top scenarios in fiction that portray our leaders as criminals, clowns or worse. Authors must do all they can to keep up with reality.
March 15, 2016
If I had to choose the author whose works entranced me most as a child, it would be Laura Ingalls Wilder. My fascination with Laura began in the fourth grade, when I was introduced to Little House in the Big Woods. This book was clearly intended to teach us kids who were living cushy suburban lives what it was like to grow up in a pioneer family. That book and the seven that followed it were all about survival and self-sufficiency in places where civilization as we know it had not yet penetrated.
The Ingalls family saga began in the Big Woods of Wisconsin, where they had to eke out a living from hunting and raising crops on small patches of cleared land among the trees, all the while fighting off bears and panthers who roamed the woods freely. When Laura was about four, the family moved on to Indian territory in what is now Kansas, in search of more fertile land. Non-Indian settlement there wasn’t strictly legal yet, according to the Federal government, and both the Feds and the natives took steps to get rid of the interlopers. Then on to Minnesota, where marauding grasshoppers destroyed the family’s crops. Tragedy struck when Laura’s older sister Mary went blind as the result of an illness that could not be pinpointed at the time. Although expensive doctors were called in and the bills piled up, nothing could be done for her.
When the Ingalls family moved to Dakota Territory, their final stop, they lived and worked for a while in a railroad camp, where Laura’s father Charles was the paymaster. He was threatened with beatings or worse when the pay was late. Even once the family settled on its own homestead, they dealt with one crisis after another. The weather alone could be a backbreaker. The legendary winter of 1880-81 merited a book of its own (The Long Winter). Summer tornadoes often proved just as destructive.
I didn’t realize as a child that these books were fiction. It was easy to assume that they were literal truth because their level of detail is so vivid. That is why the recent publication of Pioneer Girl: The Annotated Autobiography, edited by Pamela Smith Hill, is so intriguing. It features the original memoir that Laura wrote prior to beginning work on her series, including all her misspellings and grammatical errors, and sometimes lapses of memory. Numerous footnotes are included that explain the actual history that inspired the series, and help to separate truth from fiction.
Laura was assisted by her daughter, Rose Wilder Lane, in shaping the fictional works. Lane was an established author who had written several biographies that crossed the line between fact and fiction, angering some of her subjects. She advised her mother to use similar techniques, but without the pretense that they were straight autobiography. To make the novels more dramatic, yet suitable for children, they altered certain events, created some new and composite characters, and glossed over or excluded some of the family’s grimmest experiences. For example, the family’s sojourn in Burr Oak, Iowa, where they helped to run a hotel, was not included in the series. Presumably their proximity to a saloon, where Laura observed drunkenness and other questionable behavior when not yet a teenager, made it unsuitable for young readers. The death of Laura’s baby brother around this time was also deemed too dark an episode to deal with.
Reading the “true story” has made me aware of a more important omission that, in my opinion, prevents the books from telling the entire truth. Although the themes of self-sufficiency and resilience were genuine enough, they sidestep the fact that there were times when the family needed help from the various governments under which they lived. During their Minnesota sojourn, after the grasshoppers wiped them out, Charles Ingalls was forced to apply for assistance to feed his family. Later in Dakota Territory, after blackbirds had destroyed their corn crop, it appeared that the family’s long-cherished plan to send Mary to a college for the blind in Iowa might be finished. The fictional version of the story dramatizes Charles Ingalls’s decision to sell his heifer calf to raise the necessary funds. This would be a considerable sacrifice, setting him back at least a year in establishing his farm as a fully functioning entity. The true story, however, is that Mary participated in a program established by the Dakota territorial government to educate blind students for five years at the nearest suitable institution.
In our current polarized political climate, there seems to be scant middle ground between those who believe government is an evil force that makes people too dependent, and those who believe government can solve every problem. The moderate voices that ought to be heard are being shouted down by the loudest, rudest voices. I still love Laura and her adventures as much as I ever did. The Ingalls family indeed persevered through many trials and demonstrated great strength of character. But it would have been no shame to admit that from time to time, they and other pioneers needed the sort of helping hand that government programs could provide.