August 31, 2016
Baseball is the most romantic of all sports, for many reasons. What other game unfolds in a space that fans and players refer to as their “field of dreams”? Unfortunately, those dreams are often shattered. Here in the Washington DC area, baseball has a long, melodramatic history, interrupted by 33 years of non-existence from 1972 to 2004. We’ve lost three franchises, a record few cities can match. Our most recent pennant was won in 1933. My late father used to recall that team nostalgically from time to time, but those memories are now lost to me. Decades of futility followed, not limited to incompetence on the field. Throughout its history and non-history, Washington baseball has been continually betrayed by bad owners and bad faith from the sport’s authorities, and some of those grievances have lingered into the present. Luckily, there is something poetic about suffering. Early in the 2016 season, when young slugger Bryce Harper struck out with the bases loaded, it came as something of a shock. He had hit two grand slams recently, and it had started to seem automatic. For the moment, pitchers had “figured him out.” Mighty Casey struck out that time, as he has many times before and will do again.
Baseball is more up close and personal than other sports. Except for catchers, the players perform without masks, which makes it easy to imagine you know them, that you’d recognize them if you saw them on the street. When I was growing up in Silver Spring, Maryland, there were quite a few sightings of Senators star Frank Howard. Once he signed a dollar bill in the local grocery store for the mom of a friend. My parents saw him once in the Anchor Inn Restaurant, a huge man dining with his comparatively tiny wife. My mom, who was prone to exaggerate on occasion, claimed she almost tripped over his leg.
Baseball harkens back to childhood. Even bad teams, then and now, have exhibited exquisite, often breathtaking skills just to make what are considered routine plays. I used to fantasize about living at the ballpark. How cool would that be? I associated the game with warm summer evenings, rain delays, and rainbows arching over the stadium after the rain delays. At DC Stadium (later RFK Stadium), an accordion-like contraption called a Cordovox used to play “You Gotta Have Heart” and “I Know A Place.” Every time I hear those songs now, I’m back there.
Even the heartbreak was romantic. The Senators had such a unique way of grabbing defeat from the jaws of victory. One afternoon I overheard one of our neighbors, known to be an ardent Senators fan, screaming at her husband. They were a nice couple, never known to raise their voices to each other. It turned out that the wife was yelling, “How could they have lost?” Her poor husband had just broken the news he’d heard over the radio, that the Senators had blown what she had assumed to be an insurmountable lead. That dear lady died far too soon, of complications from diabetes. My mom speculated that the horrendous game hastened her demise.
It’s easy to take the sport too seriously, especially in the current era of rising expectations. I’ll admit that as I get older, I’m less patient with failure. Do I really love baseball, or do I only love winning baseball? I can’t seem to rediscover the pure enjoyment of the game I used to have, win or lose. I’m ashamed to admit defeats can seriously cast me down. I almost care more about the Nats holding on to their divisional lead during these difficult final weeks of the 2016 season than I care about the upcoming election, although I truly believe that one of the candidates poses an existential threat to the nation, if not to the planet. I can tell myself I’m being ridiculous, but I’m not a psychologist, so I haven’t really figured out why I’m like this. Maybe it’s just that time is growing shorter for a championship through which I could live vicariously.
Losing streaks feel like curses, and sometimes it feels like they’re my fault. Back in June I took a trip to California with my brother and a friend. Our main objective was to soak up some Hollywood vibes, but we also had a chance to catch the Nationals on a West Coast swing. They were playing the Padres and the Dodgers, two teams they should theoretically have been able to beat. Petco Park in San Diego is a particularly fun place, featuring an amusement park outside and a beach-like plot of grass beyond center field. Since so many good seats were unoccupied, we never made it to the cheap seats we had bought, instead grabbing a prime spot in the lower deck. I was afraid we’d be busted for cheating, but there were no ushers around to enforce the rules. Did we really get away with it? I later feared the baseball gods had taken note, because that game helped to set off our team’s seven-game losing streak.
We went on to Dodger Stadium (sometimes more romantically called Chavez Ravine), which provided a totally different experience, as ballparks usually do. It’s the biggest ballpark in the major leagues and one of the oldest. On that first day of summer, the hottest day of the year in Los Angeles, fire billowed from the nearby hills, but once the sun went down the conditions turned surprisingly pleasant. This contest was slated to be an epic pitching matchup between Clayton Kershaw and Stephen Strasburg, two legitimate Cy Young contenders earlier this season who have both been bitten by injuries since. Strasburg was scratched that night for unknown reasons, replaced by a journeyman who did his best but was no match for the star.
The fun aspect of baseball is jeopardized when we take it too seriously. The season is brutally long, and sometimes the games are, too. Not much can be done about shortening the games without altering basic strategies, but how about shortening the season? It’ll never happen, because it would cost the owners immediate revenue, but I feel sure the quality of the day-to-day product would improve. Maybe we’d endure fewer of the gut-wrenching, mistake-filled losses that happen when players are dead on their feet after a long summer of exertion. The drama tends to return in the autumn, and the level of play sharpens with the cooler temperatures and the greater excitement of pennant races and playoffs. For Nats fans, though, playoff appearances in 2012 and 2014 brought more agony than ecstasy. What if one of these years, our team actually does win the World Series and gets to march down Pennsylvania Avenue in a never-before-seen baseball parade? It’ll outshine any Inaugural Parade. On the other hand, maybe it’s better to have something left to yearn for.
August 4, 2016
Science fiction has never been my favorite genre. Not that I didn’t get a kick out of reading or watching fanciful space travel stuff and freaking out over invading aliens when I was a kid, but my main rap against it was that it didn’t try hard enough to be real. By definition, all fiction is unreal, but for me a novel or movie works best when it comes close enough to recognizable real life to allow for the willing suspension of disbelief. I’m not denying that the Star Wars and Star Trek movies are fun to watch and sometimes even insightful about the human condition. But if space travel is ever to be real, we must seriously question if and how these things can be done. We can’t build spaceships on current models that can accelerate to “warp light speed.” Never will we venture way out yonder to find human-like creatures populating galaxy after galaxy, speaking perfect English. When it comes to the “war” part of Star Wars, we will probably never develop an arsenal fearsome enough to make whole worlds explode at the push of a button. Special effects don’t add up to reality.
Lately, however, the genre has been expanding with the growing popularity of what we might call scientific science fiction. Movies like Interstellar (2014) and The Martian (2015) push the edges of what is theoretically possible in space travel. The ordinary mind boggles at the concepts of theoretical physics and engineering feats that must be mastered to make these space forays possible. By “ordinary mind” I mean one like my own, lacking in scientific skills but greatly respectful of science. Yet our long-term survival as a species may depend on coming to grips with it all.
Carl Sagan’s 1985 novel, Contact, posited the idea of wormholes as a means of feasible space travel. A wormhole, if real, might enable us to defy the limitations imposed on us by relativity and gravity. These blips in time and space might allow for deep-space travel that could be completed within reasonable timeframes. One of Sagan’s scientific pursuits throughout his career was the search for extraterrestrial intelligence (SETI). The scientist-heroine of his story, Eleanor Arroway, receives the signal from space that Sagan dreamed of finding himself. It’s a repeating message composed of prime numbers, presumed to be the universal language of mathematics. The message, once fully decoded, reveals a blueprint for building a spaceship that can make use of a wormhole. Plenty of Earthlings are skeptical of the discovery. Who’s to say these aliens aren’t evil and diabolical, intent on luring us to our doom? Sagan’s theory was that a highly technical society capable of sending such a message must have passed some threshold of survival. The aliens had evidently developed nuclear and every other form of energy without destroying themselves in the process. That could only mean that they had learned, somehow, to live peaceably among themselves. They sensed that we were on the precipice, flirting with self-destruction, so they reached out.
Interstellar borrows the wormhole concept. According to side notes in the Kindle version of the movie, these ideas were developed by theoretical physicist Kip Thorne. “Based on warped space-time,” Dr. Thorne says, “the most exotic events in the universe suddenly becoming accessible to humans.” By contrast, The Martian seems tantalizingly close to present-day reality. We have already sent numerous unmanned probes to the red planet, and manned missions are on the drawing board. A proposed one-way trip is drawing plenty of applicants, despite the prospect of never returning home. NASA has a working prototype of the Mars Launcher Habitat used in the movie. The buried Pathfinder lander that figures in the story is an actual spacecraft built by NASA in the 1990s. One of the launches shown in the movie is actual footage of the Mars Science Laboratory launch.
The Martian isn’t about a one-way trip. On the contrary, its travelers hope to return to Earth to be welcomed back and lauded by their fellow citizens. The astronauts aboard the Hermes spacecraft mourn a crew member, Mark Watney, whom they had to leave behind after a catastrophic explosion apparently killed him. When they are informed that against all odds, he survived the incident, they return to rescue him, adding over 500 days to their mission.
The Martian is about adventure and exploration, however perilous. Interstellar, by contrast, is about desperation. The opening scenes show Earth in peril, with the crops dying, the air polluted, and water in short supply. It’s all too easy to believe this is an accurate snapshot of earth in coming decades. The need to escape this hellhole is urgent. Schools in these end days are training more farmers than engineers, but agriculture is still failing. They’re also teaching hopelessness, not only denying kids a future but rewriting history to deny that men ever walked on the moon.
A young girl, Murphy, feels haunted by a ghost in her bedroom. Her father, Cooper, an engineer who is also a pilot, shares her feeling that this apparition, whatever it is, has the answers. By taking a scientific rather than a mystical view of it, they discover a message leading to a facility where NASA scientists are working on the problem of human survival. There are all sorts of theories, problems, and equations that must be worked out. Gravity anomalies have been detected for the past 50 years, they say, but can an anomaly actually defy gravity? A possible wormhole has been detected near Saturn, a disturbance in space-time that might allow escape to another galaxy. Whoever is sending these messages must live in five dimensions, with a different conception of time than humans have. Can those of us limited to three dimensions ever understand them? Will humans ever be able to see into a black hole and uncover its mysteries? A space station has been built, based partly on the current International Space Station, but dependent on gravitational anomalies and a “mishmash of technologies.” A stash of fertilized eggs on board represents the only glimmer of hope.
Both movies have a lot to say about human emotions, which both help and hinder the fight for survival. “We must begin to think as a species, not as individuals,” declares the physicist who sends Cooper on his outer space mission, lying to him in the process. He believes he’s sending Cooper to colonize another planet, never to return. He can’t allow his true opinion of the mission to be known, because Cooper’s instincts as a parent would never allow him to undertake it if he believed it meant leaving his children and all the children of earth behind to starve. During the mission, one of Cooper’s colleagues risks everything by advocating colonization of a planet that is obviously an inferior choice. She’s in search of the man who explored it previously, who might still be alive, although it seems unlikely. Maybe, she declares, her love for him is a powerful force beyond human understanding, one she shouldn’t be expected to resist.
The Martian is more optimistic about present-day humanity than Interstellar. When Mark Watney is rescued and brought back to Earth, he lectures astronauts in training about the perilous life they have chosen. The fear and possibility of death will be close and constant, since outer space is cold and stark and does not cooperate with humanity. Only if they solve all the problems that come their way will they get to go home. That’s how Watney finally gets home—by solving one practical problem after another, against terrible odds, aided by a sense of humor. He knows his food rations won’t last long enough, so he must grow his own potatoes. “Luckily, I’m a botanist.” When this crop is destroyed in an unexpected mishap, he needs to create a source of water to prepare more soil. He knows water can be made by burning hydrogen, a prohibitively dangerous process but the only one available. So is digging up a radioactive energy source that was buried on a previous mission and remains deadly. At every step, he says he must “science the shit out of this.” A hundred different hazards could kill him, including the boring disco music collection that the commander of the mission left behind.
When the problems seem insurmountable, and death seems all but certain, Mark passes on a message to his parents: if he perishes, he did it for something big and beautiful. He didn’t do it because it was a choice between exploring new worlds and dying out as a species, as in Interstellar. He did it for curiosity, a love of learning, and the spirit of exploration, human traits that are probably just as essential to survival as food, water, and oxygen.
July 3, 2016
I first became acquainted with George Orwell’s 1984 when I was in high school during the LBJ and Nixon eras. Although first published in 1949, the book resonated with us baby boomers because of our generational grievances and distrust of authority. It resonated all the more because it was not a mere political treatise. At the center of it was an illicit, passionate love affair, something that our adolescent hormonal selves could relate to.
The question that each succeeding generation has to ask itself anew is whether the horrors of 1984 could happen here and now. Orwell’s world was divided into three regions, and Oceania, which included the former England, had become a marriage of high technology and totalitarianism. Now that we’re living in a high tech world that few could have foreseen a generation ago, does it make us more or less likely to succumb to dictators?
LBJ and Nixon engendered plenty of mistrust, but we now have a presidential candidate who leaves them in the dust. Not only is he impervious to facts and reason, a trait which many ideological politicians share, but he gets many of his “facts” from the least reliable and most easily inflamed social media outlets. Furthermore, he insists that whatever he proclaims to be a fact is irrefutable, even if our own observations tell us otherwise. On his say-so, he expects us to deny the evidence of our own senses, a concept called “denial of objective reality” in Orwell’s world. As Winston Smith wailed to his overseer O’Brien while in prison, “How can I help seeing what is in front of my eyes? Two and two are four.” Not always, O’Brien replies. Sometimes they are three and sometimes they are five. Sometimes they are all of them at once.
In the end, Nixon couldn’t get beyond the evidence that was preserved on his Oval Office tapes. What we heard couldn’t be unheard. That doesn’t seem to matter to Donald Trump. He has left a long, irrefutable record of unbelievably stupid statements and provocations, but if any of them become inconvenient to his election chances, he simply denies them. If he says something didn’t happen, it literally didn’t happen. He claims to have evidence that Obama wasn’t born in the United States, and that thousands of Muslims celebrated on 9-11. So where is this evidence? We just have to trust that his investigators found some fantastic stuff along these lines. If he denies that one of his goons assaulted a reporter, the video of that event must be lying. Just throw it down the memory hole. He’s also mastered the art of taking contradictory positions at the same time: classic doublethink. He whips his followers into a Two Minutes Hate, so that they never have the time or inclination to think for themselves.
It’s worth considering what happened to art and literature in Orwell’s 1984. As in all totalitarian societies, it still has its uses, but only for purposes that serve the Party. Winston Smith is an intellectual, buttoned-down type who can’t wait to get his hands on a forbidden book that will explain how and why the society he’s living in came about, and how it might be destroyed. His lover, Julia, is a much younger, more sensuous person who only cares about sleeping with him, not probing his mind or considering the political ramifications of their lovemaking. Winston’s job involves rewriting and falsifying the public record when necessary to make the Party look good. Julia works in the Fiction Department, but her skills are best suited to the non-literary part of the job, servicing the machinery used to mass-produce books. She describes the process of composing a novel: “from the general directive issued by the Planning Committee down to the final touching-up by the Rewrite Squad. But she was not interested in the final product … books were just a commodity that had to be produced, like jam or bootlaces.” There is also a subsection of the Fiction Department that produces pornography, based on six recurring plots. The sealed booklets are targeted to proletarian youths (the “poorly educated” in Trump’s endearing terminology), to give them the illusion that they are doing something slightly illegal.
Our country is in danger of casting aside its precious, hard-won democracy and embracing a real-life Big Brother. Donald Trump has already demonstrated his dictatorial bent. He responds to any hint of criticism with threats, insults, and tantrums. In fact, he expects nothing short of nonstop adulation. Could such a president seriously compromise our freedom to read and write what we choose? His threat to “loosen up” existing libel laws, so that he can sue media outlets that are mean to him, is already having a chilling effect. We can only hope it will prove more difficult than he expects to remake America into a place he can rule with an iron fist, but there is no doubt he intends to try. It’s the responsibility of rational people to do everything legally possible to stop him. America is not a place where the Thought Police should hold sway.
June 2, 2016
Traditional publishers will probably never embrace independent authors as equals. They will be loath to admit that the terms of engagement in this ongoing battle are changing, that the combatants are becoming more equal, and that some authors even find a way to go “hybrid.” It’s becoming increasingly clear that the trads are losing the high ground they once held in the area of editorial standards.
Examples of bad editing crop up more and more in the traditional world. For example, there are few authors more successful at traditional publishing than Anne Rice. She also specializes in the hottest subjects in fiction, vampires and werewolves. Yet Floyd Orr, editor of the long-running review site PODBRAM, and a rabid Rice fan, reports: “Anne Rice’s 34th book contains more errors than I have ever seen in a top-selling, traditionally published hardback! There are errors of every kind: repeated common words, misused spellings of words that are real words that actually mean something else, misuse of tense, and various other types of boo-boos. What do these errors all have in common? They are the sort that appear in books because human eyes did not read and reread and proofread the text before publishing it. There was an obvious reliance on computer programs to find the errors. Was this by Ms. Rice, her editor, or Knopf in general? Who knows?” Floyd kindly goes on to point out that the error count of Rice’s book easily surpasses those of several of the self-published books he has reviewed, including my own Handmaidens of Rock.
Trads were guilty from the start of not fighting this war honestly, but things have progressed to the point that self-published authors don’t have to suffer the same nonsense anymore. They can take or leave “friendly advice” from self-appointed arbiters of what deserves to be published. No doubt these experts will persist in warning us against “vanity” publishers, a term that should have been deep-sixed years ago. We can now call out websites that masquerade as help for the self-published, but are actually designed to discourage us. Certainly there are bad self-published books, but the argument that we’re all equally bad doesn’t hold water, any more than the argument that traditional publishing guarantees quality.
Several years ago, I sent my 2007 novel, The Rock Star’s Homecoming, to a site called “The Self-Publishing Review,” a blog run by an author who’d had a fair amount of success in publishing non-fiction. Some speculated that her generic-sounding name might be a pseudonym to protect herself from backlash. Certainly the name of her blog was misleading. Once I had read a sampling of her “reviews,” it became clear to me that these were something else altogether. By any fair standard, a reader who purports to provide a review must, at the very least, read the book. Her object was to throw cold water on authors by subjecting them to the kind of treatment they would receive if they sent their manuscripts to a “legitimate” publisher. Admittedly, that might be a useful service, but it was not what she advertised.
To be fair, she warned us: “I’m an editor, and expect published books to be polished. I’m going to count all the errors I find in spelling, punctuation and grammar and when I reach fifteen I’m going to stop reading. I’ll work my way through up to five pages of boring prose or bad writing before I give up.” Despite that stern warning, I felt okay about sending her my novel, although it had to be shipped overseas at some expense. I’ve been something of an editor myself during many years of technical writing for the Federal government. I knew I had gone over my novel carefully and that it had been edited by professionals.
My book, like almost every other that this hot-shot editor “reviewed,” was discarded after about seven pages because of alleged mistakes. I was sure there were not fifteen errors of the type she warned against in the whole book, much less in the first seven pages. When I asked for an explanation, she admitted that there was nothing wrong with my “spelling, punctuation and grammar” per se. My sin was “exposition,” apparently a common complaint against self-published authors, and a handy one if the arbiters can’t find more obvious mistakes.
What does this sin consist of, exactly? Wikipedia defines exposition as “the insertion of important background information within a story; for example, information about the setting, characters’ backstories, prior plot events, historical context, etc.” The article quotes fantasy and science fiction author Jo Walton on the importance of “scattering information seamlessly through the text, as opposed to stopping the story to impart the information.”
My problem with this criticism, legitimate though it might be, is that famous authors do it with impunity. I pointed out that two of my favorites, Pat Conroy and Gail Godwin, tend to not even start their stories until the scene is thoroughly set. If any arbiter tried to impose rules on them, about exposition or anything else, they’d laugh in that person’s face. Ah, the arbiters say, but there’s a right way and a wrong way to do it. All I conclude from this is that it’s always wrong when self-published authors do it.
What about the credentials of these arbiters? Despite their successes in the non-fiction realm, they tend to be sitting on piles of unpublished novels like everyone else. Ironically, that’s where they’re offering their harshest criticism. Since self-publishing is for losers, they disdain that route—although they might admit to putting excerpts of their novels on the Internet, as if that were not a form of self-publishing.
We’ve all heard plenty of those traditional “success stories,” touting the efforts of authors who kept writing and rewriting the same story for fifteen or twenty years, submitting it to numerous agents and publishers, revising and starting over to suit each new critic, perhaps even trying to re-envision their stories as plays or screenplays. Sometimes two decades of effort and perseverance are indeed “rewarded,” but that’s not my idea of success. How many other stories could these authors have been writing during those endless years spent twisting their original vision a hundred different ways to suit one critic after another? Was the original inspiration even recognizable by then? Fortunately, no one has to settle for this kind of treatment any more. The fight rages on, with one of the combatants, in my opinion, looking increasingly desperate.
May 2, 2016
I’ve spent my entire life living in a suburban cocoon, sheltered from the world’s harshest realities. I always knew that the famines, decades-long civil wars, and military coups that regularly decimate foreign countries can’t possibly happen here. Lucky me, I was born in the United States in a time of relative prosperity, although the political landscape has never been what you could call tranquil. I’m a baby boomer, by definition the child of a World War Two veteran. The Greatest Generation, my parents’ generation, fought the most virulent forms of Fascism in Germany and Japan to ensure that those scourges couldn’t invade our lives. True, we lived our entire lives under a nuclear cloud, practicing futile remedies like duck-and-cover when we were kids, but we could count on Mutually Assured Destruction to keep us safe. It seemed the Soviets, like us, weren’t totally crazy.
There have been many books and movies that plausibly envision all sorts of nightmare scenarios. Some of the “what-ifs” that have made the greatest impression on me include 1984 (what a high-tech totalitarian society would look like), It Happened Here (if England had lost the Battle of Britain and been conquered by Hitler), Seven Days in May (if the US military attempted to overthrow the president), and most chilling of all, Level Seven (total nuclear annihilation). After imagining the worst, I feel relieved that it hasn’t happened yet. I have an urge to step outside and breathe in the sights and sounds of my own lawn, where life persists, unaware of any existential threat.
Nobody in the US could claim at any time that Fascism had been totally defeated. It has always been present, at least beneath the surface, in our national political life. Lately, it has begun to get alarmingly obvious. It’s not necessarily a good sign that we see more and more fictional presidents who are either totalitarian wannabes or buffoons. I was criticized for portraying an over-the-top president in my own novel, Let’s Play Ball, but he was small potatoes. All he did was have adulterous sex in the oval office (real sex, not just oral) and hatch a plot to kidnap a ballplayer. What I wrote can’t hold a candle to appalling but undeniably entertaining shows like “House of Cards.”
How believable is Frank Underwood, the fictional president of this series? As of this date, he’s already murdered two people by his own hand, and has an equally thuggish chief of staff doing dirty work for him on the side. In one recent scene with his own Secretary of Defense, a potential political rival, he seems to confess to his previous murders and threaten her with the same fate, only to back off and say he’s kidding. As he obviously intends, she is left unsure whether she’s really in danger or just paranoid. Yet if Underwood were real, I would vote for him over several of the current presidential candidates. I’d even vote for his wife Claire, who has maneuvered herself into a spot on his ticket. The Underwoods at least take some reasonable positions, if only for expediency’s sake.
When I studied Political Science in graduate school over thirty years ago, I thought I had acquired a decent grasp of “what can’t happen here” in the political realm. But in 2016, we might as well shred those rules. There are practically no limits now to what certain candidates can say or do and remain beloved by their fans. I suspect that deep down, many of these politicians know how unreasonable their positions are, but they have gotten into the habit of pandering to an uninformed electorate instead of trying to educate their followers. Since there’s no point in agonizing over what I can’t control, I amuse myself these days by pretending that the ongoing election is a novel, and the candidates are colorful if implausible characters. But what self-respecting editor wouldn’t red-ink characters like these? Honestly, I don’t think I have enough imagination to make up Donald Trump.
When has such an ignorant buffoon come so close to the presidency, even in fiction? Charlie Chaplin’s 1940 film “The Great Dictator” satirized characters such as Adenoid Hynkel of Tomainia and Benzino Napaloni of Bacteria, who tragically had real-life counterparts. Is the present situation so different? Here is a man who accepts Nazi-like salutes from his followers, has expressed admiration for Mussolini, has reportedly studied Hitler’s speeches, and encourages his goons to beat up any detractors. He refuses to repudiate the support of white supremacists, and threatens media outlets that criticize him. But let’s jump ahead and contemplate what President Trump would be like. Since candidate Trump has yet to show the slightest grasp of how the US government works, we must assume he would expect to enter office wielding dictatorial powers. How would he react when he discovers the concept of checks and balances? Michael Hayden, former NSA and CIA director, has said that some of the orders Trump intends to issue as commander-in-chief would be illegal, and could well trigger a coup. His signature policy initiative would start at least a trade war with Mexico, our largest trading partner, if not a hot war. Meanwhile, if he follows through on his promise to deport 11 million immigrants, he would create all the necessary ingredients for a civil war.
Would Trump last even a year in office? I can see him quitting in frustration when he finds the job more difficult than he imagined. Certainly some of his actions, if he tried to carry them out, would be impeachable. Like Chaplin’s dictators, he’s extremely childish, given to temper tantrums if he doesn’t receive the adulation he thinks he deserves or otherwise fails to get his way. Imagine putting someone with the temperament of a five-year-old in charge of the military and the nuclear codes. He might blow us off the map before we had a chance to impeach him.
This is really nothing new for Republican candidates, many of whom have demonstrated an appalling ignorance of our national history. Trump himself was recently asked in an in-depth interview what he thought of Lincoln’s accomplishments. After he had rambled for several minutes, it was clear to the interviewers that he had no idea what Lincoln did. He touts an “America First” foreign policy, showing no awareness of what that phrase meant in the late 1930s. Sarah Palin, former Republican VP candidate, didn’t know that World Wars One and Two were fought in the twentieth century. And then there’s my personal favorite, former Congresswoman and presidential candidate Michele Bachmann congratulating the founding fathers for their great courage and fortitude in abolishing slavery.
American history is a great story in itself, and really needs no exaggeration or enhancement. It has all the drama, vivid characters, crises, and triumphs that anyone could want. Even some of the duller personalities that populate our history were interesting in their own plodding way. Maybe that’s why we crave over-the-top scenarios in fiction that portray our leaders as criminals, clowns or worse. Authors must do all they can to keep up with reality.
March 15, 2016
If I had to choose the author whose works entranced me most as a child, it would be Laura Ingalls Wilder. My fascination with Laura began in the fourth grade, when I was introduced to Little House in the Big Woods. This book was clearly intended to teach us kids who were living cushy suburban lives what it was like to grow up in a pioneer family. That book, and the seven that followed it, were all about survival and self-sufficiency in places where civilization as we know it had not yet penetrated.
The Ingalls family saga began in the Big Woods of Wisconsin, where they had to eke out a living from hunting and raising crops on small patches of cleared land among the trees, all the while fighting off bears and panthers who roamed the woods freely. When Laura was about four, the family moved on to Indian territory in what is now Kansas, in search of more fertile land. Non-Indian settlement there wasn’t strictly legal yet, according to the Federal government, and both the Feds and the natives took steps to get rid of the interlopers. Then on to Minnesota, where marauding grasshoppers destroyed the family’s crops. Tragedy struck when Laura’s older sister Mary went blind as the result of an illness that could not be pinpointed at the time. Although expensive doctors were called in and the bills piled up, nothing could be done for her.
When the Ingalls family moved to Dakota Territory, their final stop, they lived and worked for a while in a railroad camp, where Laura’s father Charles was the paymaster. He was threatened with beatings or worse when the pay was late. Even once the family settled on its own homestead, they dealt with one crisis after another. The weather alone could be a backbreaker. The legendary winter of 1880-81 merited a book of its own (The Long Winter). Summer tornadoes often proved just as destructive.
I didn’t realize as a child that these books were fiction. It was easy to assume that they were literal truth because their level of detail is so vivid. That is why the recent publication of Pioneer Girl: The Annotated Autobiography, edited by Pamela Smith Hill, is so intriguing. It features the original memoir that Laura wrote prior to beginning work on her series, including all her misspellings, grammatical errors, and occasional lapses of memory. Numerous footnotes explain the actual history that inspired the series, and help to separate truth from fiction.
Laura was assisted by her daughter, Rose Wilder Lane, in shaping the fictional works. Lane was an established author who had written several biographies that crossed the line between fact and fiction, angering some of her subjects. She advised her mother to use similar techniques, but without the pretense that they were straight autobiography. To make the novels more dramatic, yet suitable for children, they altered certain events, created some new and composite characters, and glossed over or excluded some of the family’s grimmest experiences. For example, the family’s sojourn in Burr Oak, Iowa, where they helped to run a hotel, was not included in the series. Presumably their proximity to a saloon, where Laura observed drunkenness and other questionable behavior when not yet a teenager, made it unsuitable for young readers. The death of Laura’s baby brother around this time was also deemed too dark an episode to deal with.
Reading the “true story” has made me aware of a more important omission that, in my opinion, prevents the books from telling the entire truth. Although the themes of self-sufficiency and resilience were genuine enough, they sidestep the fact that there were times when the family needed help from the various governments under which they lived. During their Minnesota sojourn, after the grasshoppers wiped them out, Charles Ingalls was forced to apply for assistance to feed his family. Later in Dakota Territory, after blackbirds had destroyed their corn crop, it appeared that the family’s long-cherished plan to send Mary to a college for the blind in Iowa might be finished. The fictional version of the story dramatizes Charles Ingalls’s decision to sell his heifer calf to raise the necessary funds. This would be a considerable sacrifice, setting him back at least a year in establishing his farm as a fully functioning entity. The true story, however, is that Mary participated in a program established by the Dakota territorial government to educate blind students for five years at the nearest suitable institution.
In our current polarized political climate, there seems to be scant middle ground between those who believe government is an evil force that makes people too dependent, and those who believe government can solve every problem. The moderate voices that ought to be heard are being shouted down by the loudest, rudest voices. I still love Laura and her adventures as much as I ever did. The Ingalls family indeed persevered through many trials and demonstrated great strength of character. But it would have been no shame to admit that from time to time, they and other pioneers needed the sort of helping hand that government programs could provide.
February 26, 2016
Nobody needs to be told by now that self-publishing and marketing novels is no picnic. We all knew that from the start. Some of us have been at it for more than a decade now, and it hasn’t gotten much easier. True, there is far more acceptance for our efforts than there was at first, and that’s a great development. The drawback to that, of course, is that there’s also far more competition.
The trouble with enduring truisms like “it’s no picnic” and “it never gets easier” is that there are some indie authors who are making it look easy. Although it’s still like winning the lottery, there are a handful among us who’ve mastered the art of the self-published best-seller.
How do they do it? It’s not that they have more time than the rest of us, because many are encumbered with jobs and families like “ordinary” people. It helps if the jobs are flexible and the families are understanding, but that isn’t always the case. Some of these self-sustaining authors are generous enough to explain their methods on KindleBoards and other sites. What they do requires writing fast, and writing a lot of books, often in a series. These hot-shots seem to have enough physical stamina to stay up all night if they have to in order to meet some self-imposed goal, possibly one book every two months. I’d have to guess that they’re decades younger than I am, as well as much more into currently hot genres like zombie fiction, sci-fi, apocalyptic tales, and historical romance. If they’re particularly lucky or prescient, they hit on a winning formula the first time, something involving characters or a fantasy world so compelling that it only needs to be tweaked slightly in order to churn out numerous sequels. They build up a fan base that is enthusiastic enough to forgive a lack of arduous editing. That is not to suggest that books written fast can’t be good. If they weren’t serving a need for readers, they wouldn’t sell.
Even those authors who are making real money with their ventures are not easily satisfied. I come across plenty on the Boards who complain that they “only” sell a hundred or so a month, a result which sounds mighty good to me. In fact, selling 1,000 a year would be a pretty good result for self-publishing. It would enable most authors to cover the investment they made in advertising and printing, with maybe coffee money left over. The problem for the truly ambitious is that it’s not a living. The real measure of success among the aspiring big sellers is to be able to quit their day jobs. Or better yet, attract the notice of one of those traditional publishers who have proven themselves perfectly capable of swooping in to reap the benefits of an indie author’s preliminary hard work.
How do you pursue goals like this if your writing style doesn’t lend itself to speed? You probably can’t. I’ve always preferred mainstream fiction to genre fiction, and I like it to be “literary.” My favorite novels take their time unfolding, and emphasize character development over action. That’s what I try to emulate. I was greeted with incredulity on the Boards when I said I had taken three to four years to write each of my novels. They have numerous characters and complex plots that hopefully fall into place for a reader patient enough to stick with them. I’m still not good enough at writing to do it fast. I make outlines, but don’t stick to them. I run my stories piecemeal through a tough critique group. Even after I have a whole product, I reread it relentlessly and put it through several rounds of editing from outside critics.
So what’s your reward, if wealth and fame seem out of reach because you’re just too slow? It can only be the personal satisfaction of doing the best work you’re capable of, no matter how long it takes.
February 3, 2016
It’s a great time to aspire to be a moviemaker without any credentials whatsoever. Ambitious amateurs are proclaiming that anybody with a smartphone in his or her pocket is a potential filmmaker. Is it true that no special knowledge or skill is needed when you point that iPhone, other than the ability to hold it straight? And are the films being made with such minimal preparation any good? So far we haven’t seen the ambitious phone-wielders on a red carpet at the Academy Awards or the Golden Globes. But there have been enough breakthroughs in the past few years to give amateurs hope.
Just by googling “movies made with iPhones,” you can find numerous examples of phone-based productions that have garnered attention, a few of them enough to win prestigious prizes. For example, a movie called “Tangerine,” shot on an Apple device, was shown at the Sundance Film Festival. It is based on the true story of a love triangle that developed at a popular donut shop between a transgender woman, her boyfriend, and a biological woman.
On looking closer, it seems this production adhered to certain professional standards. The writer and director, Sean Baker, did know what he was doing. He used three phones, as well as an app called Filmic Pro, a Steadicam to keep the phones from shaking, and some adapter lenses to give it a professional look. He also employed post-production techniques that reflected his knowledge of traditional filmmaking.
There is now at least one annual festival devoted to recognizing and rewarding iPhone films. Belarus-born Chris Nong, also an established director, won an award at the second annual festival for an eight-minute Russian action movie shot with an iPhone 4. Again, equipment beyond the phone was used, and the director’s professional credentials were in evidence. Michael Koerbel, the producer of a TV series called “Goldilocks” that features a blonde secret agent named Jasmine, maintains that anybody can do it. He is also the author of a book called “Studio in your Pocket,” and the producer of several short films. He declares, “We want to inspire the next generation of filmmakers to get out there and start sharing their stories with the world.”
How about full-length feature films? “Uneasy Lies The Mind” (2014) was billed as “the first narrative feature film to be shot entirely on iPhone.” This film is a psychological portrait of a man suffering delusions due to a head injury. Accordingly, its use of distorted and disjointed images is actually a selling point. The director, Ricky Fosheim, the founder of Detention Films and known for his music videos, pointed to the relative affordability of this method.
So is everybody really doing it? These experienced directors give the impression that they are experimenting with ways of cutting costs and getting a production up and running with amazing speed, but that they could return to their more traditional and expensive methods at any time.
What about absolute rank amateurs? Are they doing anything noteworthy? Maybe not yet, but they are trying. There are numerous meetup groups here in the DC area devoted to writing scripts and critiquing them. However, if a movie is ever to arise from a script, it has to be “crewed.” That is true whether the filmmakers make use of their handy personal toys or bring in traditional cameras. You either need to hire an existing production company, which is an expensive proposition, or put together an amateur one.
The “Film in a Day” method is an increasingly popular and relatively affordable technique for ambitious but under-subsidized outfits. For example, a meetup group called Bethesda Amateur Filmmakers A to Z, located in suburban Washington DC, proclaims: “Writing, producing, directing, acting, filming, and editing, we do it all!” Founded in March of 2015, the group has two “executive organizers” in charge of all productions. They periodically send out a call for screenplays of five to seven pages, from which they aim to select one for production every two months. Once the script is selected, they put together a temporary production company, locate a single set, and accomplish the shooting in one day. Four films have been made up to this point, three to five minutes in length, and posted on YouTube. They range from a comedy about bumbling thieves (“Decaf”) to a psychological fantasy about conquering internal demons (“Critics”). By necessity, the story lines and messages are simple, yet five minutes seems enough time to at least make a point. It’s not red-carpet stuff, but it’s a start.
January 5, 2016
I first encountered Sylvia Plath’s The Bell Jar in 1972, when I was a sophomore in college. It was not assigned reading at that time, yet it was spreading like wildfire, especially among us young English majors. Apparently the novel was having a similar effect on many other college campuses. It was originally published in England, Plath’s adopted homeland, only a few weeks before her suicide in 1963. She had used a pseudonym out of belated concern for the many people close to her whom she had trashed mercilessly in the autobiographical story.
Plath was reportedly disappointed in the tepid reaction to the novel. Her only previous book, a collection of poems, had suffered a similar underwhelming fate. She had recently separated from her husband, the poet Ted Hughes, who at that time was much better known than she was. Motivated by both pride and desperation, she was trying to find a way to support herself and their two children. American publishers were initially skeptical about the book’s salability, and she was unable to get it accepted by a U.S. publisher during her lifetime. Several years later, when imported bootleg copies began selling by the hundreds in bookstores, The Bell Jar finally caught the eye of the so-called American literary experts.
Having reread it recently, I can see what put publishers off. It details a nervous breakdown suffered by a young, talented college student. Plath’s forte was poetry, and it shows. The novel reads like the effusions of a poet trying to write a novel. It features a plethora of metaphors, which make for lovely writing but at times can look like showing off. Apart from this stylistic problem, the story suffers from something of a disconnect. As pointed out by one of the publishers who turned it down, the breakdown doesn’t seem to follow from the ordinary angst of a teenaged girl. The observations of a perceptive young woman, who’s going through a tumultuous time in her life, don’t prepare the reader for her plunge into suicidal depression.
Yet something about Plath’s novel certainly spoke to us young college girls. What brought it to life was that by the early 1970s, we knew it was chillingly real. Plath had indeed tried to commit suicide at 20 years of age, and she succeeded at it when she was 30. Like her heroine, Esther Greenwood, she was a scholarship girl at a prominent Eastern women’s college in 1953, who won a writing contest that entitled her to spend a month working at a New York-based fashion magazine. Like her character, Plath was beset by overwhelming ambition that was essentially stymied for girls growing up in the 1950s. She wanted both personal happiness and professional success. The magazine job turned out to be tedious and unsatisfying. She had a boyfriend who wanted to marry her, but who assured her with complete certitude that once she had kids, her creative life would become irrelevant. When she returned to suburbia from her New York adventure, everything seemed lifeless. Her mother’s van reminded her of a prison. The neighbors struck her as nosy and dowdy.
The college years can be a tough period of self-discovery and fear for the future. I hardly knew anyone in those times, including myself, who didn’t go through an episode or two of depression. Fortunately, few of us crashed as dramatically as Plath did. Yet in her later life, Plath came tantalizingly close to fulfilling that “having it all” goal. In fact, at the start of The Bell Jar, she almost brags about it. For a long time after her breakdown, Esther says, she hid away the gifts she accumulated in New York from various fashion companies, such as sunglasses and makeup cases. But “when I was all right again,” she brought them out, and “cut the plastic starfish off the sunglasses case for the baby to play with.”
For a while her marriage was almost idyllic, at least on the surface. She and Hughes took turns caring for their child while each managed several hours of writing time each day. But unfortunately for Sylvia, she married a man with a roving eye. The marriage seemed to grow more troubled after the birth of a second child, which made their childcare chores more complicated. However, her final breakdown was not triggered by her separation from Hughes. In fact, that trauma inspired her to write the anger-fueled poems that became Ariel, the collection which made her name. A more likely explanation is that the publication of The Bell Jar tipped her back into the adolescent angst that she thought she had escaped.
The seeds of self-destruction were always there, regardless of her circumstances. “Sylvia was doomed,” remarked her high school English teacher when he heard of her suicide. Even when she had posed as a fun-loving, carefree high school girl, he had detected the rigidity and falseness behind that sunny mask. It’s noteworthy that there was a history of depression on both sides of her family. She was able to make art from her illness, but the more prosaic truth was that she was mishandled by the psychiatric profession. That is one of the messages of The Bell Jar. Effective treatments for her condition either did not exist or were in an early stage of development. She became something of a guinea pig for drug regimens and electroshock therapy. So I conclude that Sylvia Plath speaks to us, but not for us. We understand her struggles, but most of us, thankfully, can’t begin to understand the desperate remedy that she seized.