June 2, 2016
Traditional publishers will probably never embrace independent authors as equals. They will be loath to admit that the terms of engagement in this ongoing battle are changing, that the combatants are becoming more equal, and that some authors even find a way to go “hybrid.” It’s becoming increasingly clear that the trads are losing the high ground they once held in the area of editorial standards.
Examples of bad editing crop up more and more in the traditional world. For example, there are few authors more successful at traditional publishing than Anne Rice. She also specializes in the hottest subjects in fiction, vampires and werewolves. Yet Floyd Orr, editor of the long-running review site PODBRAM, and a rabid Rice fan, reports: “Anne Rice’s 34th book contains more errors than I have ever seen in a top-selling, traditionally published hardback! There are errors of every kind: repeated common words, misused spellings of words that are real words that actually mean something else, misuse of tense, and various other types of boo-boos. What do these errors all have in common? They are the sort that appear in books because human eyes did not read and reread and proofread the text before publishing it. There was an obvious reliance on computer programs to find the errors. Was this by Ms. Rice, her editor, or Knopf in general? Who knows?” Floyd kindly goes on to point out that the error count of Rice’s book easily surpasses those of several of the self-published books he has reviewed, including my own Handmaidens of Rock.
Trads were guilty from the start of not fighting this war honestly, but things have progressed to the point that self-published authors don’t have to suffer the same nonsense anymore. They can take or leave “friendly advice” from self-appointed arbiters of what deserves to be published. No doubt these experts will persist in warning us against “vanity” publishers, a term that should have been deep-sixed years ago. We can now call out websites that masquerade as help for the self-published, but are actually designed to discourage us. Certainly there are bad self-published books, but the argument that we’re all equally bad doesn’t hold water, any more than the argument that traditional publishing guarantees quality.
Several years ago, I sent my 2007 novel, The Rock Star’s Homecoming, to a site called “The Self-Publishing Review,” a blog run by an author who’d had a fair amount of success in publishing non-fiction. Some speculated that her generic-sounding name might be a pseudonym to protect herself from backlash. Certainly the name of her blog was misleading. Once I had read a sampling of her “reviews,” it became clear to me that these were something else altogether. By any fair standard, a reader who purports to provide a review must, at the very least, read the book. Her object was to throw cold water on authors by subjecting them to the kind of treatment they would receive if they sent their manuscripts to a “legitimate” publisher. Admittedly, that might be a useful service, but it was not what she advertised.
To be fair, she warned us: “I’m an editor, and expect published books to be polished. I’m going to count all the errors I find in spelling, punctuation and grammar and when I reach fifteen I’m going to stop reading. I’ll work my way through up to five pages of boring prose or bad writing before I give up.” Despite that stern warning, I felt okay about sending her my novel, although it had to be shipped overseas at some expense. I’ve been something of an editor myself during many years of technical writing for the Federal government. I knew I had gone over my novel carefully and that it had been edited by professionals.
My book, like almost every other that this hot-shot editor “reviewed,” was discarded after about seven pages because of alleged mistakes. I was sure there were not fifteen errors of the type she warned against in the whole book, much less in the first seven pages. When I asked for an explanation, she admitted that there was nothing wrong with my “spelling, punctuation and grammar” per se. My sin was “exposition,” apparently a common complaint against self-published authors, and a handy one if the arbiters can’t find more obvious mistakes.
What does this sin consist of, exactly? Wikipedia defines exposition as “the insertion of important background information within a story; for example, information about the setting, characters’ backstories, prior plot events, historical context, etc.” The article quotes fantasy and science fiction author Jo Walton on the importance of “scattering information seamlessly through the text, as opposed to stopping the story to impart the information.”
My problem with this criticism, legitimate though it might be, is that famous authors do it with impunity. I pointed out that two of my favorites, Pat Conroy and Gail Godwin, tend to not even start their stories until the scene is thoroughly set. If any arbiter tried to impose rules on them, about exposition or anything else, they’d laugh in that person’s face. Ah, the arbiters say, but there’s a right way and a wrong way to do it. All I conclude from this is that it’s always wrong when self-published authors do it.
What about the credentials of these arbiters? Despite their successes in the non-fiction realm, they tend to be sitting on piles of unpublished novels like everyone else. Ironically, that’s where they’re offering their harshest criticism. Since self-publishing is for losers, they disdain that route—although they might admit to putting excerpts of their novels on the Internet, as if that were not a form of self-publishing.
We’ve all heard plenty of those traditional “success stories,” touting the efforts of authors who kept writing and rewriting the same story for fifteen or twenty years, submitting it to numerous agents and publishers, revising and starting over to suit each new critic, perhaps even trying to re-envision their stories as plays or screenplays. Sometimes two decades of effort and perseverance are indeed “rewarded,” but that’s not my idea of success. How many other stories could these authors have been writing during those endless years spent twisting their original vision a hundred different ways to suit one critic after another? Was the original inspiration even recognizable by then? Fortunately, no one has to settle for this kind of treatment any more. The fight rages on, with one of the combatants, in my opinion, looking increasingly desperate.
May 2, 2016
I’ve spent my entire life living in a suburban cocoon, sheltered from the world’s harshest realities. I always knew that the famines, decades-long civil wars, and military coups that regularly decimate foreign countries can’t possibly happen here. Lucky me, I was born in the United States in a time of relative prosperity, although the political landscape has never been what you could call tranquil. I’m a baby boomer, by definition the child of a World War Two veteran. The Greatest Generation, my parents’ generation, fought the most virulent forms of Fascism in Germany and Japan to ensure that those scourges couldn’t invade our lives. True, we lived our entire lives under a nuclear cloud, practicing futile remedies like duck-and-cover when we were kids, but we could count on Mutually Assured Destruction to keep us safe. It seemed the Soviets, like us, weren’t totally crazy.
There have been many books and movies that plausibly envision all sorts of nightmare scenarios. Some of the “what-ifs” that have made the greatest impression on me include 1984 (what a high-tech totalitarian society would look like), It Happened Here (if England had lost the Battle of Britain and been conquered by Hitler), Seven Days In May (if the US military attempted to overthrow the president), and most chilling of all, Level Seven (total nuclear annihilation). After imagining the worst, I feel relieved that it hasn’t happened yet. I have an urge to step outside and take in the sights and sounds of my own lawn, where life persists, unaware of any existential threat.
Nobody in the US could claim at any time that Fascism had been totally defeated. It has always been present, at least beneath the surface, in our national political life. Lately, it has become alarmingly obvious. It’s not necessarily a good sign that we see more and more fictional presidents who are either totalitarian wannabes or buffoons. I was criticized for portraying an over-the-top president in my own novel, Let’s Play Ball, but he was small potatoes. All he did was have adulterous sex in the Oval Office (real sex, not just oral) and hatch a plot to kidnap a ballplayer. What I wrote can’t hold a candle to appalling but undeniably entertaining shows like “House of Cards.”
How believable is Frank Underwood, the fictional president of this series? As of this date, he’s already murdered two people by his own hand, and has an equally thuggish chief of staff doing dirty work for him on the side. In one recent scene with his own Secretary of Defense, a potential political rival, he seems to confess to his previous murders and threaten her with the same fate, only to back off and say he’s kidding. As he obviously intends, she is left unsure whether she’s really in danger or just paranoid. Yet if Underwood were real, I would vote for him over several of the current presidential candidates. I’d even vote for his wife Claire, who has maneuvered herself into a spot on his ticket. The Underwoods at least take some reasonable positions, if only for expediency’s sake.
When I studied Political Science in graduate school over thirty years ago, I thought I had acquired a decent grasp of “what can’t happen here” in the political realm. But in 2016, we might as well shred those rules. There are practically no limits now to what certain candidates can say or do and remain beloved by their fans. I suspect that deep down, many of these politicians know how unreasonable their positions are, but they have gotten into the habit of pandering to an uninformed electorate instead of trying to educate their followers. Since there’s no point in agonizing over what I can’t control, I amuse myself these days by pretending that the ongoing election is a novel, and the candidates are colorful if implausible characters. But what self-respecting editor wouldn’t red-ink characters like these? Honestly, I don’t think I have enough imagination to make up Donald Trump.
When has such an ignorant buffoon come so close to the presidency, even in fiction? Charlie Chaplin’s 1940 film “The Great Dictator” satirized characters such as Adenoid Hynkel of Tomainia and Benzino Napaloni of Bacteria, who tragically had real-life counterparts. Is the present situation so different? Here is a man who accepts Nazi-like salutes from his followers, has expressed admiration for Mussolini, has reportedly studied Hitler’s speeches, and encourages his goons to beat up any detractors. He refuses to repudiate the support of white supremacists, and threatens media outlets that criticize him. But let’s jump ahead and contemplate what President Trump would be like. Since candidate Trump has yet to show the slightest grasp of how the US government works, we must assume he would expect to enter office wielding dictatorial powers. How would he react when he discovers the concept of checks and balances? Michael Hayden, former NSA and CIA director, has said that some of the orders Trump intends to issue as commander-in-chief would be illegal, and could well trigger a coup. His signature policy initiative would start at least a trade war with Mexico, our largest trading partner, if not a hot war. Meanwhile, if he follows through on his promise to deport 11 million immigrants, he would create all the necessary ingredients for a civil war.
Would Trump last even a year in office? I can see him quitting in frustration when he finds the job more difficult than he imagined. Certainly some of his actions, if he tried to carry them out, would be impeachable. Like Chaplin’s dictators, he’s extremely childish, given to temper tantrums if he doesn’t receive the adulation he thinks he deserves or otherwise fails to get his way. Imagine putting someone with the temperament of a five-year-old in charge of the military and the nuclear codes. He might blow us off the map before we had a chance to impeach him.
This is really nothing new for Republican candidates, many of whom have demonstrated an appalling ignorance of our national history. Trump himself was recently asked in an in-depth interview what he thought of Lincoln’s accomplishments. After he had rambled for several minutes, it was clear to the interviewers that he had no idea what Lincoln did. He touts an “America First” foreign policy, showing no awareness of what that phrase meant in the late 1930s. Sarah Palin, former Republican VP candidate, didn’t know that World Wars One and Two were fought in the twentieth century. And then there’s my personal favorite, former Congresswoman and presidential candidate Michele Bachmann congratulating the founding fathers for their great courage and fortitude in abolishing slavery.
American history is a great story in itself, and really needs no exaggeration or enhancement. It has all the drama, vivid characters, crises, and triumphs that anyone could want. Even some of the duller personalities that populate our history were interesting in their own plodding way. Maybe that’s why we crave over-the-top scenarios in fiction that portray our leaders as criminals, clowns or worse. Authors must do all they can to keep up with reality.
March 15, 2016
If I had to choose the author whose works entranced me most as a child, it would be Laura Ingalls Wilder. My fascination with Laura began in the fourth grade, when I was introduced to Little House in the Big Woods. This book was clearly intended to teach us kids who were living cushy suburban lives what it was like to grow up in a pioneer family. That book, and the seven that followed it, were all about survival and self-sufficiency in places where civilization as we know it had not yet penetrated.
The Ingalls family saga began in the Big Woods of Wisconsin, where they had to eke out a living from hunting and raising crops on small patches of cleared land among the trees, all the while fighting off bears and panthers who roamed the woods freely. When Laura was about four, the family moved on to Indian territory in what is now Kansas, in search of more fertile land. Non-Indian settlement there wasn’t strictly legal yet, according to the Federal government, and both the Feds and the natives took steps to get rid of the interlopers. Then on to Minnesota, where marauding grasshoppers destroyed the family’s crops. Tragedy struck when Laura’s older sister Mary went blind as the result of an illness that could not be pinpointed at the time. Although expensive doctors were called in and the bills piled up, nothing could be done for her.
When the Ingalls family moved to Dakota Territory, their final stop, they lived and worked for a while in a railroad camp, where Laura’s father Charles was the paymaster. He was threatened with beatings or worse when the pay was late. Even once the family settled on its own homestead, they dealt with one crisis after another. The weather alone could be a backbreaker. The legendary winter of 1880-81 merited a book of its own (The Long Winter). Summer tornadoes often proved just as destructive.
I didn’t realize as a child that these books were fiction. It was easy to assume that they were literal truth because their level of detail is so vivid. That is why the recent publication of Pioneer Girl: The Annotated Autobiography, edited by Pamela Smith Hill, is so intriguing. It features the original memoir that Laura wrote prior to beginning work on her series, including all her misspellings, grammatical errors, and occasional lapses of memory. Numerous footnotes are included that explain the actual history that inspired the series, and help to separate truth from fiction.
Laura was assisted by her daughter, Rose Wilder Lane, in shaping the fictional works. Lane was an established author who had written several biographies that crossed the line between fact and fiction, angering some of her subjects. She advised her mother to use similar techniques, but without the pretense that they were straight autobiography. To make the novels more dramatic, yet suitable for children, they altered certain events, created some new and composite characters, and glossed over or excluded some of the family’s grimmest experiences. For example, the family’s sojourn in Burr Oak, Iowa, where they helped to run a hotel, was not included in the series. Presumably their proximity to a saloon, where Laura observed drunkenness and other questionable behavior when not yet a teenager, made it unsuitable for young readers. The death of Laura’s baby brother around this time was also deemed too dark an episode to deal with.
Reading the “true story” has made me aware of a more important omission that, in my opinion, prevents the books from telling the entire truth. Although the themes of self-sufficiency and resilience were genuine enough, they sidestep the fact that there were times when the family needed help from the various governments under which they lived. During their Minnesota sojourn, after the grasshoppers wiped them out, Charles Ingalls was forced to apply for assistance to feed his family. Later in Dakota Territory, after blackbirds had destroyed their corn crop, it appeared that the family’s long-cherished plan to send Mary to a college for the blind in Iowa might be finished. The fictional version of the story dramatizes Charles Ingalls’s decision to sell his heifer calf to raise the necessary funds. This would be a considerable sacrifice, setting him back at least a year in establishing his farm as a fully functioning entity. The true story, however, is that Mary participated in a program established by the Dakota territorial government to educate blind students for five years at the nearest suitable institution.
In our current polarized political climate, there seems to be scant middle ground between those who believe government is an evil force that makes people too dependent, and those who believe government can solve every problem. The moderate voices that ought to be heard are being shouted down by the loudest, rudest voices. I still love Laura and her adventures as much as I ever did. The Ingalls family indeed persevered through many trials and demonstrated great strength of character. But it would have been no shame to admit that from time to time, they and other pioneers needed the sort of helping hand that government programs could provide.
February 26, 2016
Nobody needs to be told by now that self-publishing and marketing novels is no picnic. We all knew that from the start. Some of us have been at it for more than a decade now, and it hasn’t gotten much easier. True, there is far more acceptance for our efforts than there was at first, and that’s a great development. The drawback to that, of course, is that there’s also far more competition.
The trouble with enduring truisms like “it’s no picnic” and “it never gets easier” is that there are some indie authors who are making it look easy. Although it’s still like winning the lottery, there are a handful among us who’ve mastered the art of the self-published best-seller.
How do they do it? It’s not that they have more time than the rest of us, because many are encumbered with jobs and families like “ordinary” people. It helps if the jobs are flexible and the families are understanding, but that isn’t always the case. Some of these self-sustaining authors are generous enough to explain their methods on KindleBoards and other sites. What they do requires writing fast, and writing a lot of books, often in a series. These hot-shots seem to have enough physical stamina to stay up all night if they have to in order to meet some self-imposed goal, possibly one book every two months. I’d have to guess that they’re decades younger than I am, as well as much more into currently hot genres like zombies, sci-fi, apocalyptic tales, and historical romance. If they’re particularly lucky or prescient, they hit on a winning formula the first time, something involving characters or a fantasy world so compelling that it only needs to be tweaked slightly in order to churn out numerous sequels. They build up a fan base that is enthusiastic enough to forgive a lack of arduous editing. That is not to suggest that books written fast aren’t good. If they weren’t serving a need for readers, they wouldn’t sell.
Even those authors who are making real money with their ventures are not easily satisfied. I come across plenty on the Boards who complain that they “only” sell a hundred or so a month, a result which sounds mighty good to me. In fact, selling 1,000 a year would be a pretty good result for self-publishing. It would enable most authors to cover the investment they made in advertising and printing, with maybe coffee money left over. The problem for the truly ambitious is that it’s not a living. The real measure of success among the aspiring big sellers is to be able to quit their day jobs. Or better yet, attract the notice of one of those traditional publishers who have proven themselves perfectly capable of swooping in to reap the benefits of an indie author’s preliminary hard work.
How do you pursue goals like this if your writing style doesn’t lend itself to speed? You probably can’t. I’ve always preferred mainstream fiction to genre fiction, and I like it to be “literary.” My favorite novels take their time unfolding, and emphasize character development over action. That’s what I try to emulate. I was greeted with incredulity on the Boards when I said I had taken three to four years to write each of my novels. They have numerous characters and complex plots that hopefully fall into place for a reader patient enough to stick with them. I’m still not good enough at writing to do it fast. I make outlines, but don’t stick to them. I run my stories piecemeal through a tough critique group. Even after I have a whole product, I reread it relentlessly and put it through several rounds of editing from outside critics.
So what’s your reward, if wealth and fame seem out of reach because you’re just too slow? It can only be the personal satisfaction of doing the best work you’re capable of, no matter how long it takes.
February 3, 2016
It’s a great time to aspire to be a moviemaker without any credentials whatsoever. Ambitious amateurs are proclaiming that anybody with a smartphone in his or her pocket is a potential filmmaker. Is it true that no special knowledge or skill is needed when you point that iPhone, other than the ability to hold it straight? And are the films being made with such minimal preparation any good? So far we haven’t seen the ambitious phone-wielders on a red carpet at the Academy Awards or the Golden Globes. But there have been enough breakthroughs in the past few years to give amateurs hope.
Just by googling “movies made with iPhones,” you can find numerous examples of phone-based productions that have garnered attention, a few of them enough to win prestigious prizes. For example, a movie called “Tangerine,” shot on an Apple device, was shown at the Sundance Film Festival. It is based on the true story of a love triangle that developed at a popular donut shop between a transgender woman, her boyfriend, and a biological woman.
On looking closer, it seems this production adhered to certain professional standards. The writer and director, Sean Baker, did know what he was doing. He used three phones, as well as an app called Filmic Pro, a Steadicam to keep the phones from shaking, and some adapter lenses to give it a professional look. He also employed post-production techniques that reflected his knowledge of traditional filmmaking.
There is now at least one annual festival devoted to recognizing and rewarding iPhone films. Belarus-born Chris Nong, also an established director, won an award at the second annual festival for an eight-minute Russian action movie shot with an iPhone 4. Again, other devices were used, and the director’s professional credentials were in evidence. Michael Koerbel, the producer of a TV series called “Goldilocks” that features a blonde secret agent called Jasmine, maintains that anybody can do it. He is also the author of a book called “Studio in your Pocket,” and the producer of several short films. He declares, “We want to inspire the next generation of filmmakers to get out there and start sharing their stories with the world.”
How about full-length feature films? “Uneasy Lies The Mind” (2014) was billed as “the first narrative feature film to be shot entirely on iPhone.” This film is a psychological portrait of a man suffering delusions due to a head injury. Accordingly, its use of distorted and disjointed images is actually a selling point. The director, Ricky Fosheim, the founder of Detention Films and known for his music videos, pointed to the relative affordability of this method.
So is everybody really doing it? These experienced directors give the impression that they are experimenting with ways of cutting costs and getting a production up and running with amazing speed, but that they could return to their more traditional and expensive methods at any time.
What about absolute rank amateurs? Are they doing anything noteworthy? Maybe not yet, but they are trying. There are numerous meetup groups here in the DC area devoted to writing scripts and critiquing them. However, if a movie is ever to arise from a script, it has to be “crewed.” That is true whether the filmmakers make use of their handy personal toys or bring in traditional cameras. You either need to hire an existing production company, which is an expensive proposition, or put together an amateur one.
The “Film in a Day” method is an increasingly popular and relatively affordable technique for ambitious but under-subsidized outfits. For example, a meetup group called Bethesda Amateur Filmmakers A to Z, located in suburban Washington DC, proclaims: “Writing, producing, directing, acting, filming, and editing, we do it all!” Founded in March of 2015, the group has two “executive organizers” in charge of all productions. They periodically send out a call for screenplays of five to seven pages, from which they aim to select one for production every two months. Once the script is selected, they put together a temporary production company, locate a single set, and accomplish the shooting in one day. Four films have been made up to this point, three to five minutes in length, and posted on YouTube. They range from a comedy about bumbling thieves (“Decaf”) to a psychological fantasy about conquering internal demons (“Critics”). By necessity, the story lines and messages are simple, yet five minutes seems enough time to at least make a point. It’s not red-carpet stuff, but it’s a start.
January 5, 2016
I first encountered Sylvia Plath’s The Bell Jar in 1972, when I was a sophomore in college. It was not assigned reading at that time, yet it was spreading like wildfire, especially among us young English majors. Apparently the novel was having a similar effect on many other college campuses. It was originally published in England, Plath’s adopted homeland, only a few weeks before her suicide in 1963. She had used a pseudonym out of belated concern for the many people close to her whom she had trashed mercilessly in the autobiographical story.
Plath was reportedly disappointed in the tepid reaction to the novel. Her only previous book, a collection of poems, had suffered a similar underwhelming fate. She had recently separated from her husband, the poet Ted Hughes, who at that time was much better known than she was. Motivated by both pride and desperation, she was trying to find a way to support herself and their two children. American publishers were initially skeptical about the book’s salability, and she was unable to get it accepted by a U.S. publisher during her lifetime. Several years later, when imported bootleg copies began selling by the hundreds in bookstores, The Bell Jar finally caught the eye of the so-called American literary experts.
Having reread it recently, I can see what put publishers off. It details a nervous breakdown suffered by a young, talented college student. Plath’s forte was poetry, and it shows. The novel reads like the effusions of a poet trying to write a novel. It features a plethora of metaphors, which make for lovely writing but at times can look like showing off. Apart from this stylistic problem, the story suffers from something of a disconnect. As pointed out by one of the publishers who turned it down, the breakdown doesn’t seem to follow from the ordinary angst of a teenaged girl. The observations of a perceptive young woman, who’s going through a tumultuous time in her life, don’t prepare the reader for her plunge into suicidal depression.
Yet something about Plath’s novel certainly spoke to us young college girls. What brought it to life was that by the early 1970s, we knew it was chillingly real. Plath had indeed tried to commit suicide at 20 years of age, and she succeeded at it when she was 30. Like her heroine, Esther Greenwood, she was a scholarship girl at a prominent Eastern women’s college in 1953, who won a writing contest that entitled her to spend a month working at a New York-based fashion magazine. Like her character, Plath was beset by overwhelming ambition that was essentially stymied for girls growing up in the 1950s. She wanted both personal happiness and professional success. The magazine job turned out to be tedious and unsatisfying. She had a boyfriend who wanted to marry her, but who assured her with complete certitude that once she had kids, her creative life would become irrelevant. When she returned to suburbia from her New York adventure, everything seemed lifeless. Her mother’s van reminded her of a prison. The neighbors struck her as nosy and dowdy.
The college years can be a tough period of self-discovery and fear for the future. I hardly knew anyone in those times, including myself, who didn’t go through an episode or two of depression. Fortunately, few of us crashed as dramatically as Plath did. Yet in her later life, Plath came tantalizingly close to fulfilling that “having it all” goal. In fact, at the start of The Bell Jar, she almost brags about it. For a long time after her breakdown, Esther says, she hid away the gifts she accumulated in New York from various fashion companies, such as sunglasses and makeup cases. But “when I was all right again,” she brought them out, and “cut the plastic starfish off the sunglasses case for the baby to play with.”
For a while her marriage was almost idyllic, at least on the surface. She and Hughes took turns caring for their child while each managed several hours of writing time each day. But unfortunately for Sylvia, she married a man with a roving eye. The marriage seemed to grow more troubled after the birth of a second child, which made their childcare chores more complicated. However, her final breakdown was not triggered by her separation from Hughes. In fact, that trauma inspired her to write the anger-fueled poems that became Ariel, the collection which made her name. A more likely explanation is that the publication of The Bell Jar tipped her back into the adolescent angst that she thought she had escaped.
The seeds of self-destruction were always there, regardless of her circumstances. “Sylvia was doomed,” remarked her high school English teacher when he heard of her suicide. Even when she had posed as a fun-loving, carefree high school girl, he had detected the rigidity and falseness behind that sunny mask. It’s noteworthy that there was a history of depression on both sides of her family. She was able to make art from her illness, but the more prosaic truth was that she was mishandled by the psychiatric profession. That is one of the messages of The Bell Jar. Effective treatments for her condition either did not exist or were in an early stage of development. She became something of a guinea pig for drug regimens and electroshock therapy. So I conclude that Sylvia Plath speaks to us, but not for us. We understand her struggles, but most of us, thankfully, can’t begin to understand the desperate remedy that she seized.
December 5, 2015
These days I feel an urge to occupy something. As a progressive from the school of aging baby boomers, I find the current political climate and level of discourse in the US increasingly scary. As far back as I can remember, political institutions have never been as dysfunctional as they are now. We baby boomers have a tendency to exaggerate our exploits and insist that we used to be more astute and involved than today’s kids. Back in our day, we stopped the Vietnam War, invented civil rights and women’s liberation, pulled off Woodstock, and accomplished much of this while half-stoned. My Republican parents tried to steer my brother and me toward their brand of conservatism, but it didn’t work. The “Greatest Generation” and its values were just too different.
My parents’ party has now gone off the rails, as they would agree if they were still around. The two front runners for the 2016 presidential nomination as of this date are astoundingly unqualified for high office. The more childish and bizarre their pronouncements, the more their fan base cheers. Worse, they’ve managed to intimidate more mainstream Republican candidates into adopting equally crazy or demagogic positions. Listening to these gentlemen debate, I wait in vain for the rare reasonable statement based on verifiable facts, or a policy proposal that could actually be implemented, or even a message that isn’t hate-filled venom. That is a very low bar for our national politics.
It’s a relief to have a forum where I can state my beliefs plainly, but that’s not a good technique for writing fiction. Since my stories tend to hark back to my youth, politics has a way of sneaking into them. Critics justifiably warn us of the danger of turning what should be entertaining stories into polemics. Two of my novels feature fictional presidents who are corrupt and bellicose, and are obviously Republicans. Still, they don’t hold a candle to the real-life buffoons of this day and age. You couldn’t make up candidates like Trump and Carson. It’s getting difficult even for comedians to satirize them, since the reality almost matches the caricature. My writing inevitably reflects my beliefs and my experiences from more than 40 years in government and quasi-government work, but it’s best to keep these things understated while telling a story. I prefer to think I’m standing up not for a particular candidate or platform, but for reason and compassion.
My 2003 novel, Secretarial Wars, was inspired by my first permanent job after college. I spent more than five years during the 1970s at the Fulbright grants program, an international exchange program for scholars. My novel describes an agency called, somewhat ironically, the Peace Council. It’s an organization that awards grants to send professors and researchers overseas to disseminate American values. My heroine, Miriam, is a secretary at the Council and an aspiring investigative journalist on the side. She suspects that the program is serving to mask a corrupt administration’s interference with the political and economic systems of certain vulnerable nations.
Nothing like this ever happened in real life, to my knowledge. But it could have, if an evil deputy director got into bed, literally and politically, with an evil President. Miriam tries to gather enough evidence to write an explosive article for an underground rag, but she is hampered by her conflicting desire to advance in the organization, as well as her unhealthy attraction to the lecherous newspaper editor. One reader who critiqued Secretarial Wars thought the corrupt president was inspired by George W. Bush. It’s true the book was published during W’s term, but it took so long to write that the era it depicts more closely resembles his dad’s.
In Let’s Play Ball (2010), I mixed up sports and politics, to the confusion and disapproval of some critics. The story centers on fraternal twin sisters Jessica and Miranda, baseball fans since childhood, close but competitive in their personal relationship. Jessica is the founder and editor of an innovative sports magazine, while Miranda has a more traditional but important job as a bureaucrat in the Department of Homeland Security. While they share a liberal outlook, Miranda accuses Jessica of taking her beliefs to an extreme, especially when the intense reporter sets out to investigate her suspicions of racism on the local baseball team. Jessica’s Cuban-born fiancé, the right fielder, is soon to be a free agent, and she fears he won’t get the contract offer he deserves from the biased owners. Then her world blows apart when he is kidnapped from his own ballpark after a season-ending game. Now she envisions a vast criminal conspiracy in which the team owner and his daughter are complicit.
My astute critique group accused me of using Jessica to lecture my readers about the insidiousness of racism. I was preaching to the choir in that group anyway, they pointed out. But how can that be, I protested, when Miranda is the viewpoint character, and she rolls her eyes whenever Jessica gets too strident for her? Furthermore, Miranda is friendly with a few of the teammates whom Jessica has pegged as racists, and is having an affair with one of them. Even so, my friendly readers insisted, we can hear your political voice bellowing through.
Politics turned out to be unavoidable in Handmaidens of Rock (2014), my tale of a young musical trio and its groupies. I tried to recreate the turbulent era of my high school and college days, the late 1960s and early 1970s. Wherever their budding careers take them, the musicians can’t escape the threat of a military draft. Scared and confused, they write and perform both peace-and-love and militant songs. The threat of violence follows them, and real bombs go off around them. This was an era when radical leftists co-opted the antiwar movement with their bombings and crime sprees, giving all of us who protested the war a bad name.
I recently finished reading Days of Rage (2015), Bryan Burrough’s fascinating account of the political violence that permeated that era. He quoted at length Joseph Conner, whose father Frank, a 33-year-old banker, was killed in the infamous Fraunces Tavern bombing by Puerto Rican radicals. The younger Conner deplores current efforts to rehabilitate some of the self-styled revolutionaries of that era on the grounds that they’ve lived exemplary lives since then. “To think that America thinks none of this ever happened, that it’s not even remembered, it’s astounding to me. You know, I blame the media. The media was more than happy to let all this go. These were not the kinds of terrorists the liberal media wanted us to remember, because they share a lot of the same values. They were terrorists. They were just the wrong brand. My father was murdered by the wrong politics. By leftists. So they were let off the hook.”
I agree with Joseph Conner up to a point. The bombers and bank robbers of that era were indeed terrorists. But I disagree with his assertion that liberals are incapable of calling these criminals by their right name, when I know many of us do. I’d like to see more right-wingers who are equally capable of condemning the bombers of abortion clinics. Political messages delivered with hate lose any high ground they ever had, and become more pernicious than the wrongs they claim to be fighting.
November 10, 2015
Ever since I can remember, I have wanted to be a fiction writer. I’ve always preferred making things up to dealing in realities. Once I grew up, however, I had bills to pay, so I needed to find practical uses for my writing skills in various workplaces. My efforts weren’t always welcome, especially when I was starting out. Back in the dark ages, most employers just wanted you to type, and not worry your “pretty little head” about what you were typing. I tried to dramatize that phenomenon in my 2003 novel, Secretarial Wars.
Eventually, I wound up as a budget analyst for the Department of Labor. Federal agencies usually submit at least three versions of their annual budgets during the course of each fiscal year. These documents must present an effective mixture of numbers and narratives to justify the agency’s continuing existence as well as to request funding for new projects. Some budget analysts specialize in numbers-crunching, and some are better at explaining what the numbers mean. In my experience, the numbers specialists are more respected, but they can’t get along without the writers, even if they think they can.
I enjoyed budget-creating most when I was still young and idealistic. When I arrived at Labor in the early 1980s, I wholeheartedly believed in the department’s mission to uplift and protect the workers of America. Some administrations were resistant to those goals, but the challenge of finding ways to carry out the mission while enduring hostile cuts was satisfying in its own way. One of the highlights of my career took place at a hearing on Capitol Hill when an opening statement I had written was read, word-for-word, by the agency head. The supervisor I had then was proud of me and had my back. I only realized later what a rare gem he was.
There’s a reason why workplace comedies like “The Office” resonate. Supervisors and managers are an easy target for satire, since few can resist abusing what power they have. With a few well-publicized exceptions, most higher-ups in the Federal government never get disciplined, because they generally refrain from blatantly illegal acts. But borderline unethical behavior, as well as plain bad judgment, is rampant. I’ve seen managers form cliques with their favorite employees (who may nevertheless badmouth them behind their backs), take dubious junkets at taxpayer expense, and hire the people they want while skirting proper hiring procedures. Sometimes the office is junior high all over again; other times it’s like society at large, where the one percent who already have everything get all the promotions and perks. Yet jobs that involve writing seem to be coveted. I was always fighting off newcomers and interns who were brought in to try their hand at my job as a test of their basic analytical skills. Until late in my career, I was able to defend my turf.
One quirk of managers is that they tend to believe in their own perfection when it comes to writing, so editing them can be tricky. More than once, we budget drones would depart the office on a Friday, leaving behind what we thought was a completed budget ready for final approval, only to return on Monday to find that a manager had screwed around with it over the weekend and turned it in with serious omissions and errors that weren’t there before. A backup edit could have prevented that, but those are not always appreciated. One time I was able to delay, by about thirty minutes, sending through a piece that would have gone to Capitol Hill full of silly typos if I hadn’t caught them. My supervisor at that time was annoyed by the delay, and incredulous that there could have been any mistakes. I finally learned to edit on the sly if possible. I once rewrote a budget narrative that had come from one of our brilliant IT specialists in pure, incomprehensible geek-speak. With the help of Google, I was able to translate it into plain English. In order to get it through without a lot of review, I passed it off as the higher-up’s original work.
Later on, a newfangled electronic budgeting system was introduced, designed to make everything work faster and more efficiently. Like most innovations, it did help when it was working properly (a fifty-fifty proposition), but at times it made matters worse, since some managers didn’t understand its limitations. They thought it gave them license to send in program narratives right on deadline, or to make further changes at the last second, which could then be loaded into the system. They expected a fully realized budget to pop out at the click of a button. But even the fanciest machines don’t necessarily understand formatting or recognize human errors. Naturally, we analysts were blamed for any mistakes we couldn’t catch on the fly.
In spite of frustrations like these, I took pride in my job until I apparently got too old for meaningful work. Hitting a certain age is the kiss of death for many Feds. Age discrimination is rampant in the Federal government, despite the rules against it. I’ve heard many stories similar to mine, so I have to conclude that agencies routinely drive out good employees who might have had several more productive years left. I can understand, up to a point, the need to plan for the future by bringing in younger blood. But the discarding process can be unnecessarily humiliating, and uneconomical as well. Sometimes I felt like shouting that graying hair isn’t necessarily a sign of senility. I still remembered how to do the things I had done as a youngster, and I usually noticed what was going on under my nose. I also questioned the wisdom of bringing in younger people and overpaying them to do the same work we used to do at much lower grade levels. Meanwhile, I saw the most experienced employees relegated to the kinds of routine housekeeping tasks that go unappreciated and unrecognized until they don’t get done.
Since I retired, about a year and a half ago, I’ve heard informally that it is indeed a problem getting enough of these new hot-shots to pay attention to certain thankless but necessary tasks. I expressed the opinion before I left that it might be advisable to familiarize more people with the grunt work. But now that I’m gone, that’s so not my problem. My job now is to polish the skills I once used to earn a living, and have fun doing it.