Memorial Day is not my holiday; it’s theirs


This weekend, amid the smells of barbecues and fresh flowers at gravesites, and the sounds of children playing and new flags snapping in the breeze, my thoughts have been with two men for whom Memorial Day holds other meaning: my father and father-in-law.

My dad was a Depression-era child who came of military age as tension mounted in Korea and would have missed war entirely had he gone to college instead of the Navy after high school. So when most of the young men he knew in school were just learning to shave, he was learning how to keep his clothes dry while bunking on the damp anchor-chain deck aboard an aircraft carrier plying the Pacific.

He chose the military because he had no money for college. And he opted for the Navy because a favorite uncle had served in that branch. The same uncle had jumped off a sinking carrier into burning oil during the Battle of the Coral Sea in 1942, and my dad remembered seeing the scars across his uncle’s arms and back and thought of him as a true hero.

My dad did nothing so risky during his service, but his contribution was no less important. He parlayed an interest in photography into a post with Naval intelligence, helping to map out battle plans. He served on two carriers during a tour of duty spanning the end of the Korean conflict and the return to peacetime. Although he never picked up a gun, his work in the dark recesses of the carriers disseminating classified information was weapon enough. Even now, more than 60 years later, he refuses to discuss what he worked on down there.

My father-in-law, Gene, on the other hand, took his life into his hands nearly every day he set out from port. A dozen years older than my dad, he was among what Tom Brokaw called “the Greatest Generation,” and his duty took place aboard the cramped, creaky decks of Liberty ships sailing to supply American troops and their allies. While with the Merchant Marine, Gene sailed both the Atlantic and the Pacific, crossed back and forth through the Panama Canal and saw more of the world than an Illinois farm boy ever expected.

He does not speak of his service; I had to pry stories out of him. And what I heard amounted to fascinating and frightening tales. He recalled the days he crawled to his post, hand over hand, as storms battered his ship and three-story waves loomed over the deck like granite cliffs, and the nights when he saw flashes of fire through the inky darkness as ships on the fringes of the convoy were torpedoed and sunk. More often, he rode the center of the convoy, on ships loaded with armaments. Gene said he had to force out of his mind thoughts of what might happen if a torpedo hit one of those.

And so while Americans everywhere have enjoyed a three-day weekend and the unofficial start of summer, I content myself with the images and stories passed along from my father and father-in-law.

Instead of barbecue, I recall from my childhood the musty smells of yellowing yearbooks embossed with the names of the carriers my father sailed on and filled with photos of 3,000 or so of his colleagues. Instead of enjoying the pageantry of parades, I prefer sifting through the dusty snapshots of my father-in-law in his Merchant Marine uniform, so large it seemed to hang on his small frame.

They would prefer I go out and enjoy this holiday weekend. But it’s not really my weekend. It’s theirs.

Kennedy story belongs to me now

I was nowhere when President Kennedy was shot.

“Nowhere” being a relative term.

My mother was in bed and seven months pregnant with me when she first heard about the assassination. She was reading “Time” magazine — it came in the mail that morning — and was staying off her feet on doctor’s orders when her best friend called.

The day was crisp and pleasant beneath a cloudless sky. Sunlight streamed past the open curtains to warm the room.

“Are you all right?” the friend asked. No “hello” first.

“I’m fine,” my mother answered. “Why?”

“Turn on your TV, now. They’re reporting that President Kennedy was killed.”

At that, my mother hauled herself out of bed and waddled to the living room, turned on Walter Cronkite’s report and sat stunned for hours, like the rest of the nation, watching the tragedy unfold. Right then, as it did for many Americans, her day went dark despite the sun.

At some point, a nurse from her doctor’s office called to ask the same question her friend had. Half a century ago, medicine was about care, not insurance.

My mother replied she was fine, to which the nurse said she should plan on coming in for a check-up anyway “to be sure.”

“Well, the doctor will be in tomorrow, regardless,” the nurse said.

My father, meanwhile, was at work and having lunch when he heard. He was sitting on a bench outside; the weather was too nice to ignore. He had just opened his lunchbox and started removing the wax paper around his sandwich. He carried a little portable radio in the lunchbox, too; it always came out before the sandwich did.

He was about to take the first bite when the announcement was made. He sat and listened maybe five minutes, then re-wrapped the sandwich, closed the lunchbox and returned home. He did not stop inside to tell his boss. The radio remained on the whole time.

My mother repeated this story to me every Nov. 22, once I was old enough to understand it. She did that because she measured her life against world events — mention almost any event that transpired within her lifetime and she could tell you what she was doing at that moment.

This year is the first that I recall her story of Nov. 22 without prompting. She died in August.

My father, who has trouble with recollection due to failing health, does not remember that day. He sometimes does not even remember my name.

So, the story belongs to me now.

Please, please, PLEASE, think before you tweet


context (n.) — the parts of a written or spoken statement that surround a word or passage and influence its meaning or effect.

Philadelphia TV reporter and former anchor Joyce Evans may finally appreciate the meaning of this word, thanks to social media.

University of Kansas journalism professor David Guth might, too, for the same reason.

Both have entered a pantheon of infamy wrought by ill-advised actions on Twitter, considered the fastest vehicle for embarrassment apart from reality TV. They are poster children for the importance of cramming context into the small space Twitter allows, no matter how tight the fit.

The question now is whether anyone who witnessed what they went through gleans a shred of wisdom from the circumstances.

Evans ran headlong into a wave of unwanted attention this week after merging pop culture and breaking news into one cumbersome, 89-character blurt on Twitter for her employer, Fox affiliate WTXF-TV.

[Image: Evans’ tweet]

Evans’ intent was clear; she wanted to surf the wave of attention spawned by broad public interest in “Breaking Bad,” the black-comedy crime drama on AMC that bowed out Sept. 29 after 62 episodes and a history of far-reaching social engagement.

But in channeling “Bad” the way she did, Evans trampled the distinction between reality and fantasy, and suggested she was deaf to the tone of each. Audiences tried to enlighten her.

[Image: Criticism of Evans’ tweet]

An apology for her overstatement seemed in order. Instead, Evans compounded the problem by pushing off responsibility onto her Twitter followers.

[Image: Evans’ response]

The subsequent fusillade stretched well beyond WTXF’s viewing area, silenced Evans’ usually busy Twitter feed as well as her Facebook page, and cost her the weekend anchor job she had held since 1996.

Guth’s own Twitter reality check in mid-September, on the other hand, was purposeful and potentially more costly. The associate professor at the William Allen White School of Journalism and Mass Communications exploded against conservative commentary on the shootings at the Washington Navy Yard on Sept. 16. Thirteen people died, including the assailant.

In response to perceived invective on Twitter by alleged supporters of the National Rifle Association, Guth posted:

[Image: Guth’s tweet]

The reaction was predictable. Even Republican state lawmakers vowed retaliation, and the president of the Kansas State Rifle Association promised that her NRA chapter would campaign to have Guth fired.

KU at first distanced itself from Guth’s comments, then from Guth. The university hustled him off on a research sabbatical that was not scheduled to start until next year. His Twitter feed also came down.

Guth remains unapologetic. He said on TV after the tweet that he was “deliberately provocative,” and in an email responding to my request for comment, he wrote, “It’s unfortunate that my comments have been deliberately distorted. I know what I meant. Unfortunately, this is a topic that generates more heat than light.”

He said he expects to be back at KU at the conclusion of his sabbatical but declines to say anything more about what happened. The university is similarly silent.

As for what the rest of us expect, especially from professional journalists and educators, it’s something more than selfishness, something more than a middle finger pointed at our sensibilities.

When Evans hyperextended her comparison, she made what many of us might consider an honest mistake. The lure of social media lies partly in its speed and the excitement that speed generates. As a result, we react without full awareness of what we’re saying and remain ignorant of it until the excitement subsides.

A 2009 study by the University of Southern California seems to confirm this, explaining that social media moves too fast for our “moral compass” to catch up with what we’re thinking.

“If things are happening too fast, you may not ever fully experience emotions about other people’s psychological states and that would have implications for your morality,” Mary Helen Immordino-Yang, a researcher for the study, told CNN. “For some kinds of thought, especially moral decision-making about other people’s social and psychological situations, we need to allow for adequate time and reflection.”

Sree Sreenivasan agrees. He’s a popular tech evangelist and one of the foremost advocates for sensible use of social media. At the Society of Professional Journalists’ national convention in Fort Lauderdale last year, he advised journalists against posting before thinking.

The owner of more than 50,000 Twitter followers, Sreenivasan waits three to six minutes between tapping out a tweet and posting it because he knows that first words usually are not the best words, in any medium.

“Anything you share can and will be used against you,” he said.

This is sound and potentially career-saving advice for people such as Joyce Evans and David Guth, who put hubris before introspection. In both instances, the Twitterers omitted context, either by accident or by design, then denied that their choice of words muddled their messages.

You are the best protector against your own embarrassment and ridicule. We need to remember that in this social-media-inflected age, often our only guide to responsible behavior is staring back at us in the mirror.

Maybe Evans would still be a TV anchor and Guth still teaching if not for their inartful language. Unfortunately for all of us, their fame is based on what they said, not what they meant.

(Update: Guth will be allowed to teach again at Kansas next fall, the Lawrence Journal-World reports.)

Why we celebrate July 4, instead of July 2

Every year, Americans set aside July 4 to wave flags, march in parades, shoot fireworks and cook meat, all ostensibly to celebrate the collective rancor of a few men in frocks and wigs deciding that we were finished being British.

The timing is due to documentation. Atop the Declaration of Independence are the large words, “In Congress. July 4, 1776. The unanimous Declaration of the thirteen united States of America.” As if that were the date this deed was done.

In fact, it wasn’t, according to many historians. For the sake of accuracy, they say, we should break out the party favors and barbecue sauce two days earlier.

Why? Because the document itself is not the declaration but an announcement ― a press release, if you will ― of the declaration made July 2, when the Second Continental Congress voted to approve the Lee Resolution, a proposal for independence from the British Empire advanced in June by Richard Henry Lee, a Virginia statesman.

A month of arguing in Congress followed Lee’s proposition. Some among the 56 delegates thought it too soft. Others, however, argued for immediate reconciliation with Great Britain to minimize the economic and social punishments expected from Parliament for colonials being petulant enough to fire guns at the king’s soldiers. Whole colonies were ready to bolt the alliance at the mere prospect of independence.

On July 2, 1776, however, the last reluctant colony, South Carolina, agreed to go along with the declaration. (New York abstained, as it awaited permission from the colony’s legislature to review and approve the declaration ― approval it received a week later.) On that day, the Second Continental Congress voted in favor of the resolution for independence.

Over the next two days, the delegates haggled over remaining details in the declaration’s wording. On July 4, author Thomas Jefferson presented the revised wording in a final copy, which was approved without reservations.

But the debate over our independence date doesn’t end there for some historians, because although the delegates agreed to independence on July 2, and ratified on July 4 the document announcing it, the signing ceremony, as it were, occurred on Aug. 2, and not all delegates signed then, either. Only John Hancock, the Massachusetts delegate who presided over the Congress and whose signature is the largest, is presumed to have signed on July 4.

So, why do we celebrate our independence on July 4, instead of July 2?

Paperwork.