Crowd-powered journalism becomes crucial when traditional media is unwilling or unable


Amid all the trolling and celebrity hoo-ha that takes place on Twitter and other social-media platforms, occasionally there are events that remind us just how transformative a real-time, crowdsourced information platform can be, and the violent response by local police to civil protests in Ferguson, Missouri on Wednesday is a great example. Just as the world was able to see the impact of riots in Tahrir Square in Egypt during the Arab Spring, or military action against civilians in Ukraine, so Twitter provided a gripping window into the events in Ferguson as they were occurring, like a citizen-powered version of CNN.

The unrest began after police shot and killed an unarmed black man, 18-year-old Michael Brown, in the middle of the afternoon, after what some reported was a scuffle of some kind. Mourners gathered, and so did those protesting what they saw as police racism, and there…



He Said, She Said

Are reporters and editors obligated to present both sides in any controversial news story?

You hear that question a lot these days.  Increasing political polarization is manifesting itself in demands by those on the left and right that articles or broadcasts about touchy issues, like abortion, gun laws, voter suppression and the like, always present “both sides.” Any omission of one side’s argument is prima facie evidence of bias.

Back in the old days — the 19th Century old days — offering balanced accounts was unheard of, and for good reason.  Newspapers were seldom more than opinion broadsides, and objective reporting was nonexistent. In the aftermath of World War I, reform-minded editors adopted a standard of fairness and balance in reporting (along with accuracy, of course), and this standard held sway through most of the 20th Century.

These days, media critics on both the left and right claim a virtual right to having their point of view receive equal treatment in any article.  The media, always sensitive to charges of bias, is responding by writing “he said, she said” narratives that increasingly obscure the news in a cascade of charges and counter-charges.  The pressure is especially acute in the newsrooms of national newspapers like the Times and the Washington Post (the Murdoch-owned Wall Street Journal, where I once worked, has all but given up objectivity).  Any article about, say, the influence of Super PACs in the aftermath of the Citizens United decision is so blandly written that it actually distorts the real news, which is, of course, how money (large quantities of money) is affecting elections.  Reporters can’t or won’t take a side in this highly controversial aspect of politics; the result is that Super PACs are often portrayed as civic-minded benevolent business associations, no more influential than, say, your local Elks Lodge.

The sense of over-objectivity outrages many, who express their anger in blogs and posted comments.  But there doesn’t appear to be a way around the problem.  Urging supposedly unbiased newspapers to drop their objectivity would most likely exacerbate the already coarsened state of political and social discourse.  We probably don’t need, in other words, more gasoline on the flames.  But remaining safely on the altar of “balance” poses its own risks if it drives away partisan readers (and subscribers) to bloggers and other content generators who wear their opinions on their sleeves. The advent of social media makes objectivity in reporting even more difficult; while editors struggle to fairly present both sides, consumer-generated content online has no such restriction.  In fact, a new cottage industry has sprung up, most visibly in the realm of political communications, specializing in creating canned counter-arguments intended not so much to balance the reporting as to undermine or eviscerate the premise of the article.

By the way, in case you haven’t noticed, I’ve written this blog with the sense of balance and fairness that I learned in News Writing 101. I’m actually trying to present, fairly, all points of view.

Thirty years ago, the sentiments expressed here would have been welcomed by my editors and also, I suspect, most readers, for their tone of balanced objectivity.  Not anymore.  If I want to attract new readers and raise my profile, I guess I’d better start taking sides.



A World Without Polls

How often have you heard someone ask what life was like before cell phones, or TVs, or the Internet? Here’s another one: what were politics like before public opinion polls?

Polling is ubiquitous these days, especially in political campaigns. It’s hard to imagine the news media and the chattering class functioning without polls. And they provide those of us who follow politics like a spectator sport with near-instantaneous information on who’s up or down, in or out, which issues have “traction,” and which won’t last a single news cycle.

Is that a good thing?

Well-designed polls from reliable organizations can present a reasonable snapshot of public opinion towards candidates and issues. Polls conducted by or for news media organizations also provide additional content for each day’s news cycle.  The New York Times’ political coverage, for example, is supplemented virtually every day with polling data from researcher Nate Silver.  His reports serve as a kind of daily racing form enabling political junkies to handicap election contests. Polls also clearly demonstrate the power of immediacy. They can, for example, provide virtually instant feedback on a candidate’s debate performance; Gerald Ford’s startling assertion in 1976 that Poland was not under the thumb of the Soviet Union generated an immediate negative reaction in polling, and helped sink his candidacy. And polls immediately suggested that Newt Gingrich won this year’s South Carolina GOP primary largely on the strength of his challenge to a CNN moderator over charges of marital strife.

On the other hand, polls can be wrong, or misleading or — worse — manipulative. Much depends on the size of the polling sample as well as its demographic variety. This presents challenges to pollsters, especially those who rely on telephone sampling. Lots of people these days no longer have landline phone service, and mobile phone numbers are difficult to obtain. Ethnically and racially diverse audiences are typically under-represented, while seniors are over-represented. Independent research also has demonstrated that poll respondents often repeat inaccurate information they’ve read or heard and then cling to their misconceptions even when the correct information becomes available. “People recall facts that support their beliefs, and don’t recall facts that contradict beliefs,” says Leo Simonetta, a social psychologist and director of analytics for the Art & Science Group, a Baltimore-based education consultancy.
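As a rough, illustrative aside (not from the original post): the textbook margin-of-error formula shows why sample size matters so much. For a simple random sample, the margin shrinks only with the square root of the sample size, so quadrupling the sample merely halves the error:

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Approximate margin of error for a simple random sample
    at a 95% confidence level (z = 1.96), assuming the worst-case
    50/50 split (proportion = 0.5)."""
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# A typical national poll of about 1,000 respondents:
print(round(margin_of_error(1000) * 100, 1))   # about 3.1 percentage points

# Quadrupling the sample only halves the margin:
print(round(margin_of_error(4000) * 100, 1))   # about 1.5 percentage points
```

Note that this back-of-the-envelope figure assumes a truly random sample; the landline and demographic skews described above violate that assumption, which is exactly why a nominal “plus or minus 3 points” can understate a poll’s real uncertainty.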

Polling, no matter how well it’s done, replaces the visceral nature of campaigns with detached “objectivity.” It gathers and collates the opinions of people — thousands of them — and yet most polls seem strangely impersonal, with thoughtfulness, emotion, hesitation and insight scrubbed out of the data. It’s much harder as a result to find detailed candidate profiles, of the kind that Joe McGinniss or Tom Wicker used to write “from the back of the (campaign) bus.” When I covered politics many years ago, my editors insisted that I spend time driving and flying around with the candidates, and write about what I saw, heard and felt. It wasn’t important to predict who would win.  Rather, my job was to give readers an insight into the candidates, and help them answer the question: would you have this guy over for dinner?

In any event, public opinion polling is here to stay. It does make you wonder, though, what it must have been like back in the days when, without polling to foreshadow the outcome, people didn’t have a clue who was going to win on election day. Not knowing was a lot more fun.  Which is not a bad thing at all.



Journalism’s Image Problem

You know those TV ads in which a new car buyer is swept into a room full of reporters, photographers and video cameras, and then asked a couple of softball questions, like “how much do you like your new Fusion?” In my mind, these ads parody the actual news gathering process of daily journalism, turning it into a marketing gimmick — and all with a wink-wink of the advertiser’s eye.

Exaggerate much? Well, not if you think of journalism these days as sinking fast from the nation’s consciousness. The car ads are harmless in the sense that they were not created to disparage reporters. They are about selling cars. But in their way, the ads demean professional journalism, edited to lead the casual, uncritical viewer (of which there are probably millions) to think that the surprise interview was a real event, and that the people asking the questions are real reporters. Even if viewers can see through the ad, they nevertheless absorb the visual trappings of a news conference, deployed in the service of shilling for the very carmaker those “reporters” are ostensibly covering.

The ad campaign is but one of many ways in which professional journalism is being denigrated in contemporary society. Journalism has never enjoyed unqualified public support. But to most citizens, an active, aggressive and accurate news media serves the legitimate and necessary role of the attentive bulldog (or school marm) — a counterforce to unbridled government, pompous politicians, unscrupulous businessmen and the like. After Watergate in the mid-70s, this watchdog role was especially valued.  Enrollment at journalism schools skyrocketed; reporters were heroes.  Now that exalted role is diminishing, in part because the upheaval in the economics of publishing and broadcasting is thinning the ranks of working reporters and editors, thus reducing coverage, and in part because of the rapid rise and pervasive influence of online, digital media, where journalists are just another content provider. Less noticed is the way in which the image and reputation of traditional journalism is being trashed, with surprisingly little pushback from reporters, editors, journalism professors — and viewers and readers.

What’s driving this demise?  One theory portrays journalism as essentially irrelevant in today’s information age. Anyone with a camera phone and email can gather and report the news; no one needs training or professional experience to tweet or post a video on YouTube. Reporters today compete with online social media sites, “citizen journalists” and a global cadre of eyewitnesses who are documenting newsworthy events in real time. If there’s little editing or fact-checking of the raw information — an essential function of professional journalism — it seems to many a small price to pay for immediacy, “inside” information and gossip, and blog posts of the rich and famous.

Journalism also is under assault for another, less benign, reason. In this reckoning, journalism, with its claims of “objectivity” and professionalism, has actually impeded “progress” by its constant criticism and incessant second-guessing of American institutions and cherished beliefs — Congress, the Presidency, Wall Street, separation of church and state, and so on. Acting out a shoot-the-messenger obsession, journalism is seen as an obstacle — a tumor on the body politic — that needs excising or at least containment. The most vocal critics argue that journalism’s civic ombudsman’s role is a self-appointed one that is no longer needed in today’s society. Journalists’ work product — news articles and broadcast segments — is scorned by critics who claim that since “objectivity” is impossible, professional news gathering and reporting is just as biased in its own way as advertising or political rhetoric, and therefore in no need of either protection or a special role in society.

The belittlement of journalism is on display as the 2012 Presidential campaign gets underway. During the recent Iowa GOP Presidential caucus, evangelical ministers (hardly unbiased observers) served as supposedly impartial interviewers for Michele Bachmann, Rick Santorum and Rick Perry, while mainstream reporters were kept away. Newt Gingrich’s prowess in the Presidential debates, you will recall, was based primarily on his sarcastic belittling of the questions of the reporters and moderators. None of the other candidates came to their defense. Mitt Romney’s disdain for reporters is palpable, while President Obama’s well-oiled media machine treats news professionals like cast members in a long-running political drama.

There are murmurs of discontent about all this in some journalistic quarters, but they are strangely muted.  To some degree, reporters are used to pariah status, and the criticism rolls off their backs.  But things are different now; newsrooms are emptying at an alarming rate, and papers literally are shrinking, leaving even less space for extended reporting. Overseas bureaus have closed, further isolating Americans from the rest of the world, and the audiences for TV news programming, never robust, are in glide-path decline. On the rise is everything the professional reporter despises:  buffoonery, bombast, reckless opinion, propaganda, spin and inaccuracy.  There’s more content available than ever before, but where it comes from, who is responsible for it, and whether it is even factual are all matters increasingly difficult to ascertain.

If there’s anything to be optimistic about, it would be the hope that journalism is subject to the same shifts in fortune as any other segment of society.  It just might be that a new generation of idealistic men and women will opt for a journalism career over law, medicine or business as a way to keep our nation on its toes. The form may change from paper to digital, but trained, dedicated news reporters will have an opportunity, especially online, for immediate, broadscale impact. The key is finding and nurturing those souls who feel they can make a difference, and in so doing be a force for good in the world.  That was the motivation for generations of reporters and editors back in the day.  They didn’t always measure up to their lofty ideals, but for those who adhered to the basic formula of accuracy, fairness, balance, objectivity, and most of all, a regard for facts, their influence was immeasurable.  We could use more of that today, couldn’t we?



Separating Fact from Everything Else

I have signed on as a volunteer for the News Literacy Project, which is an attempt by journalists to help middle and high school students separate facts from fiction in what purports to be news these days. It’s a great idea, because much of what we read as “news” is more and more canned content designed to promote or defend a point of view. With more people than ever obtaining their news online, it’s critical that readers understand where the stuff is coming from, and who is behind it.

I know something about this subject. After spending a decade as a print journalist, I passed over to the “dark side” and became a public relations professional, first as the spokesperson in a bruising U.S. Senate campaign in North Carolina, then for a major food retailer.  I also operate my own PR consulting company (where this blog originates).  Along the way, I spent a year teaching undergraduates at Miami University of Ohio, where I created my own course on the symbiotic relationship between the news media and public relations.

I don’t sense urgency on the part of the project organizers, but there should be. A functioning democracy relies upon an informed electorate.  John Q. Citizen ought to be able to find reliable, accurate, reasonably objective news and track who wrote and published the information.  That should be the baseline.  These days, however, fact, fiction, spin and propaganda have found a home online, and it’s difficult if not impossible for all but the well informed to know the difference.  Fortunately, at least for now, the public is skeptical. According to a new report from the University of Southern California, a sizable majority of people don’t trust the reliability of the information they access online. This is especially true of social media content, the study says.

Yet the USC report found something else that really should concern all of us, and makes the NLP even more important.  Jeff Cole, author of the study and director of USC Annenberg School’s Center for the Digital Future, said Americans tend to be more trusting of government and big media (emphasis added).

“Other countries are better at distinguishing good information from (the) unreliable,” he said. In repressive regimes where media is closely tied to the government, citizens grow adept at filtering truth from propaganda.

Our government is not considered repressive (at least outside the Tea Party), but the relationship between government and the major news media of this country is certainly a symbiotic one (there’s that word again), and a cause for growing alarm.  As citizens, we must understand the nature of this relationship, and raise our voices when we come across examples of spin or propaganda posing as news.  The first step in doing that is grasping the difference.



Is Journalism Giving Away its Brand?

Daily newspapers, these days, find themselves between a rock and a hard place. The dilemma has to do with how they should gather the news and then where to put it:  in the traditional (printed) newspaper, or online?

Up to now, most publishers have made a Solomon-like choice to do both.  Reporters cover stories and events, but what’s written goes online first on the paper’s website.  A fuller version, presumably, appears in the next day’s print edition. This is the established procedure at Gannett papers and probably most other news organizations, but there are growing indications that this process is neither profitable nor sustainable. Print circulation is steadily declining, and with it, ad revenue.  Newsrooms have suffered widespread staff reductions, leaving many newspapers reliant upon wire copy — or simply not covering much in the way of news. Online articles are shorter, deficient in background or perspective, and difficult to find underneath the pop-up ads that now dominate publishing company websites.

The debate over going digital, however, masks a much more fundamental problem for journalism.  Content can come from anywhere, not just — or not even — the newsroom.  News, feature items, commentary, restaurant reviews and nearly everything else that was once the purview of experienced, seasoned reporters and editors is now generated increasingly by, well, anyone. Consumer-generated content appears everywhere, as well:  on special interest websites, in the blogosphere, on Twitter, Facebook and dozens of other social media channels that promote (and profit from) citizen “conversation.”

Not surprisingly, the journalism establishment is struggling to cope, but the proverbial horse is clearly way out of the barn.  One of the more prominent bearers of bad tidings is John Paton, who runs NewsMedia Group, the nation’s second largest publishing empire.  Paton, recently featured in the New York Times, is steering his company away from daily publishing and into digital; eventually, he envisions, what we call journalism will be divvied up this way: a third of the news will be local content produced by professional journalists, a third will come from readers and community input, and a third will be aggregated from other news gathering sources, blogs, etc. Another industry observer, blogger Mathew Ingram, touches upon the same point:

“Newspapers as a distribution system just aren’t equipped to handle news as a process; (Ingram writes) printing a single version of a news event with no links and no updates (until at least the following day) fundamentally doesn’t make sense in today’s news environment. Looking at the news from a blogger’s point of view — as an amalgamation of Twitter and Storify and video and photos, with comments and updates and links — makes a lot more sense, but it doesn’t translate well into a print-focused culture.”

The transition from print to digital, and the corresponding increase in non-professional content generation, may make “news” more instantly available to the millions who surf the Internet.  But what’s being lost is credibility, and in this regard, newspapers have been slow to react.  The material consumers access online today is a mish-mash of truth, fiction, market-driven manipulation, rumor and propaganda.  Much of it is anonymous, and a significant portion of what is conveyed as fact is in truth little more than agenda-driven screed. In a very real sense, journalism is giving away its “brand” — reliable, objective information.

It doesn’t have to be this way.  There is still a sizable audience in cyberspace eager for credible reporting and in-depth analysis, and apparently, it’s an audience that doesn’t mind paying for it. What’s desperately needed is a renewed commitment to excellence by publishers, editors and reporters. For journalism’s survival, it will have to commit to quality standards for the content it publishes, or else see its business fade into digital irrelevance. Success will come to those publishers and news organizations who emphasize the reliability, credibility and perspective of their work product, whether that product appears on the printed page, a website, a blog or a tweet.  Signed articles, sidebars that describe how the article was put together, and reader-reporter dialogues are just a few ways that news organizations can rebuild their market share, by making believability their brand.

In the digital era, in other words, tradition can be a good thing.



History Gets a Drubbing on the History Channel

This blog is about how media impacts culture and society.  Television continues to exert an oversized influence on American life, and so its content — what it shows viewers — remains important, even as mobile and tablet technology become the media platforms of choice.

So, along with millions of Americans who possess an enduring interest in anything related to the Civil War, I tuned in on Memorial Day evening to the History Channel’s highly ballyhooed, two-hour docudrama on the iconic Battle of Gettysburg.  I wanted to see how the eponymous cable channel would treat this memorable military clash, especially against the backdrop of the meaning of the Civil War to contemporary audiences. I should have gone to bed.

The broadcast, produced by famous movie-director brothers Ridley and Tony Scott, was terribly disappointing.  Others — with far more knowledge of the details of Civil War uniforms, flags, formations and such — pretty much agree, and that was before they actually watched the show.  To this rank amateur historian, the show overlooked or misinterpreted any number of points about the battle that needed to be told accurately and with perspective. The result was bad television.

The Wheatfield at Gettysburg, Viewed from Little Round Top

To begin with, the production very definitely was not taped at Gettysburg, or anywhere near Pennsylvania.  The Czech Republic or Belarus seem more likely.  (One blogger asserts that South Africa was the location). The cast — actors playing soldiers fighting and dying — was another red flag as to this production’s provenance.  I have the sneaking suspicion that most of the cast were eastern Europeans. How else to explain the fact that not one actor in the entire drama had a speaking part, other than a few of the Johnny Rebs who were required to perform a rendition of the supposedly fearful rebel yell.  As these actors stumbled their way across what was meant to be Gettysburg’s hallowed Wheat Field, they sounded like farmers flushing out wild boar or turkey.  Or something.

The hardy cast members, it should be noted, kept reappearing scene after scene.  Undoubtedly to save money, pivotal scenes were shot simultaneously by two or three cameras, and then repeatedly spliced together to make it seem that we viewers were seeing the full panoply of mortal combat.  But after viewing the same scene over and over of Confederate troopers taking aim at fleeing Union troops in the streets of Gettysburg on “Day One,” I stopped counting the repetitions. Pickett’s famous “charge” across the Pennsylvania countryside was laughably ineffectual.  Rather than the 12,500 rebels who actually crossed to Cemetery Ridge, the charge in this show was rendered as a meager platoon of Confederates who kept being blown up by Union artillery, only to magically appear again, to be blown up several more times.  Some of the actors were killed at least a half-dozen times, yet later, you could pick them out in scenes of hand-to-hand fighting hours after they were killed off in the script. I’ve seen more realistic action on my son’s video games.

These are, except to Civil War buffs, minor, laughable blemishes.  But other parts of Gettysburg were downright embarrassing.  One example:  early on in the broadcast, a black woman was shown making her way by night through some woods.  The narrator used this scene (and a goofy map of the Underground Railroad) to summarize the entire issue of slavery, which was mentioned almost as an afterthought as a cause of the Civil War.  Bad enough that the escaping slave was identified as an “African American” — a contemporary term that is completely wrong for slaves in 1863.  Worse was the suggestion that Gettysburg was fought where it was because the town was an Underground Railroad stop. Lee’s army was in Pennsylvania to change the war’s increasingly static dynamic by taking the battle (and the carnage) to the Northern heartland. Gettysburg just happened to get in the way.

Making matters worse was the constant refrain by several historian talking heads that the fate of the United States hung in the balance at Gettysburg.  The three-day conflict was indeed a huge, bloody battle, rightfully honored in history.  But it was not the deciding battle of the war.  Most historians agree that Antietam, fought a year earlier not far away in western Maryland, was the far more crucial (and equally bloody) battle because it ended Confederate hopes for recognition as a sovereign nation by England and France. Besides, Antietam (or Sharpsburg) was largely a draw, with carnage aplenty, while Gettysburg is the better known because it contained not just a winner and a loser, but also all the elements needed — like Pickett’s Charge — for myth-making.

Little Round Top from Seminary Ridge (sketch by the author)

Worse, nowhere in the script was there a mention of the huge gamble of Lee’s thrust into Pennsylvania, or of the rift between the various Confederate commanders at Gettysburg.  James Longstreet, who bitterly opposed Lee’s strategy and went out of his way to delay Lee’s battle plans, was never mentioned — an astounding omission — equalled by the complete lack of any explanation of the folly of Pickett’s famous charge.  (Pickett survived the battle, and spent the rest of his life berating General Robert E. Lee for sending so many men to their doom). And the cause for which they died?  Better skip over that rather awkward reality.

But by far the most egregious historical error of History Channel’s Gettysburg was its almost complete — and intentional — disregard of the essential roles of commanding officers on the scene.  The aforementioned Longstreet. Winfield Scott Hancock, whose fearless leadership held the Union lines strong.  John Bell Hood, a bold Texan short on strategic sense (but long on courage) who drove his men to certain death attempting to scale Little Round Top. Or G.K. Warren, a brilliant Army engineer whose knowledge of topography saved the Union’s left flank from destruction.  The History Channel’s marketing boasted that the docudrama was intended to promote the rank and file rather than the officer corps, proclaiming: “At its core, this is the story of the soldiers on the ground, not the generals who commanded from behind the frontlines.” Well and good, except that the facts argue otherwise.  More than a dozen generals died in the battle, while others, like Hancock and Sickles, were seriously wounded.  If you set out to recount a major moment in history, as the History Channel claims it was doing, you should at least get the details right.

Does any of this make any difference?  Perhaps not, as long as Americans are willing to accept an idealized and bowdlerized history of their nation. I think it does make a difference, however.  Facts may be open to debate, but they are not fungible.  History is at once memory, tradition, and cultural celebration (or shame), but all of it is premised on accepted agreement about what actually happened. This production falls far short of that standard by conflating fact with image and bloodshed with nobility.

Let’s hope the current and coming generations of eighth graders are not required to view this debacle of an historical video.  In fairness, the History Channel rebounded slightly with its second Civil War broadcast on Generals Lee and Grant.  At least this presentation appeared to get most of the salient facts right.

Students could benefit from watching Lee and Grant.  But anyone wanting to know about the life of Civil War soldiers should read “Faces of the Civil War, An Album of Union Soldiers and Their Stories.” Another book should be required reading:  David Blight’s masterly “Race and Reunion, the Civil War in American Memory.”

You can purchase both books in the gift shop at Gettysburg.

