MAKING A DIFFERENCE:
Covering Campaign '96
A report on the Poynter Election Project
by Philip Meyer and Deborah Potter
"After each election, journalists swear off the bottle. No more silly scandals or mindless horse-race stories. No more photo ops and thirty-second smears. Never again will we allow ourselves to be manipulated by campaign consultants. This time we will stick to substantive issues whether the candidates want to talk about them or not. And then, like struggling alcoholics, we take one drink and fall off the wagon." -- Howard Kurtz, Media Circus
After spending several years encouraging journalists to change the way they cover campaigns and elections, The Poynter Institute decided to study what news organizations did in 1996. Our goal was to see if those who promised reform actually delivered, and to find out if citizens were any better served.
OVERVIEW:
Our study looked at print and broadcast coverage in 20 markets across the country (see chart A). We surveyed journalists by mail and citizens by telephone, both before and after the election. We also collected coverage on randomly chosen dates in the final seven weeks of the campaign. Every story that mentioned a presidential or senatorial candidate and the election was coded for analysis. We then grouped news organizations based on their coverage plans, and looked to see what difference that made in their newsrooms, their content and in their communities.
We used the newsroom survey to determine the basic approach each news organization planned to take in covering the 1996 elections. Seven of the questions were so closely related that they defined a single concept, which we chose to call "citizen-based journalism" (CBJ). The seven items (an illustrative scoring sketch follows the list):
- Sponsor one or more public forums on issues
- Use polls to establish the issues your coverage will focus on
- Conduct focus groups with voters to establish their concerns
- Form citizen panels to consult at different stages of the campaign
- Seek questions from readers/viewers for use when interviewing candidates
- Base reporting largely on issues developed through citizen contact
- Provide information to help citizens get involved in the political process in ways other than voting.
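As a purely illustrative sketch (the survey instrument itself is not reproduced here), the Python code below shows one way a CBJ intent score could be computed from these seven items, assuming each item was answered on the 1-to-3 likelihood scale described in chart A and averaging first within respondents, then within newsrooms. The column names and sample data are hypothetical.

    # Illustrative sketch only: combine seven survey items into a CBJ intent score.
    # Assumes each item was answered on a 1-3 scale (1 = unlikely, 3 = very likely).
    # Column names and sample data are hypothetical, not the actual instrument.
    import pandas as pd

    CBJ_ITEMS = [
        "sponsor_forums", "polls_for_issues", "focus_groups", "citizen_panels",
        "reader_questions", "citizen_driven_issues", "enabling_information",
    ]

    def cbj_score(responses: pd.DataFrame) -> pd.Series:
        """Average the seven items per journalist, then average within each newsroom."""
        per_journalist = responses[CBJ_ITEMS].mean(axis=1)
        return per_journalist.groupby(responses["newsroom"]).mean()

    # Two journalists from each of two hypothetical newsrooms:
    answers = pd.DataFrame({
        "newsroom": ["Paper A", "Paper A", "Paper B", "Paper B"],
        "sponsor_forums":        [3, 3, 1, 1],
        "polls_for_issues":      [3, 2, 1, 2],
        "focus_groups":          [3, 3, 1, 1],
        "citizen_panels":        [2, 3, 1, 1],
        "reader_questions":      [3, 3, 2, 1],
        "citizen_driven_issues": [3, 3, 1, 1],
        "enabling_information":  [3, 2, 1, 1],
    })
    print(cbj_score(answers))   # one score per newsroom on the same 1-3 scale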
Newspapers were much more intent on practicing citizen-based journalism than were TV stations. And there was much more variation among the newspapers, with some high and some low in intent. The broadcasters were more uniformly low. (See charts A, B and C)
Because the newspapers showed a more interesting diversity of approaches, and because our analysis of TV content is not yet complete, the rest of this report deals only with newspapers.
RESULTS:
In counties where newspapers had a high intent to practice citizen-based journalism (high-CBJ), citizens had significantly:
- More knowledge about candidate stands on issues
- More trust in the media
In counties where newspapers had a high intent to practice citizen-based journalism (high-CBJ), there are hints that citizens had:
- More social capital
We found no significant connection between intent to practice citizen-based journalism and:
- Trust in government
- Political participation
Newspapers with a high intent to practice citizen-based journalism (high-CBJ papers) had content that was noticeably different:
- More stories mainly devoted to explanation of specific policy issues
- Fewer stories containing any mention of horse-race polls
Journalists at papers with a high intent to practice citizen-based journalism (high-CBJ papers) handled and assessed their coverage differently:
- More advance planning and discussion
- Greater satisfaction with their coverage
Each of these findings is explained in detail below.
1. Political knowledge
Citizens in the 10 counties that were high in intent to do citizen-based journalism learned more in the course of the campaign than citizens in the low-CBJ counties.
We asked three questions about candidate issue positions to determine political knowledge in the communities we studied. About 20 percent of our sample knew the right answers in August, in both high-CBJ and low-CBJ counties, but in November, significantly more citizens in high-CBJ counties got the answers right.
GROUP      AUGUST   NOVEMBER
Low CBJ    20       24
High CBJ   21       31
(percent answering the issue questions correctly)
The difference is even more striking among those people in the November sample who had not taken part in the earlier survey. We asked them the same three plus two new questions about candidate positions. Almost twice as many people in the high-CBJ counties got the right answers.
GROUP      NOVEMBER
Low CBJ    11
High CBJ   21
(percent answering correctly, November-only respondents)
Our survey therefore supports those editors who believe that their techniques will improve voter knowledge. However, it contradicts a belief cherished by some of them: that horse-race polls distract voters from the substance of the election and keep them from learning about the issues.
We tested this belief by comparing November issue knowledge among people who were highly aware of the polls in August with that of people who had not followed the polls at all.
POLL AWARENESS (AUGUST)   ISSUE KNOWLEDGE (NOVEMBER)
Unaware                   15
Highly aware              40
More than twice as many of the poll-savvy people knew where the candidates stood on the issues. Of course, they might have known more to begin with. But even after the effects of age, education and prior knowledge were accounted for, attention to the polls explained a significant part of the difference in knowledge in November. For each 1-point increase in poll knowledge in August, issue knowledge in November climbed by 0.13 points. Polls, by providing context for and arousing interest in the campaign, appear to help voters focus on the candidate issue positions.
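For readers who want to see the shape of this kind of control analysis, here is a minimal Python sketch. It is not the authors' actual model; the respondent-level file and column names are hypothetical, and ordinary least squares with the listed controls is simply one standard way to "account for" age, education and prior knowledge.

    # Minimal sketch of a control analysis like the one described above.
    # Not the authors' actual model; the file and column names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    panel = pd.read_csv("panel_respondents.csv")   # hypothetical file of re-interviewed citizens

    model = smf.ols(
        "issue_knowledge_nov ~ poll_awareness_aug + issue_knowledge_aug + age + education",
        data=panel,
    ).fit()
    print(model.summary())

    # A significant coefficient of about 0.13 on poll_awareness_aug would correspond
    # to the finding reported here: each 1-point gain in August poll knowledge goes
    # with a 0.13-point gain in November issue knowledge, other things being equal.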
2. Trust in media
Citizens in high-CBJ communities were less mistrustful of the media.
Well over half of the people in our 20-city sample are cynical toward their news media. In November, 62 percent said the news media are run by a few big interests looking out for themselves, and 58 percent said the media deserve a lot of blame for the way the political process works. But significantly more citizens in low-CBJ counties than in high-CBJ counties fell into the most mistrustful group.
GROUP      MEDIA MISTRUST
Low CBJ    50
High CBJ   38
(percent in the most mistrustful group)
Even after the effects of party, age, race, and education have been filtered out, media bashing declines as citizen-based journalism increases. Media mistrust declined by a tenth of a point on the 2-point scale for each 1-point increase in CBJ intent.
3. Social capital
Citizens in high-CBJ counties had the most trust in others -- social capital -- in August and November.
In November, 52 percent of the people we surveyed said most people can be trusted; 65 percent said most people try to be helpful; and 68 percent said most people try to be fair.
We combined these answers into a measurement of social capital, and looked to see if it changed during the campaign, comparing the percentage of citizens in each subgroup who gave the most trusting answers.
GROUP      AUGUST   NOVEMBER
Low CBJ    32       39
High CBJ   44       54
(percent giving the most trusting answers)
These numbers look significant, but we found most of the difference between high and low-CBJ counties was due to other factors. A re-interview effect meant that people who thought about the questions between August and November expressed more trust the second time around. Demographic variations also played a role: we found effects from education (better educated people are more trusting); age (older folks trust more); and race (non-whites are less trusting).
In November, after the election, with the same demographics held constant, the effect is just barely significant. Our tentative interpretation is that the effect of CBJ on social capital is potentially important but too tenuous to measure with confidence in a study of this scope. We think this finding deserves further investigation.
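One way to read "with the same demographics held constant" is a model like the sketch below, again with hypothetical variable names: the three trust items are summed into a social-capital index, and the CBJ measure is entered alongside a re-interview indicator and the demographic controls mentioned above. This is an illustration of the approach, not the authors' actual model.

    # Illustrative sketch: a three-item social-capital index with a re-interview
    # indicator and demographic controls. Variable names are hypothetical.
    import pandas as pd
    import statsmodels.formula.api as smf

    nov = pd.read_csv("november_respondents.csv")   # hypothetical respondent-level file

    # Each item is 1 if the respondent gave the trusting answer, 0 otherwise.
    trust_items = ["people_can_be_trusted", "people_try_to_help", "people_try_to_be_fair"]
    nov["social_capital"] = nov[trust_items].sum(axis=1)

    model = smf.ols(
        "social_capital ~ county_cbj_intent + reinterviewed + age + education + nonwhite",
        data=nov,
    ).fit()
    print(model.params["county_cbj_intent"], model.pvalues["county_cbj_intent"])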
4. Trust in government
Citizens in high-CBJ counties were less cynical about politics, but only slightly, and only in November.
At least two-thirds of the people in our 20-city sample are cynical about politics. In November, 66 percent said the government is run by a few big interests looking out for themselves, and 72 percent said Washington can never or rarely be trusted to do what is right.
Non-whites, older citizens and the less educated were the most cynical when the August poll was taken. These groups were still the most cynical at the end of the campaign, and they had been joined in their dour outlook by a fourth faction: Republicans. Forty-two percent of the self-described strong Republicans were in the high-cynicism group after the election, compared to 36 percent before.
5. Political participation
Citizens in high-CBJ counties were no more or less politically active than citizens in low-CBJ counties.
In theory, citizen-based journalism should increase the rate of voter turnout and other kinds of political participation. We found no such effects.
Participation was measured by asking citizens how closely they followed the presidential campaign, whether they tried to convince other people to vote a certain way, whether they attended political meetings, had political discussions with friends, and whether they themselves voted. None of these had any visible relationship to the intent of their local media to practice citizen-based journalism.
All of the effects found so far, then, have been inside citizens' heads: CBJ has moved their attitudes but not their feet. In a more closely contested presidential election, this outcome might have been different.
6. Newspaper Content
High-CBJ papers had more stories that were mainly devoted to explanation of specific policy issues. And they had fewer stories containing any mention of horse-race polls.
The movement to change the way the media cover national elections was, for newspapers at least, a real, behavior-modifying event and not just an empty resolution. In those two ways, the difference was visible.
We did not, however, find that these newspapers spent any less time on conflict-oriented stories or on the strategic and tactical maneuverings of politicians. While our 20 counties did vary on these dimensions, the variation was not related to their intent to practice citizen-based journalism.
The percentage of stories mainly about policy issues ranged from a high of 19 percent (Portland, OR) to a low of 3.5 percent (Little Rock). After the usual demographics were held constant, issue content had a small but significant effect on trust: the more issue stories, the more citizens trusted the newspaper.
It seems likely to us that this effect is indirect; that papers with more issue stories are doing other things that relate to trust, and the issue count is only an intermediate variable. If a direct effect exists, we should find issue content increasing citizens' issue knowledge. But we don't.
When the effect of what the citizens already knew in August is controlled, their November knowledge is unaffected by the amount of issue content that editors put in the paper during the intervening period. The effect is not just insignificant; it is about as close to zero as it is possible to get. In sum, election content, as we measured it, has very little to do with the effects that were so clear when we connected media intent to citizen attitudes. We'll try to explain why in the summary below.
7. Newsroom assessments
Journalists at high-CBJ papers spent more time planning and discussing their coverage, and were more satisfied with the outcome.
After the election, we asked journalists about the advance planning that went into their coverage. On a scale of 1 to 4, where 4 was "a lot" and 3 was "some," the high-CBJ newsrooms averaged 3.83, while the low-CBJ newsrooms averaged 3.46.
In high-CBJ newsrooms, journalists gave themselves slightly higher marks for their campaign coverage: an average of 4 on a scale of 1 to 5, compared to 3.8 from journalists in low-CBJ newsrooms. But when we asked them what they would have added to make their best campaign story an "ideal" story, few journalists in high-CBJ newsrooms mentioned any elements that would have improved their favorite story. In low-CBJ newsrooms, one-third of the journalists (33 percent) said "citizen voices" would have improved their favorite story.
SUMMARY AND CONCLUSIONS
What ends up inside a citizen's head after an election campaign is related to what the media serving that citizen are trying to do. We found a connection between media intent and:
- Knowledge about candidate stands on issues
- Trust in the media
- Social capital (although the connection is weak and ambiguous)
But the connection is not as clear between media content and these same measures. It could be that our content analysis was faulty. Or it could be that effects are caused not by media coverage in 1996 but by something that came before. In other words, something as yet undefined and undetected (at least by us) appears to be causing both the media intent that we measured and the subsequent citizen effects.
The newspapers that expressed intent to do citizen-based journalism in June of 1996 did not get the idea at just that minute. Something in their organizational culture made them ready for the CBJ approach, and that something might already have had an effect on the community. The fact that trust in media is higher in those counties in both the August and November segments of our panel survey supports this notion.
What is even more likely is that citizen-based journalism involves a basic cultural change that affects both content and a newspaper's relationship to its community in ways that we have not yet learned how to conceptualize, much less measure. The idea offers a fertile field for further research.
1996 POYNTER ELECTION STUDY
MARKETS:
Our study sampled the content of newspapers and commercial television stations in 20 markets across the country. All were in states holding elections for the US Senate, most of those races rated "competitive" by Congressional Quarterly. We selected newspapers where we expected to find a variety of coverage approaches, based on their history, public statements by their managers, and their affiliation with non-profit groups that were encouraging experimentation. We tried to balance the sample across a wide range of circulation sizes. Several newspapers were involved in election-coverage partnerships with local television stations, which also became part of our sample. The other television stations were chosen because they had the top-rated local evening newscast, based on publicly available figures as of January 1996. (See chart A)
JOURNALISTS:
We conducted a two-stage mail survey of five people in each of the 40 newsrooms. In print newsrooms, surveys were sent to the managing editor, political editor, city editor, chief political reporter and a second political reporter. In broadcast newsrooms, surveys were sent to the news director, assignment manager, evening newscast producer, chief political reporter and a second political reporter. The first survey was conducted over the summer, with a response rate of 70 percent. The second was conducted after the election, with a response rate of 65 percent.
CITIZENS:
We conducted a two-stage telephone survey of at least 50 voting-age adults in each of the 20 markets we studied. In order to measure the possible effects of different types of political coverage, we asked a variety of questions about political attitudes, knowledge, and behavior, including voting. Several questions were added to assess what might be called social capital: trust in government and in other people, and involvement in the community and in the democratic process. The first survey was conducted August 1-11, 1996, before the first political convention. The second was conducted November 6-17, 1996, immediately after the election. Both surveys were conducted by FGI Integrated Marketing of Chapel Hill, N.C.
COVERAGE:
We collected newspapers and taped television newscasts over a period of seven weeks between Labor Day and Election Day. We drew ten samples at five-day intervals, beginning September 19 and concluding November 3, the Sunday prior to the election. All stories regarding either the presidential or Senate campaign were then isolated for content analysis.
ANALYSIS:
Two UNC graduate students coded the content of each story in our sample. They noted whether stories dealt primarily with the following topics: specific policy issues in the campaign; character issues; candidates' criticism of one another; campaign hoopla and logistics; campaign contributions; horse-race polls; issue polls; ad critiques; enabling information (to help citizens know how to participate); comparisons of candidates. The coders noted how often citizens were quoted or paraphrased. They also noted the author, length and location of each story, and whether it was a spot-news or feature/in-depth story.
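As a hedged illustration of how story-level codes like these can be rolled up into the paper-level measures used earlier (for example, the share of stories mainly about policy issues in section 6), here is a short Python sketch; the coding-sheet column names are our own invention, not the actual instrument.

    # Illustrative sketch: aggregate story-level codes into paper-level measures.
    # Coding-sheet column names are hypothetical.
    import pandas as pd

    stories = pd.read_csv("coded_stories.csv")   # one row per coded story

    by_paper = stories.groupby("newspaper").agg(
        pct_policy_issue=("main_topic", lambda t: (t == "policy_issue").mean() * 100),
        share_horse_race_poll=("mentions_horse_race_poll", "mean"),   # 0/1 column
        citizen_quotes_per_story=("citizen_quotes", "mean"),
    )
    print(by_paper.sort_values("pct_policy_issue", ascending=False))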
CHART A
NEWS ORGANIZATIONS & MARKET RANKINGS BY CBJ
When TV and newspaper responses are combined (with equal weight), the 20 markets in our study show a good deal of variability. The average score was 2.0476 on a scale of 1 to 3, where 3 means very likely to practice citizen-based journalism and 1 means unlikely. Here are the rankings:
CITY           STATE   NEWSPAPER                    TV      CBJ SCORE
Portland       ME      Press Herald                 WGME    2.6250
Charlotte      NC      Charlotte Observer           WBTV    2.6000
Raleigh        NC      News and Observer            WTVD    2.5286
Wichita        KS      Wichita Eagle                KWCH    2.5274
Boston         MA      Boston Globe                 WCVB    2.3929
Chicago        IL      Sun-Times                    WMAQ*   2.3095
Minneapolis    MN      Minneapolis Star-Tribune     WCCO    2.2857
Portland       OR      Oregonian                    KATU    2.2464
Norfolk        VA      Virginian Pilot              WAVY    2.2321
Columbia       SC      The State                    WIS     2.1786
Austin         TX      Austin American-Statesman    KVUE    2.0786
Rockford       IL      Register-Star                WREX    1.8810
Des Moines     IA      Register                     KCCI    1.8750
Birmingham     AL      Birmingham News              WBRC    1.8571
Atlanta        GA      Atlanta Constitution         WSB     1.8476
New Orleans    LA      NO Times Picayune            WWL     1.7381
Richmond       VA      Richmond Times Dispatch      WWBT    1.6429
Houston        TX      Houston Chronicle            KTRK    1.5060
Little Rock    AR      Democrat Gazette             KATV    1.3857
Grand Rapids   MI      Grand Rapids Press           WOOD    1.3452

*In Chicago, the top-rated station was WLS, but due to an error our tape sample includes the newscasts of WMAQ.
CHART B
NEWSPAPER RANKINGS BY CBJ
Newspapers are more CBJ-oriented than television. Their average score was 2.2285. Here are the rankings of the 20 markets when only newspapers are considered:
Charlotte      3.0000
Norfolk        2.8929
Portland OR    2.7214
Portland ME    2.7143
Wichita        2.6976
Minneapolis    2.6667
Boston         2.6429
Rockford       2.6190
Raleigh        2.5143
Columbia       2.3714
Chicago        2.3095
Des Moines     2.1786
Richmond       2.1429
Atlanta        2.0286
Austin         1.9286
Birmingham     1.7857
New Orleans    1.7143
Houston        1.4405
Little Rock    1.2000
Grand Rapids   1.0000
CHART C
TELEVISION RANKINGS BY CBJ
For the TV stations the mean was 1.8573. Here are the market rankings when only TV intentions are counted:
Raleigh        2.5429
Portland ME    2.5357
Wichita        2.3571
Austin         2.2286
Charlotte      2.2000
Boston         2.1429
Columbia       1.9857
Birmingham     1.9286
Minneapolis    1.9048
Portland OR    1.7714
New Orleans    1.7619
Grand Rapids   1.6905
Atlanta        1.6667
Little Rock    1.5714
Des Moines     1.5714
Houston        1.5714
Norfolk        1.5714
Rockford       1.1429
Richmond       1.1429

Chicago is absent from the television list because we received no responses from the TV station that we sampled there.