Do Denisonians Have Informed Opinions about Foreign Aid?

Miles D. Williams, Visiting Assistant Professor of Data for Political Research

[Note: This is a version of a post Dr. Williams did for Foreign Figures]

Survey experiments are a technique social scientists use when they want to measure something that’s hard to ask people about directly, or when they want to identify causal effects. I almost never use them. This isn’t because I don’t think they’re valuable; I just tend to ask questions that require macro, country-level data to answer.

But I broke with this trend recently, and I had fun doing it. Every semester, the 127 blog fields a student survey to get a pulse on a range of attitudes, identities, and behaviors on campus. Some of the questions deal with politics, some with religion, and some with mental health and study habits. For this iteration of the survey, I asked if we could embed an informational experiment about foreign aid.

Surveys show that one of the reasons people tend to favor cuts to foreign aid is that they overestimate how much the US federal government spends on it. Consistent with this, previous experiments find that when you inform people what share of the federal budget aid actually comprises, opposition to aid spending declines significantly.

I wanted to see if this replicates among students at Denison. Denison is a weird place. It’s a smallish liberal arts college, with a little over 2,000 students enrolled, and it exclusively offers undergraduate degrees. It’s also private, and it’s highly selective. This year it had a 17% acceptance rate. Its student body, therefore, isn’t exactly representative of the typical American, nor of the typical college student. But does this weirdness affect Denison students’ susceptibility to the same misconceptions about foreign aid that are pervasive across the country? I wanted to find out, and a survey experiment seemed like the best approach.

In all, 273 students participated in the survey, which was fielded earlier this month (April). We typically get closer to 500 responses, so 273 isn’t great. The survey usually runs in March; because it was delayed this year, it landed amid all the hustle and worry of the last month of the semester. Also, the weather is nicer, so who wants to sit in their dorm and take a long survey?

No matter; 273 students were still enough for me to get good results for my experiment. Here’s how it worked. All students were asked whether they think the US spends too much, just the right amount, or not enough on foreign aid, but half were randomly assigned to get an extra bit of context with the question: they were told that foreign aid typically makes up about 1% of the federal budget. Randomizing who got this information makes it possible to estimate its causal effect on how students answered the aid question.
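
To make the logic concrete, here’s a minimal sketch in Python of how random assignment turns a simple difference in proportions into a causal estimate. Everything in it (the response probabilities, the even split, the variable names) is invented for illustration; this is not how the actual survey was administered or analyzed.

```python
import random

random.seed(42)

# Hypothetical pool of respondents (the real survey had 273 participants).
n = 273
students = list(range(n))

# Randomly assign half the pool to see the 1%-of-the-budget fact.
random.shuffle(students)
treated = set(students[: n // 2])
control = [i for i in students if i not in treated]

# Simulated answers: treated students are less likely to say "too much".
# These probabilities are made up purely for illustration.
def respond(student):
    p_too_much = 0.15 if student in treated else 0.35
    return "too much" if random.random() < p_too_much else "other"

responses = {i: respond(i) for i in students}

def share_too_much(group):
    return sum(responses[i] == "too much" for i in group) / len(group)

# Because assignment was random, the two groups are comparable on average,
# so this simple difference in proportions estimates the causal effect.
effect = share_too_much(treated) - share_too_much(control)
print(f"Estimated effect on 'too much': {effect:+.3f}")
```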

Alright, let’s get to the interesting part. Did the experimental condition change responses to the question about aid? First, take a look at the overall responses to the aid question in the graph below. Only a quarter of students who took the survey said the US spends too much on aid, a far lower rate than you’d find in the general population. Denisonians appear far less aid skeptical than the typical American. Over a third said that the US doesn’t spend enough on aid, and a little over 41% said the US spends just the right amount.

My main takeaway from the data so far is that Denison students are, indeed, weird. But the results from the survey experiment, shown below, indicate that they’re only weird up to a point. The next figure breaks down student responses by treatment status: those told foreign aid typically comprises 1% of the federal budget versus those not given this information. Exposure to this information changed the likelihood that a student felt the US spends too much on aid, and the likelihood that a student felt the US spends too little. These differences are statistically significant, as indicated by the fact that the confidence intervals for these responses don’t overlap.

Only 14.3% of students who were told the US spends 1% of its budget on aid said it spent too much, compared with 34.8% in the control condition. That’s a difference of more than 20 percentage points, which is massive in the social sciences. Likewise, 42.1% of students in the information treatment said the US doesn’t spend enough on aid, compared with 26.5% of students in the control condition, a difference of nearly 16 percentage points. “Just the right amount” was the only response category where there wasn’t a statistically detectable change in attitudes.
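
If you want to sanity-check the non-overlap claim yourself, here’s a small Python sketch that rebuilds 95% confidence intervals from the reported “too much” shares using a normal approximation. The per-arm sample size of about 136 is my assumption (273 respondents split roughly in half), not a number reported with the survey.

```python
import math

def prop_ci(p, n, z=1.96):
    """Normal-approximation 95% confidence interval for a proportion."""
    se = math.sqrt(p * (1 - p) / n)
    return p - z * se, p + z * se

# Reported shares saying the US spends "too much" on aid. The arm size
# (~136 per group) is an assumption, not a figure from the post.
treated_p, control_p, n_arm = 0.143, 0.348, 136

for label, p in [("treated", treated_p), ("control", control_p)]:
    lo, hi = prop_ci(p, n_arm)
    print(f"{label}: {p:.1%} (95% CI {lo:.1%} to {hi:.1%})")
```

Under those assumptions, the intervals come out to roughly 8% to 20% for the treated group and 27% to 43% for the control group, so they indeed don’t touch.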

This result tells me that a good number of Denisonians overestimate how much the US spends on foreign aid. But once you tell them how much the US actually spends, they change their attitude.

By the way, the size of the effect (a roughly 20 percentage point drop in believing the US spends too much on aid) is small compared to findings in published research. One study found that opposition to aid dropped from 67% to 28% (a nearly 40 percentage point change) when people were exposed to information about its cost. This is another sign that the typical Denisonian starts out better informed about aid than the typical American.

Wrapping up, findings in social science don’t always replicate with new data in new settings. The effect that information about the cost of foreign aid has on aid attitudes seems to be one of the exceptions. What makes this simple informational intervention so powerful?

I don’t know for sure, but my guess is that it has to do with how much the average person overestimates the cost of foreign aid. When asked, Americans tend to say foreign aid comprises 25% of the federal budget. The true amount (1% of the budget) is one twenty-fifth of what most people think. Such a large discrepancy between beliefs and reality is just begging for a sizable reevaluation of attitudes.

But I think the very reason informational interventions are so effective at changing aid attitudes is a double-edged sword. My bold assertion is that if people get the amount spent on aid so completely wrong so often, they are not just poorly informed; they also don’t care that much. It’s trivial to google the aid budget; it’s not a state secret. But very few people bother to look this information up.

This raises an important practical question. If you tell enough people what the US aid budget really has been, will they care enough to push back against the shuttering of most US foreign aid projects? Getting more people to think we spend the right amount, or not enough, on aid isn’t the same thing as getting people to mobilize. While the effect of these informational interventions appears massive, I have doubts about their practical significance.

Miles D. Williams (“DrDr”) is an avid gym rat and wannabe metal guitarist who teaches courses for Data for Political Research. He writes about data and international relations for his bi-weekly newsletter, Foreign Figures.
