Case Study – How Not to Deliver an Expert Study

Expert Studies "Do's" and "Don'ts"

On April 15, 2008, a study entitled "The Taxpayer Costs of Divorce and Unwed Childbearing" was publicly released. Missed it? You're not the only one. The study received little national media attention, despite (or perhaps because of) its conclusion that divorce and unwed childbearing cost taxpayers $112 billion annually, a potentially explosive claim. So, what happened? Simple, really: the team putting together the study didn't ask, at the outset of the process, the questions that would tell them what they needed to know moving forward. The result: another study sitting on a shelf somewhere instead of being used to advance a solution.

Here's what the ideal situation looks like, what this study's backers did and what you can do to avoid their fate:

Choose an expert who is accessible, interesting and not easily impeached

Experts with easily located CVs and wide bodies of work, who are difficult to associate with a larger philosophical or ideological position, are more likely to be given the benefit of the doubt by key decision makers. They are also better positioned to overcome the perception that they are simply guns for hire, willing to say whatever the people paying them want said.

In the case of the divorce cost study, the sponsors selected Benjamin P. Scafidi, an academic economist and former education policy advisor to Georgia Governor Sonny Perdue. The sponsors said that Scafidi was chosen because of "his expertise and [willingness] to invest the serious amount of time and effort to do a careful study and bring in the best scholars as advisors." When questioned, however, Scafidi said his research interests were public finance, education and urban policy, and acknowledged that he had never before researched marriage in this capacity.

In addition, Scafidi's qualifications are not easily accessed online. The Associated Press originally reported that Scafidi was employed at Georgia State University, but later had to issue a correction identifying his employer as Georgia College and State University. Scafidi's difficult-to-find homepage at Georgia College and State University lists his 1997 Ph.D. from the University of Virginia, his research interests and three publications. The only other resume available online dates from September 1998.

When choosing an expert, vet them at least as thoroughly as potential detractors might

In this case, it was easier to learn what Scafidi did as Governor Perdue's education policy advisor than it was to get a copy of his CV. And Scafidi was often quoted supporting Perdue administration positions that detractors could use to paint him as an ideologue.

Some examples:

In response to dwindling funding for Georgia's HOPE Scholarship in 2003, Scafidi agreed with Perdue's proposed solution of requiring a minimum SAT score of 900 in addition to a 3.0 high school GPA. When questioned about why making scholarships harder to attain was wise policy, Scafidi implied that, until the policy change, scholarships had been going to students who would otherwise flunk out of school, arguing that SAT scores are representative of academic achievement:

"If you have low SAT scores, you don't succeed in college. That's unfortunate, but it's a fact." (The Macon Telegraph, "Educators Focus on No Child Left Behind at Rally," October 26, 2003)

Detractors could use Scafidi's comments on class-size reduction to paint him into a similar ideological box. Although the Georgia Legislature approved phasing in smaller class sizes in 2000 under Governor Roy Barnes, Perdue delayed the final phase, which would have reduced class sizes in the remaining grades, in his 2003, 2004 and 2005 education bills. In support of this position, Scafidi was quoted as saying that reducing class sizes could hurt children:

"It's possible that by cutting class sizes, we can harm average student achievement." (US Fed News, "How Many Kids Can Gov. Perdue Squeeze into Classroom," March 31, 2005)

Offer a solution; don't simply identify a problem

Decision makers already know what isn't working; expert studies should demonstrate how to fix what's broken, not merely quantify the mess.

Despite claims that the Scafidi study was conducted for informational purposes, it was commissioned and paid for by marriage advocacy and religious interest groups. According to the study's backers, Chuck Stetson, a New York investment banker who established the Bible Literacy Project, approached David Blankenhorn, the president of the Institute for American Values, with a grant to conduct a study that would quantitatively determine the cost of fragmented families in America.

At the Washington, DC press conference releasing the study, Blankenhorn said that, while the study itself made no recommendations, its sponsors thought the findings might convince the government that strengthening marriage was a solution to easing the social and economic costs incurred by fragmented families.

When a study fails to identify a solution or an appropriate course of corrective action, the likelihood that it will be received as an advocacy piece rises sharply, while the possibility that decision makers will act is all but eliminated.

Clear, concise and strong; not convoluted, defensive and repetitive

A short study with a clear thesis and methodology helps decision makers reach better, more beneficial decisions. Studies that repeatedly reference the righteousness or defensibility of the advocated position insult potential allies and risk embarrassing core supporters.

The Scafidi study and its rollout provide a good example of what not to do in this regard. At the press conference announcing the study, Scafidi consistently characterized his conclusion that divorce and unwed childbearing cost taxpayers $112 billion annually as an underestimate. During his 16-minute public presentation, he spent over a third of his time defending the estimate as too conservative, mentioning sources of underestimation approximately 14 times. In the 44-page report, Scafidi stated at least 29 times that he had underestimated the total.
