The evaluation conundrum: what to do when programs suddenly stop?

9 February 2017

Evaluation teams need better strategies for dealing with sudden changes in complex programs if we are to build the evidence for these sorts of public health interventions, says evaluation expert Professor Adrian Bauman.

Commenting on the announcement that funding for South Australia’s OPAL (Obesity Prevention and Lifestyle) program will cease in July, Professor Bauman said shifting political priorities meant sudden change should come as no surprise to evaluators of complex programs.

He said it was important to collect impact and process data consistently throughout a project so that up-to-date evaluation was still possible if the project was suddenly dropped.

“There is a conundrum here,” said Professor Bauman, lead of the Prevention Centre’s Rapid Response Evaluation Capacity and Sesquicentenary Professor of Public Health at the University of Sydney.

“There is a lack of evidence for multi-sectoral, multi-year, multi-component programs in public health, and that lack of evidence is made worse because evaluators think it’s going to be a multi-year project so they don’t have the data when the policy/political goalposts are changed overnight.

“Often evaluation is funded by government to collect data at baseline and at the end of a program … but evaluators need to be smarter, to think about getting intermediate measures because there may not be an expected end of the program.”

Largest intervention

The $35 million OPAL program was launched in 2009 as the largest intervention of its type in South Australian history, targeting about 70,000 South Australian children in more than 100 schools with the aim of increasing the number of children in the healthy weight range, and improving diet and physical activity levels. Under an agreement reached between local, state and federal governments, OPAL funded participating local councils to deliver the program for five years.

OPAL continued with modified services and a condensed program after the Federal Government withdrew its agreed funding in 2014, which limited the evaluation of the program.

The Flinders University OPAL Evaluation Project final report, released in December, found there were no statistically significant changes in the proportion of children of healthy weight by the end of OPAL compared with control communities. The South Australian Government announced in December that funding would conclude, as planned, in June 2017.

Professor Bauman praised the evaluation team for collecting some process data on implementation, and for recognising the disconnect between hard outcomes, such as childhood obesity rates, and the many years of program implementation required to shift them.

Positives for communities

SA Health’s Director of Public Health Services Dr Kevin Buckett said that, despite the condensed program, OPAL highlighted a number of positives for local communities. “We will continue to explore community-based obesity prevention programs,” he said.

Professor Bauman said that if the evaluation of programs like OPAL was to be meaningful in constantly changing political environments, it was important to have a standardised, consistent evaluation team with a stable direction, and a strong emphasis on process (implementation) indicators.

“We also need real partnerships between government departments and research evaluators, which requires a lot of trust and honest communication,” he said.

The OPAL evaluation has been used as a case study in a number of Prevention Centre cross-jurisdiction evaluation forums, which are designed to bring together evaluators from around the country to workshop better ways of evaluating public health programs.

A second component of the OPAL evaluation program is an integrative evaluation currently being completed by the University of South Australia and due to be released mid-year.

Helen Signy, Senior Communications Officer