While there was much anticipation within the food and nutrition community for the release of the Dietary Guidelines for Americans, the Advisory Committee Report released in June 2010 prior to the Guidelines also broke new ground. For the first time in the 30 years that the Dietary Guidelines have been published, the report was entirely evidence-based, with rigorous application of a new systematic review process developed as part of the U.S. Department of Agriculture’s Nutrition Evidence Library.
In addition, the recommendations put prevention at the top of the list of priorities for many stakeholders. And as the focus on prevention initiatives increases, particularly on local and community levels, so will the competition for funding opportunities to develop and implement prevention initiatives. “Evidence and evaluation allow us to promote the value of prevention in terms of lives saved, diseases prevented and dollars and cents saved in health care costs,” says registered dietitian Elvira Souza, MPH, MS, RD, public policy chair of the Public Health and Community Nutrition dietetic practice group. “Providing the evidence that programs are effective puts us in a better position to compete for resources.”
However, when funders, legislators or even other researchers want irrefutable proof that an otherwise inevitable future event was evaded, some of the challenges of proving prevention emerge. “One touchy aspect of this endeavor is that one never knows what would have happened without preventive protocols,” says medical researcher Tom Wnorowski, PhD, CNCC. “Without trials of some sort, convincing a hard scientist can be difficult. If you drive to work and do not have an accident, what did you do to prevent it? If you go the same route, follow the same routine and keep everything else constant, how do you explain the fender bender from the man who rear-ended you?”
“Sometimes professionals, in addition to lay people who are tuned into the scientific method, run the risk of always thinking in terms of treatment and control,” says registered dietitian Suzanne Pelican, MS, RD, a recently retired food and nutrition specialist from the University of Wyoming Cooperative Extension Service. “It is the way many of us were trained to evaluate data, and that is just not the way it works in large-scale community-based interventions, where you have people who are free living surrounded by lots of variables that cannot be controlled.”
In addition, pre- and post-intervention biomarker measurements can help demonstrate efficacy, but empirical science generally requires a control group—which most community nutrition programs will not have. “You can’t withhold intervention from people to see if there is a different health outcome,” says registered dietitian Robin Foroutan, RD, communications chair of the Dietitians in Integrative and Functional Medicine dietetic practice group. Preventive care is a major component of functional medicine—a field that, according to Foroutan, faces many of the same challenges as prevention when it comes to generating evidence-based research.
“We address what are regarded as pre-disease conditions that, when corrected, theoretically stop the path of disease,” says Foroutan. But another factor—and central tenet of functional medicine—is the individual’s genetic uniqueness. “In terms of research and evaluation, this proves to be a challenge because it means that different therapies work for different people.”
Despite the challenges associated with proving that prevention works, registered dietitians are finding ways to address these challenges through strategies in program design and evaluation. Tips include:
- Eliminate assumptions and assess actual needs: To be effective, you need to truly understand the people and the underlying causes of an issue in a community.
- Focus on translational research: Be sure your intervention has fidelity—that you can sustain the intervention over a long period of time and that the people in your study actually are able to make those changes.
- Use best practices and adapt interventions from similar programs: Including evidence-based prevention strategies and techniques may not only strengthen the program and increase its effectiveness, but also give funders science-based evidence of its preventive value even before outcomes can be measured.
- Build ongoing evaluation into your design: Veterans in the field emphasize the importance of determining how a program’s effectiveness will be measured, particularly if you must supply annual data in order to receive further support. And consult a statistician, who will use data from other studies to calculate how many participants are needed and how long the study must last in order to show whether there has been any impact.
- Generate PR for your program: Personalize your reports with stories or profiles of participants in the program, and include emotion-based messaging. Stories about lives changed remind legislators, funders and ourselves that prevention is about people, not just numbers.
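To illustrate the statistician's role mentioned above, here is a minimal sketch of the kind of sample-size calculation involved. It uses the standard normal approximation to a two-group comparison; the effect size, significance level and power shown are illustrative conventions, not values from any program described in this article, and a real community study would also adjust for attrition, clustering and the uncontrolled variables noted earlier.

```python
from math import ceil
from statistics import NormalDist  # Python 3.8+ standard library

def sample_size_per_group(effect_size, alpha=0.05, power=0.80):
    """Approximate participants needed per group to detect a given
    standardized effect size (Cohen's d) in a two-group comparison,
    using the normal approximation to the two-sample t-test."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    return ceil(2 * (z_alpha + z_beta) ** 2 / effect_size ** 2)

# Illustrative values: a "medium" effect (d = 0.5) at the conventional
# 5% significance level and 80% power.
print(sample_size_per_group(0.5))   # 63 participants per group
# A smaller effect (d = 0.2) demands far more participants,
# which is one reason prevention studies must run long and large.
print(sample_size_per_group(0.2))   # 393 participants per group
```

The doubling of required sample size as the expected effect shrinks is exactly why veterans in the field urge building evaluation into the design from the start rather than retrofitting it.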