The What Works Clearinghouse is funded by the US Department of Education with the aim of sorting through education research and determining which programs are supported by research. It publishes several kinds of reports.
Probably the best known are the Intervention Reports. In these, studies of particular interventions, such as an elementary school mathematics text, are collected and evaluated. To fully meet the WWC evidence standards, a study must randomly assign students to the intervention and comparison groups, and attrition from both groups must be small. Quasi-experimental studies, in which existing groups of students are carefully matched on demographics and baseline achievement, may meet the standards "with reservations."
Unfortunately, relatively few studies pass the WWC's stringent screen. As a result, a decision maker trying to use the reports to, say, pick a mathematics program may come away learning more about the quality of the research than about the relative effectiveness of different programs. Elementary mathematics offers a striking example of just how few studies made the cut.
One omission in the Intervention Reports is any consideration of possible bias stemming from conflicts of interest, such as funding from a program's publisher or a researcher's pre-existing preference for a particular philosophy of education. Some studies are strikingly candid that their purpose was to show that a program works.
That said, the What Works Clearinghouse appears to have addressed another possible source of bias. It has been criticized for not limiting its analysis to reports published in peer-reviewed journals. But in spreading its net beyond such journals, the WWC may be compensating for editors' preference to publish articles that show dramatic results over those that find no significant effect.
There are other frustrations in dealing with the WWC as well.
The WWC also puts out Practice Guides that make a series of recommendations in a given area (for example, teaching fractions in elementary school). In many ways these Guides are more useful than the Intervention Reports, because the authors make recommendations based on their expertise even in the absence of evidence that meets the WWC screen. In the process, they rate the evidence for each recommendation as weak, moderate, or strong, based on the WWC standards. For example, in the Practice Guide on teaching fractions, the authors make five recommendations; they rate two as having moderate evidence and the remaining three as weak. The Practice Guides might be viewed as a tacit acknowledgment that insisting on only the most impeccable evidence often leaves little guidance.
Quick Reviews are the third kind of report. Each concentrates on a single study and indicates whether it meets the WWC evidence standards. Often considerable time elapses between a study's appearance in a Quick Review and its appearance in an Intervention Report. For example, a January 2010 Quick Review evaluated a study comparing four math programs. The review listed the four programs (but did not link to the study itself). As of this writing (August 2011), one of the four programs was still not listed on the WWC's page of math programs.