A major study supported by NCLB has been evaluating the impact of math software in classrooms. I seem to be on a mailing list that sends me project summaries (I specifically asked about this research last year). The first-year data showed little benefit. Data from the second year of the study are now available (pdf of executive summary).
A couple of issues were evaluated: did experience with the software matter, and did the impact on achievement vary from product to product?
For sixth grade math, product effects on student test scores were statistically significantly lower (more negative) in the second year than in the first year, and for algebra I, effects on student test scores were statistically significantly higher in the second year than in the first year.
Regarding individual products:
One product had a positive and statistically significant effect. Nine did not have statistically significant effects on test scores.
In 2007, I attended a detailed analysis of the first round of this research at AERA. I anticipate there will be a comparable report this year.