Quack Solution vs. Quack Solution
According to Bad Science:

Meanwhile, for over five years now, newspapers and television stations have tried to persuade us, with “science”, that fish-oil pills have been proven to improve children’s school performance, IQ, behaviour, attention, and more. As I have documented with almost farcical repetitiveness in this paper, these so-called “fish-oil trials” were so badly designed that they amounted to little more than a sham. In the case of the biggest, “the Durham trial”, the county council has refused even to release the results, which I have every reason to believe were unflattering.

But I wouldn’t start with molecules, or pills, as a solution to these kinds of problems. The capsules Durham are promoting cost 80p per child per day, while it spends only 65p per child per day on school meals, so you might start there. Or you might restrict junk-food advertising to children, as the government has recently done. You might look at education and awareness about food and diet, as Jamie Oliver recently did very well, without recourse to dodgy pseudoscience or miracle pills.

Translation: Don't use quack solutions from capitalists! Only use quack solutions from governments!
The blog post did include a reference to a scientific paper:

In 2007 the British Medical Journal published a large, well-conducted, randomised controlled trial, performed at lots of different locations, run by publicly funded scientists, that delivered a strikingly positive result: it showed that one treatment could significantly improve children’s antisocial behaviour. The treatment was entirely safe, and the study was even accompanied by a very compelling cost-effectiveness analysis.

On the other hand, the actual paper said:
Participants 153 parents from socially disadvantaged areas, with children aged 36-59 months at risk of conduct disorder defined by scoring over the clinical cut off on the Eyberg child behaviour inventory. Participants were randomised on a 2:1 basis, 104 to intervention and 49 to remaining on the waiting list (control). Twenty (13%) were lost to follow-up six months later, 18 from the intervention group.

That looks a bit lame. I'm not sure a sample of 153 is enough for a study that seemed to indicate we should hire more psychologists, that is, a study that might confirm its authors' biases. More to the point, do we have any reason to believe those "lost" to follow-up dropped out at random?
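The attrition worry can be made concrete with some quick arithmetic. Taking the figures from the quoted abstract, and assuming the remaining 2 of the 20 dropouts came from the control arm (the abstract only says 18 were from the intervention group), a rough two-proportion z-test sketches how unequal the dropout rates are:

```python
from math import sqrt, erfc

# Figures from the quoted abstract; the 2 control dropouts are an
# inference (20 total lost minus 18 from the intervention group).
n_int, n_ctl = 104, 49
lost_int, lost_ctl = 18, 20 - 18

p_int = lost_int / n_int  # ~17.3% attrition in the intervention arm
p_ctl = lost_ctl / n_ctl  # ~4.1% attrition in the control arm

# Two-proportion z-test for differential attrition (pooled variance).
p_pool = (lost_int + lost_ctl) / (n_int + n_ctl)
se = sqrt(p_pool * (1 - p_pool) * (1 / n_int + 1 / n_ctl))
z = (p_int - p_ctl) / se
p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal tail

print(f"attrition: {p_int:.1%} vs {p_ctl:.1%}, z = {z:.2f}, p = {p_value:.3f}")
```

On these assumptions z comes out around 2.3 (p around 0.02), so the dropout imbalance is larger than chance alone would comfortably explain. With only 2 events in one cell the normal approximation is shaky and an exact test would be more appropriate, but the direction of the concern stands: the people who left were disproportionately the ones receiving the treatment.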