Figure 1: Why you are (probably) a slacker (taken from Carp, 2012)
A recent article in NeuroImage suggests that we cognitive neuroscientists aren't reporting our methods in adequate detail when we submit articles for publication. Is it because we're all lazy, shiftless academic scum? Absolutely.
However, that is only part of the problem, according to the article's author, Joshua Carp. Because neuroimaging processing pipelines are so complex and multifaceted, the number of unique ways to process a given dataset rapidly approaches infinity (or at least several thousand). Of course, some ways of analyzing data are more reasonable than others; for example, you probably wouldn't want to smooth your data before doing registration, because that would be straight-up bonkers. But it can be done, and, based on the article, a substantial fraction of studies published from 2007 to 2011 do not provide adequate information for replication, leaving it an open question whether a failure to replicate a given result represents a failure of the replication study, or whether the initial result was a false positive. Furthermore, this flexibility in processing increases the likelihood of false positives and makes it less likely that another lab will be able to reproduce a given experiment (see, for example, Ioannidis's 2005 paper on the subject).
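To get a feel for why the number of pipelines balloons so quickly, here is a minimal back-of-the-envelope sketch. The step names and option counts below are hypothetical, chosen only for illustration; they are not the steps or counts tallied in Carp's paper. The point is just the arithmetic: independent choices multiply.

```python
from math import prod

# Hypothetical preprocessing steps and a plausible number of options for
# each. Every combination of choices yields a distinct pipeline, so the
# totals multiply.
steps = {
    "slice-timing correction": 2,        # on / off
    "motion-correction reference": 2,    # first volume / mean volume
    "normalization template": 3,
    "smoothing kernel (FWHM)": 4,        # e.g., 0, 4, 6, 8 mm
    "temporal filtering": 3,
    "autocorrelation model": 2,
    "global signal scaling": 2,          # on / off
    "hemodynamic basis set": 3,
}

n_pipelines = prod(steps.values())
print(n_pipelines)  # 2*2*3*4*3*2*2*3 = 1728
```

Even this toy set of eight choices already yields 1,728 distinct pipelines; a few more steps, and you are well past "several thousand," which is why an unreported default can quietly make a replication attempt incomparable to the original.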
To mitigate this problem, Carp suggests that neuroimagers adhere to the guidelines provided by Russ Poldrack in his 2008 paper, including reporting the number of subjects, the number of runs (a surprising number of studies did not report this, according to Carp), how ROI analyses were performed, how results were presented, and so forth. I agree that these are all important things to report, although I also happen to agree with Carp that some of the more technical details, such as signal scaling, are set as defaults in programs such as SPM and often aren't given another thought, whereas something like smoothing, which is more likely to be changed based on the question being addressed, is also more likely to be reported. Thus Carp's suggestion that more journals provide a supplementary materials section for all of the nitty-gritty details is, I believe, a good one.
In any case, another timely reminder (for me at least) about methods reporting and why it matters.