Updated Functional Connectivity Tutorial using AFNI

As part of a new course in neuroimaging methods at Haskins Laboratories, I've begun updating videos on topics such as functional connectivity, context-dependent correlations, and how to accept bribes as a reviewer. The first topic we've covered is resting-state functional connectivity, a sophisticated-sounding name designed to make the subject think he is doing something of immense scientific importance by lying still and doing nothing, when in reality it's to distract him while we figure out how the hell to hook up the experimental laptop.

Aside from its usefulness as a stall tactic, resting-state connectivity can also reveal resting-state networks, or correlations between the signals of distant regions of the brain. This provides clues to how structural connectivity - i.e., white matter connections - interacts with the BOLD signal, as well as whether differences in resting-state connectivity are a marker for disorders such as Alzheimer's or schizophrenia.

The following video takes you step-by-step through a functional connectivity analysis, using an online dataset from openfmri.org. One major change from my previous tutorials is condensing all the information into one long video and providing time markers for each segment in the "Show More" box. This way the viewer can jump to the information they need without having to keep track of several different videos covering different steps. I hope it's an improvement, and I would welcome any feedback.


I've also posted the lecture on resting-state analysis given at Haskins Laboratories on November 3rd. You won't learn much here that isn't in the video above, but it does go into more detail. For most of the lecture you can only see the top of my head bobbing around, but that's OK. Eyes on the slides, not the hair.


Exercises:


  1. Set the errts dataset as the underlay, and select "Graph". From the "Opt" menu, select "Write Center." Rename the output 1D file, and use 1dplot to see the timecourse. This can be used as a seed for another connectivity analysis (a command-line sketch of this step appears after the list).
  2. Other resting state networks include the somatosensory network, the visual network, and the language network. Research one of these networks, determine where the hubs are, and run a resting state analysis on a seed placed in that hub.
  3. Run correlations for a group of subjects, convert the correlation values to z-scores, and do a second-level t-test using uber_ttest.py (a command-line version of these steps is sketched below as well).
  4. Modify the afni_proc.py script to apply 3dRSFC to your data (see Example 10b in afni_proc.py -help).
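
For exercise 1, the command-line half of that step might look something like the following minimal sketch. It assumes you renamed the "Write Center" output to seed.1D and that the residual dataset from 3dTproject is errts.${subj}.tproject in Talairach space - both names are placeholders, so substitute your own:

# Plot the seed timecourse written out from the graph window
1dplot seed.1D

# Correlate the seed timecourse with every voxel of the residual dataset;
# the output is a whole-brain map of Pearson r values
3dTcorr1D -prefix corr_seed.${subj} errts.${subj}.tproject+tlrc seed.1D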
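
For exercise 3, uber_ttest.py is a graphical wrapper, but the same two steps can be sketched on the command line - 3dcalc for the Fisher r-to-z transform and 3dttest++ for the group test. The subject IDs and dataset names below are placeholders, and the loop uses tcsh syntax, as in the proc scripts:

# Convert each subject's correlation map from r to z (Fisher transform)
foreach subj (subj01 subj02 subj03)
    3dcalc -a corr_seed.${subj}+tlrc -expr 'atanh(a)' \
           -prefix zcorr_seed.${subj}
end

# One-sample t-test across subjects on the z-transformed maps
3dttest++ -prefix ttest_seed -setA zcorr_seed.*+tlrc.HEAD

The resulting t-map can then be thresholded and clusterized in the AFNI viewer like any other group-level result.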


Blistering Fast Functional Connectivity Analysis with 3dTproject

Because speed is associated with freedom, happiness, and the American way - we want results, we want visible results, and we want them now, darn it! - it is no surprise that neuroimaging analysis is becoming exponentially faster and more efficient. Gone are the days when you could run a slice-timing step, go to bed, and wake up eight hours later when it finally finished. Most likely it crashed, meaning you had to run the entire thing all over again, but that didn't matter - the point was that you at least felt as though you were doing something during that time.

Regardless, we are approaching a radically different era now, and one of the harbingers of that era is AFNI's 3dTproject. Released a few months ago, this tool is now the default in both uber_subject.py and afni_proc.py when you run a resting-state analysis. It's quicker, more efficient, and leaves fewer chances to mess things up, which is a good thing.

To include 3dTproject in your analysis pipeline, simply select "rest" on the analysis initialization screen when running uber_subject.py, or copy example #9 from the help documentation of afni_proc.py. Under no circumstances should you try running 3dTproject manually, since you, possessing the hand-eye coordination of a cheese burrito, will inevitably make a typo and screw something up. Nevertheless, if you insist on doing it yourself, I recommend using the following code, which is automatically generated by the .py scripts:

# Regress out the nuisance terms in X.nocensor.xmat.1D and zero out censored
# time points, leaving the residuals (errts) for connectivity analysis
3dTproject -polort 0 -input pb04.$subj.r*.blur+tlrc.HEAD \
           -censor motion_${subj}_censor.1D -cenmode ZERO \
           -ort X.nocensor.xmat.1D -prefix errts.${subj}.tproject

Needless to say, you will need to have already run your preprocessing steps, as well as 3dDeconvolve with the -x1D_stop option, in order to generate the necessary design matrices.
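
For reference, here is a rough sketch of the kind of 3dDeconvolve call the .py scripts generate for a resting-state analysis. The input runs, motion regressor file, and its label are placeholders taken from a typical afni_proc.py script, so consult the proc script written for your own data for the exact command:

# Build the nuisance regression matrix (demeaned motion + polynomial drift)
# and stop before the actual regression; -x1D_stop writes the matrices only
3dDeconvolve -input pb04.$subj.r*.blur+tlrc.HEAD \
    -censor motion_${subj}_censor.1D \
    -polort A \
    -num_stimts 0 \
    -ortvec motion_demean.1D mot_demean \
    -x1D X.xmat.1D \
    -x1D_uncensored X.nocensor.xmat.1D \
    -x1D_stop

The X.nocensor.xmat.1D file produced here is the matrix passed to -ort in the 3dTproject command above.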