Introduction to SPM Marsbar

Marsbar is an extraction tool designed to output beta estimates or contrast estimates from a region of interest (ROI) - a cluster of voxels defined either anatomically or through an independent contrast. I covered this in an earlier post, but thought that it would lend itself better to a bright, vibrant, visual tutorial than to the musty arrow charts.

How to define ROIs from coordinates

How to define ROIs from other contrasts

SPM Jobman


Now that we have created our own .mat files from the SPM GUI and seen how it can be written to the disk, altered, and reloaded back into SPM, the hour is at hand for using the command spm_jobman. This is a command for those eager to disenthrall themselves from the tyranny of graphical interfaces through batching SPM processes from the command line.

I first met spm_jobman - also known as Tim - a few weeks ago at a conference, when I was at the nadir of my sorrows, despairing over whether I would ever be able to run SPM commands without the GUI. Suddenly, like a judge divinely sent in answer to the lamentations of the oppressed, spm_jobman appeared by my side, trig and smartly dressed, and said he would be more than happy to help out; and from my first impression of his bearing and demeanor, I believed I was in the presence of an able and reliable ally. Anyone who has ever met spm_jobman, I believe, has felt the same thing. However, as I learned too late, far from being a delight, he is a charmless psychopath; and he continues to infect my dreams with nameless horrors and the unrelenting screams of the abattoir.

spm_jobman has three main options to choose from: interactive, serial, and run. After choosing one of these options, for the second argument you enter your jobs structure, which is automatically populated after loading the .mat file from the command line. Interactive loads the traditional GUI with the options filled in from the jobs structure, which you can then modify and execute as you please; serial prompts the user to fill in each field, with the defaults set to the values in the jobs structure; and run executes the jobs structure without cueing the GUI. For most purposes, if you decide to use spm_jobman at all, you will want the run option, as it allows you to loop processes over subjects without pause, freeing you up for more useful tasks, such as Googling the history of the lint roller.

Saving .mat files from SPM is immensely helpful in understanding the relationship between the .mat files created by SPM, and what exactly goes into them; and this will in turn reinforce your understanding of and ability to manipulate Matlab structures. The following tutorials show how the .mat file is generated from the SPM interface, which can then be used as a template for spm_jobman. I've been working with SPM for years now, but found out about this only recently; and I hope that it helps ease the burden of your SPM endeavors.
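As a concrete sketch of the run option, suppose you have saved a batch from the GUI as batch_job.mat (a hypothetical filename) containing the usual matlabbatch variable; the subject directory layout and the commented-out field path below are assumptions for illustration, not part of any particular analysis:

```matlab
% Minimal sketch of batching with spm_jobman. 'batch_job.mat' and the
% subject folder names are assumptions; adapt them to your own study.
spm('defaults', 'fmri');   % load SPM's default settings
spm_jobman('initcfg');     % initialize the job configuration system

subjects = {'sub01', 'sub02', 'sub03'};
for i = 1:numel(subjects)
    load('batch_job.mat');  % loads the matlabbatch cell array saved from the GUI
    % Point the job at this subject's data before running it; the exact
    % field path depends on which module you saved, e.g. for realignment:
    % matlabbatch{1}.spm.spatial.realign.estwrite.data{1} = ...
    %     cellstr(spm_select('FPList', fullfile(subjects{i}, 'func'), '^r.*\.nii$'));
    spm_jobman('run', matlabbatch);  % execute without cueing the GUI
end
```

The point of the template .mat file is that you only fill in the GUI once; everything that varies across subjects is then overwritten inside the loop.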



Slice Timing Correction in SPM

I have posted a couple new videos about slice-timing correction in SPM: One from the GUI, and the other from the command line. The command line usage is more interesting and informative, especially if you aim to batch your preprocessing without using the graphical user interface; and this will be the goal of this series of tutorials.

And just imagine - no more mindless pointing and clicking. No more sore wrists and carpal tunnel syndrome from those long nights of copying and pasting onset times, one after the other, subject after subject, until your mind becomes so warped that you accidentally end up copying and pasting a particularly spicy, shockingly personal, yet oddly poetic missive sent to an ex-girlfriend after quaffing one too many Smirnoff Ices, which then ends up estimating a general linear model of your pathetic and utter wretchedness. Obviously, this analysis will go into the supplementary materials.

To avoid this, slice timing can instead be called by the program spm_slice_timing, which requires the following arguments:

P - A list of files to slice time correct (can select these using spm_select)
sliceOrder - Slice acquisition order
refslice - Reference slice for time zero
timing - a two-element vector: 1) time between slices; and 2) time between the last slice and the next volume

sliceOrder can be assigned with a Matlab concatenation command. For example, if the slices were acquired in an interleaved order starting with slice 1, and there were 35 slices total, the slice order could be written like this:

sliceOrder = [1:2:35 2:2:35];

This returns the list of odd-numbered slices from 1 to 35 in steps of 2, concatenated with the even-numbered slices from 2 to 34 in steps of 2.

The timing variable is easy to fill in once you have both the TR and the TA variables. TR is your repetition time - for example, 2 seconds between volumes. TA is defined as TR-(TR/(num. of slices)), which in this case would be 2-(2/35) ~ 1.94. This is the time at which the last slice was acquired; and, since the first slice was acquired at time 0, the time between each slice can be calculated as TA/(nSlices-1), e.g. 1.94/34 ~ 0.057 (not significant, but trending towards it). Likewise, the value for the second field can be calculated as TR-TA, which also equals about 0.057. If the variables TR and TA have already been assigned values, then the fields of the timing variable can be filled in:

timing(1) = TA/(nSlices-1);
timing(2) = TR-TA;

With this in hand, spm_slice_timing can be filled in as follows:

spm_slice_timing(spm_select('FPList', pwd, '^r01.nii'), [1:2:35 2:2:35], 1, timing)
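Assembling the pieces above into one self-contained sketch (the TR, slice count, and file pattern are just the running example's values - adjust them for your own acquisition):

```matlab
% Slice-timing correction sketch for a 35-slice interleaved acquisition.
% The file pattern '^r01.nii' is carried over from the example above.
TR      = 2;                      % repetition time in seconds
nSlices = 35;
TA      = TR - (TR / nSlices);    % time of last slice, ~1.94 s

sliceOrder = [1:2:nSlices 2:2:nSlices];  % interleaved, odd slices first
refslice   = 1;                          % reference slice for time zero

timing(1) = TA / (nSlices - 1);   % time between slices, ~0.057 s
timing(2) = TR - TA;              % time between last slice and next volume

P = spm_select('FPList', pwd, '^r01.nii');  % full paths to the images
spm_slice_timing(P, sliceOrder, refslice, timing);
```

Note the use of 'FPList' rather than 'List', so that spm_select returns full paths that spm_slice_timing can open from any working directory.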

Both versions of slice timing correction can be found in the following tutorials:



The GUI version of slice timing correction. This is for little kids and grandmas.


Now we're talking; this is the real sh*t. Henceforth shall you be set on the path towards nerd glory, and your exploits shall be recorded in the blank verse epic, Childe Roland to the Nerd Tower Came.

Brown Sugar Glazed Salmon

In addition to analyzing FMRI data, I also do other things, such as eating. Recently I came across a wonderful recipe for glazing salmon with a mixture of honey, butter, brown sugar, and Dijon mustard, which turned out to be quick, easy, and delicious. For sides add some mashed potatoes and asparagus, and you have a meal substantial enough to share with guests, or to make leftovers to last for a couple days. The finished product looks like this:



No, wait; wrong picture! It should look something more like this:


Pairs well with pinot grigio, or a forty-ounce of your favorite malt liquor. To enhance the experience, eat while reading a salmon-related paper regarding false positives in neuroimaging data.

The recipe; the paper.

Checking Image Registration

Visually checking your image registration - in other words, the overlap between images aligned in a common space - is one of the staples of any FMRI analysis pipeline. Sadly, although everyone says that you should do it, not many people go to the trouble of visually inspecting image overlap; even though, in my opinion, checking your image registration is one of the sexiest things you can do. "Sexier than brushing your teeth at least once a week?" Not quite, but we gettin' there!


Example of faulty image registration. Each set of orthogonal views represents a single subject's mask after first-level analysis. The set of orthogonal views highlighted with the yellow rectangle has suffered - terribly - from an error in image registration during preprocessing, which should be inspected further before doing a higher-level analysis.
In order to increase your attractiveness, I've written up a brief script - possibly my masterpiece - which will allow you to easily check the registration of a specified image. For example, you may want to check the masks for a group of subjects to make sure that they overlap, as a single mask which is far different from the others will lead to a truncated group mask. While not necessarily unsexy, missteps like these will only lead to average attractiveness. In other words, there's not anything really wrong with you, and there might be plenty of people who would settle for you, but...meh.
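For a minimal sketch of the same idea, SPM's own spm_check_registration will display several images side by side in orthogonal views, which makes an outlier mask jump out immediately (the subject folder names and mask filename here are assumptions about your directory layout):

```matlab
% Sketch: display each subject's first-level mask in orthogonal views so
% that a misregistered subject stands out. Paths below are assumptions.
subjects = {'sub01', 'sub02', 'sub03', 'sub04'};
masks = cell(numel(subjects), 1);
for i = 1:numel(subjects)
    masks{i} = fullfile(subjects{i}, 'model', 'mask.img');
end
spm_check_registration(char(masks));  % one set of orthogonal views per image
```

Clicking around the orthogonal views moves the crosshairs in every image at once, so any mask that fails to track the others is easy to spot.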


More details, including a demonstration of the script in action, can be seen in the following video.


John's Gems #5: Mesh Plots of T-Maps

For those of us who wish to act like we know more than we actually do, John Ashburner has a webpage of useful code snippets that allow you to do cool - maybe even useful - operations on your data. By using these gems, others may start to think that you are also cool - and maybe even useful.

Here I merely replicate John's Gem #5: Mesh Plots of T-maps. (Because coming up with something original on my own would be, like, you know, difficult.) While I cannot immediately think of a practical utility for this function, it does provide an interesting visualization of t-statistics as viewed through a transverse slice of the brain, as shown in the following figures:

Left: Transverse slice of a T-map seen from above. Right: Same image viewed at an angle. Note that the "hotter" spots are raised above the yellow plane threshold representing a t-score of 0, while "colder" spots deflect below the plane.
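The essence of the gem can be sketched in a few lines: map the t-image, pull out one transverse slice, and hand it to surf (the filename and slice number below are assumptions for illustration):

```matlab
% Sketch of John's Gem #5: render one transverse slice of a t-map as a
% mesh. 'spmT_0001.nii' and the slice number are assumed example values.
V     = spm_vol('spmT_0001.nii');            % map the t-image header
slice = 30;                                  % transverse slice to display
M     = spm_matrix([0 0 slice]);             % sampling matrix for that slice
t     = spm_slice_vol(V, M, V.dim(1:2), 0);  % extract the slice from the volume
t(~isfinite(t)) = 0;                         % zero out NaNs outside the brain

figure;
surf(t');                                    % hot spots rise above the t = 0 plane
shading interp; colormap hot;
```

Dragging the resulting figure with the rotate tool reproduces the two viewpoints shown above: flat from directly overhead, peaks and troughs from an angle.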

I recommend checking out more of the code snippets, as they provide insight into how professional neuroimaging programmers approach problems. Immersing yourself in their thoughts will have a salutary effect on your own, cleansing your muddled mind and training you to become focused, forceful, and direct. At least, that's what I think might happen.


SPM Tutorial 1: The Basics

As the majority of neuroimagers use SPM for data analysis, I have begun another set of introductory tutorials geared toward the beginning SPM user. SPM is what is used in my lab (despite my plugging AFNI on a regular basis), and while I believe it has its shortcomings - just like any other data analysis package - it has several benefits as well.

Most important, it is run through the Matlab interface. While this may be seen as a hindrance - especially since Matlab is commercial software, thereby making the price of Matlab the price of SPM - I believe that several advantages derive from using Matlab. First, Matlab is an important piece of software that not only serves as the workhorse for SPM, but also allows more complex and sophisticated data analysis, whether that data has been processed in AFNI, FSL, or any other package. Second, while SPM can be used solely through the graphical interface for most purposes, Matlab allows the user to run automated processes from the command line; and a better grasp of Matlab syntax will make one a better programmer, in addition to sharpening one's intuition for the link between what is output to the Matlab terminal and what is being processed within SPM. Last, Matlab's use of matrices provides a clearer link between the raw FMRI data and the operations performed on that data. While Unix can simulate matrices through complex array computations (at least, I think - I've never tried it), the matrices output into Matlab are easier to comprehend and manipulate.

Because of this last point, I believe that SPM has a distinct advantage over the other packages. However, its benefits will disclose themselves only to the diligent, inquiring user who desires to harness the power of Matlab to augment and enhance their data analysis, rather than merely leaving it as a background process to be taken care of by the facade of SPM's graphical user interface. The only drawback is that, for the neuroimager who has been using a bash shell his entire life, learning a new programming environment can be daunting, irritating, and, in some cases, life-threatening. However, there is no need to fear - for those with extensive programming experience, even within one programming language, there are several crossovers into Matlab; and even for programming novitiates, I believe that Matlab can provide a safe, warm, womb-like environment to guide you over the initial hurdles of programming.




By beginning another series on a third software package, one may well ask whether there will ever be any sort of order imposed on this scattering of walkthroughs and tutorials. I admit that I make them more or less as they come to me, often in thrall of a laudanum-steeped vision; and that it does, in fact, feel as though I am merely binding together several of my most unruly children under one roof. Over the next few months I intend to create a stricter hierarchy for which tutorials should follow which, and to create more ordered playlists that users can click through; but for now, it is an information dump. Partly to help others, yet more often to help me, as I remember material much better if I teach it. But are not the most rewarding acts those which satisfy the needs of all involved?

Group-Level Smoothness Estimation in SPM


Fathers and teachers, I ponder: What is hell? I maintain that it is the suffering of being unable to love. Oh, and it also might be the fact that you can't get an accurate smoothness estimate on second-level results. I mean really, what's up with that?
-Father Zossima

Last week I received an email from a fellow neuroimager asking about smoothness estimates for second-level results. As I discussed in a previous post, this is an important question when calculating cluster correction thresholds, as the smoothing kernel applied to FMRI data is not the same smoothness that should go into cluster correction estimates; and failing to account for this will lead to shame and ridicule. (The exception to this would be using AFNI's 3dBlurInMask, but I assume that most people use either AFNI's 3dmerge or SPM's spm_smooth function.)

To highlight this distinction, imagine a cake - the cake itself is the smoothness on the images that come out of the scanner. Now, pipe some frosting onto the cake. That frosting is the extra smoothing applied on top of those images. The baker piping the frosting is the analysis package you are using, and you can either tell him to add more frosting or less. Sometimes he will malfunction and give you an error message, and you don't understand why he is not working, but you cannot fire him, as you have no other alternative. In any case, when someone asks you how many calories are in the cake, you do not lie and tell them only the calories that are in the frosting; you must account for the underlying cake as well. And then there are times where you will dip your finger into the frosting, just to get a little taste, but your pudgy fingers leave a noticeable divot; and you attempt to fill in the gap with frosting from the edges, but it only ends up looking worse.

In sum, if you are using SPM to estimate smoothness for second-level contrasts or ANOVAs, do not simply use spm_est_smoothness on the ResMS.hdr images output from that analysis. I have tried and I have failed to get any meaningful results, and I have also been unable to find an answer on the message boards or listservs. Therefore, instead of focusing on the second-level analyses, I recommend averaging the smoothness estimates across each subject, and using those averages for your cluster correction estimates. This can be done by looping over each subject, applying spm_est_smoothness to each individual's ResMS.hdr file, and storing the results in a matrix, after which the results are averaged in the x-, y-, and z-directions. The following .m file allows you to do this.
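The loop described above can be sketched as follows, following the post's approach of feeding each subject's ResMS image to spm_est_smoothness; the directory layout and filenames are assumptions, and spm_est_smoothness also expects the analysis mask alongside the residual image:

```matlab
% Sketch of the averaging approach: estimate smoothness from each
% subject's first-level residuals, then average the FWHM estimates.
% Folder names and file extensions below are assumptions.
subjects = {'sub01', 'sub02', 'sub03'};
fwhm = zeros(numel(subjects), 3);
for i = 1:numel(subjects)
    resms = fullfile(subjects{i}, 'model', 'ResMS.hdr');
    mask  = fullfile(subjects{i}, 'model', 'mask.hdr');
    fwhm(i, :) = spm_est_smoothness(resms, mask);  % FWHM in x, y, z (voxels)
end
meanFWHM = mean(fwhm, 1);  % plug these values into your cluster correction
```

The resulting x-, y-, and z-averages are what go into the cluster correction estimate, in place of anything derived from the second-level ResMS image.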

I confess that I know little else about smoothing; this was the only reasonable approach that I could find, and I have little intuition about why spm_est_smoothness gives such wonky results on second-level residuals datasets. It may be that these are to remain mysteries, too profound for our common clay.

Thanks to Jillian Hardee, who never smoothed a dataset she didn't like.


Updates

Lately on this blog there has been a slowdown in productivity; I apologize. Three-fifths is due to negligence, neglect, and the active avoidance of posting, while the other two-fifths has been sheer laziness. As with everyone, there seems to come a time when the demands of work, extracurriculars, and catching up on Starcraft 2 replays all intersect; the goal is to take care of those duties and then resume the previous levels of productivity, while taking care not to let the current pace become the new norm.

However, I want to be upfront and say that it is accompanying which has demanded most of my attention these past few weeks. At first I thought I would only be assisting one student this semester; but apparently word got out that I would play almost anything, and at a reasonable price (if I like the piece well enough, usually for free), and I now count three names in my daily planner.

Not that I complain; I consider accompanying to be one of the most pleasant, agreeable, and ennobling activities that I have the privilege to share with others. Notwithstanding the scheduling conflicts and sprints across campus to make it from a scanning session to an afternoon lesson, it can be great fun; and frequently have I felt both the prolonged rush of adrenaline after a session of Shostakovich and the slow, satisfying burning in my bones long after playing through one of Haydn's sublime adagios; more than once has there been a hectic day at work where I manage to slip out for a quick connection with Mendelssohn, and for the rest of the afternoon I walk on clouds.

What follows are a few of the pieces I am currently working on, along with some annotations for the curious pianist:

Grieg: Cello Sonata in A Minor, op. 36

When I got the call asking for an accompanist for the Grieg sonata, at first I was reluctant; the piano part seemed difficult to put together, and I wasn't sure I would be able to get it up to the tempo written in the score. However, I should have known better - pianists who write the accompanying scores know how to make you sound good without too much difficulty. After a few sight-readings and some encouragement from my full-blooded Norwegian grandparents, I knew that I had to play it.

Of note are the last couple dozen bars of the first movement, where you are required to slam out elevenths while playing prestissimo, immediately followed by arpeggios flying through all the registers, which makes you feel like a total boss. Coupled with a quotation from the opening bars of Grieg's famous piano concerto, the badassness of this piece is therefore raised to 100,000. Highly recommended.

Hair music


Strauss: Cello Sonata in F, op. 6

Whereas Grieg was a professional pianist, Strauss was not; and although I love this piece very much, and although I regard Strauss as a musical genius, the piano part is irritatingly tricky, complex, and in some places downright treacherous. Consider the fugue in the first movement during the development; I have had this piece in my repertoire for almost two years now, and I still have no idea what the hell kind of fingering to use. Due to passages like these my score contains several reductions, ad hoc ritardandi, and entire clusters of notes crossed out and obliterated; yet it still remains a difficult piece to pull off. And then there are the other two movements, which I don't even want to think about.

Regarding the pace of the first movement, I have listened to several recordings where for some reason during certain sections the tempo goes off the rails. I recommend sticking close to the original 168 beats per minute (as in the following Hamelin and Rolland recording), as the overture introduction sounds grander and more dramatic, and the militaristic second theme sounds more solid and self-assured, as opposed to frantic. Also keep in mind that during the coda you need to have a lot left in the tank to power through the stretto into the piu mosso, and still get faster after that. Don't be a victim.



Schumann: Adagio and Allegro for Cello and Piano, op. 70

Schumann is a rare bird, a bit gamy and not to everyone's taste. For me, at least, it took several years to warm up to his enigmatic style, and there are still times where I find his music a little too lush. However, the more I listen to his compositions - and especially his later work - the more intrigued and enthralled I become.

The Adagio and Allegro - originally written for horn and piano, and originally titled Romance and Allegro - is a charming piece, juxtaposing a dreamy, ethereal Adagio against the victorious, conquering Allegro. Florestan and Eusebius, the imaginary and opposing characters of Schumann's personality, are at play here, especially within the contrasting subsections of the Allegro; both performers would do well to try and elicit both characters.


Mendelssohn: Lied ohne Worte, op. 109

Every once in a while an instrumentalist will take mercy on you and select a piece that is well within your ability. As with Mendelssohn's famous Songs without Words for piano solo, the Lied ohne Worte for cello and piano is a brief character piece containing several of the elements of the Songs: a delicate melodic line floating over a simple accompaniment, but concealing deeper emotions which erupt to the surface during the agitato middle section. Very fun to play, and an excellent introduction to chamber music for the aspiring pianist.


Any damn way, that's what I've been up to in my free time. I have some SPM videos coming down the pike, as well as an overview of the AFNI uber_subject scripts, so stay tuned.

SSH Tunneling: Shells and VNC Displays

Your graduate work may be insufferably boring, but at least you can prolong the agony from the comfort of your own home.

SSH (Secure SHell) tunneling allows you to connect to a host computer (such as a server) and execute commands remotely from your local computer. In every lab I've worked in, each lab computer connects to a server that everyone can work on; however, you can do this from any location with an SSH terminal. Simply open up a command terminal - the Mac's built-in Terminal, or a Unix emulator such as Cygwin on Windows - and type in the following command:

ssh username@remoteHost

Here username is your username, and remoteHost is the name or IP address of a remote computer (e.g., a server). Enter your password when prompted, and you will be able to navigate around and execute commands on the remote computer.

Should you desire a VNC (Virtual Network Computing) screen - a program which shows you the same desktop and windows as if you were logged in to the remote computer itself - download a VNC client, either RealVNC for Windows or Chicken of the VNC (I know, right?) for Mac. Then, execute the same command as above, but with the following:

ssh username@remoteHost -L 5915:localhost:5909

In this example, port 5915 on the local machine is forwarded to port 5909 on the remote machine. (By convention, VNC display N listens on TCP port 5900+N, so port 5909 corresponds to display 9 on the server; the SSH connection itself still uses the standard TCP port 22.) Once this connection is established, open up your VNC client, set "localhost" as the host you wish to connect to, and connect to display 15, which corresponds to the local port 5915 you opened:


Then hit Connect, enter your password, and you will have a display open on your server. That's all there is to it! Now, instead of using your domestic hours for rest, solitude, and reflection, you can keep toiling endlessly, your eyes burning and your head splitting from overwork, the cartilage of your phalanges chafed and torn from overuse, as you rush to meet deadlines both real and self-imposed. Your struggles are futile and your heart is faint; to the passing observer your efforts are met with the same mixture of horror and morbid curiosity you would feel watching a lunatic bash his head against the padded walls of his cell. Enjoy.