#SfN14 highlights: Multimodal Investigation of Large-Scale Brain Dynamics: Combining fMRI and Intracranial EEG

007. Minisymposium. Multimodal Investigation of Large-Scale Brain Dynamics: Combining fMRI and Intracranial EEG – Biyu He.
Saturday, Nov 15, 2014, 1:30-4 PM

I made it to DC almost in time to attend all the talks of the mini-symposium on the multimodal investigation of large-scale brain dynamics through functional MRI and intracranial EEG (nice job, Megabus). Altogether, I found that the talks were of a very high technical level, and a good grasp of the analysis techniques generally employed in those studies (especially in intracranial EEG) definitely helped me follow the finer points made by the speakers.

Here are a few highlights from the session. My apologies to the speakers whose talks I did not cover. Note that any inaccuracy or outright misunderstanding in what follows is my responsibility alone!

7.03. Large-scale patterns of cortical rhythmic suppression in human cerebral cortex – Christopher Honey

In his talk, Dr. Honey, from the University of Toronto, presented results on the relationship between cortical low-frequency rhythms (think theta, alpha and low beta, or between 4 and 30 Hz) and high-gamma power (HGP, approximately 70 to 180 Hz), a non-rhythmic portion of the intracranial EEG’s power spectrum that is a good proxy for local neuronal firing. The data came from intracranial electrodes implanted in patients with epilepsy who were going to have surgery in an attempt to remove the focus of their seizures. When the patients were performing an audiovisual task (watching a movie), Honey observed an inverse relationship between the power of the low-frequency alpha rhythm and that of the high-gamma band in the occipital cortex. Similarly, he found an anti-correlation between the perirolandic beta rhythm (an oscillation that is most intense when the patients are idling, as opposed to moving their hands, for instance) and HGP.

At a more global level, he noted that the frequency that was most strongly anti-correlated with HGP varied across cortical areas (mostly theta in the temporal lobe, and a mixture of theta and low beta in the prefrontal cortex). Crucially, however, that frequency was also the one that dominated the power spectrum of spontaneous oscillations. Honey found a similar relationship between spontaneous oscillations and the low frequencies whose phase most strongly modulated HGP (so-called cross-frequency phase-power coupling). He therefore put forward the idea that these low-frequency cortical rhythms may provide pulsed inhibition of local cortical activity (suppressive rhythms).

Intriguingly, Honey also observed maximal anti-correlations between the power of low-frequency rhythms and the BOLD signal (collected before the patients were implanted with electrodes), which serves as a reminder that HGP and BOLD probably reflect in part the same underlying neuronal mechanisms.
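For readers less familiar with these analyses, here is a minimal sketch of the kind of power anti-correlation described above, on synthetic data. This is purely illustrative and not Dr. Honey's actual pipeline: the band limits (8–12 Hz for alpha, 70–180 Hz for high gamma), the simulated signal, and the helper function are all my own assumptions.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 1000  # sampling rate in Hz (illustrative)
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(0)

# Synthetic "LFP": alpha amplitude waxes and wanes slowly, and the
# high-gamma amplitude is constructed to be its inverse.
alpha_amp = 1 + 0.5 * np.sin(2 * np.pi * 0.1 * t)
alpha = alpha_amp * np.sin(2 * np.pi * 10 * t)
high_gamma = (2 - alpha_amp) * np.sin(2 * np.pi * 100 * t)
lfp = alpha + 0.2 * high_gamma + 0.1 * rng.standard_normal(t.size)

def band_envelope(x, lo, hi, fs):
    """Band-pass filter, then return the Hilbert amplitude envelope."""
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    return np.abs(hilbert(sosfiltfilt(sos, x)))

alpha_power = band_envelope(lfp, 8, 12, fs) ** 2
hgp = band_envelope(lfp, 70, 180, fs) ** 2

# On these synthetic data the correlation comes out clearly negative.
r = np.corrcoef(alpha_power, hgp)[0, 1]
print(f"alpha power vs. HGP correlation: r = {r:.2f}")
```

On real intracranial recordings one would of course work electrode by electrode and epoch by epoch, and assess significance against surrogate data rather than eyeballing a single correlation coefficient.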

Note that an open question is why different cortical areas oscillate at different baseline frequencies. I wonder whether this reflects some basic periodic or rhythmic property of, e.g., visual inputs to the occipital cortex or motor outputs from the motor cortex…

7.04. Cognitive electrophysiology of the human medial parietal cortex: Local and network dynamics – B. Foster

Dr. Foster, from Stanford University, focused on the parietal cortex, parts of which have often been observed to be more active when subjects were apparently “not doing much” in the functional MRI scanner (hence they were dubbed part of the so-called default mode network, DMN). The medial parietal cortex, in particular the posterior cingulate cortex (PCC) and the retrosplenial cortex (RSC), is also known to activate during the retrieval of autobiographical memories. Dr. Foster and colleagues therefore designed an intracranial EEG task to disentangle the engagement of those cortical areas during autobiographical retrieval as opposed to more general, non-self-centered memory retrieval or arithmetic. He found that the PCC and RSC were indeed selectively activated by self-episodic and self-semantic retrieval tasks. Interestingly, the angular gyrus also showed activation, and the two areas displayed correlated activity across single trials, supporting the idea that they form a coherent functional network during those tasks.

7.05. Cross-frequency coupling in the cortical columnar microcircuit – A. Maier

Dr. Maier, from Vanderbilt University (Nashville, TN), explored cross-frequency phase-power coupling using intracortical recordings across the layers of the cerebral cortex in monkeys (so-called laminar recordings). He found that HGP in the supra- and infragranular layers of the cerebral cortex was coupled to the phase of alpha oscillations, whereas HGP in the granular layer (which receives feed-forward input from the thalamus) displayed no such relationship. Dr. Maier used current source density (CSD) analysis to determine that the infragranular layers were likely responsible for the drops in firing rate observed throughout the thickness of the cortical column. He therefore hypothesized that layer 6 neurons might transiently shut down the propagation of feed-forward activity through the cortical column microcircuit, allowing feed-back inputs (which mostly reach the supra- and infragranular layers) to exert their influence instead.
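For the curious, one common way to quantify cross-frequency phase-power coupling is the mean-vector-length modulation index of Canolty and colleagues; the sketch below applies it to synthetic data where high-gamma bursts are locked to one phase of the alpha cycle. This is not necessarily the metric used in the talk, and all signal parameters, band limits and helper names here are illustrative assumptions of mine.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs = 1000  # sampling rate in Hz (illustrative)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(1)

# Synthetic "LFP": high-gamma amplitude is modulated by the alpha cycle.
alpha_cycle = 2 * np.pi * 10 * t
gamma_amp = 0.5 * (1 - np.cos(alpha_cycle))  # bursts at one alpha phase
lfp = (np.sin(alpha_cycle)
       + 0.3 * gamma_amp * np.sin(2 * np.pi * 100 * t)
       + 0.1 * rng.standard_normal(t.size))

def bandpass(x, lo, hi, fs):
    sos = butter(4, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

phase = np.angle(hilbert(bandpass(lfp, 8, 12, fs)))   # alpha phase
amp = np.abs(hilbert(bandpass(lfp, 70, 180, fs)))     # high-gamma envelope

# Mean vector length: large when the envelope peaks at a consistent phase.
mvl = np.abs(np.mean(amp * np.exp(1j * phase)))
print(f"phase-power coupling (mean vector length): {mvl:.3f}")
```

In practice this raw index is compared against a distribution of surrogate values (e.g., from phase-shuffled data), since its magnitude depends on the overall envelope amplitude.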

7.07. Multimodal imaging of spatio-temporal dynamics in language processing – T. Thesen

Dr. Thesen, from New York University, studies the processing of both written and spoken language using a combination of functional MRI, magnetoencephalography (MEG, a technique that records signals very similar to EEG) and intracranial EEG in epilepsy patients. The first experiment he presented was concerned with written language. Using strings of pseudo-letters, pseudo-words made up entirely of consonants, and actual words, he showed a spatial gradient in the selectivity of responses, ranging from letter-specific (real letters vs. pseudo-letters) in more posterior parts of the occipito-temporal cortex to word-specific (real words vs. pseudo-words) in more anterior parts of that visual processing stream. Interestingly, Dr. Thesen’s use of electrophysiological techniques also revealed a temporal gradient: letter-specific responses started earlier after stimulus onset than word-specific responses.

In a second study, Dr. Thesen took advantage of the well-characterized McGurk effect (an audiovisual illusion caused by mismatched auditory and visual speech syllables – check out this YouTube video for an example!). When contrasting the brain’s response to audiovisual syllables with the combined responses to audio-only and video-only stimuli, he found that multisensory effects appeared very early in the auditory cortex (40 ms after sound onset), then a bit later in the left superior temporal sulcus (80 ms), and later still in the left inferior frontal gyrus (120 ms). The very early multisensory effects in the auditory cortex point to a direct, feed-forward influence of visual stimuli, via the visual cortex, on early auditory areas. Interestingly, Dr. Thesen’s data even point towards a role for pre-stimulus activity in the visual cortex in the multisensory effects observed in the auditory cortex.

Altogether, a very interesting symposium, with technically sophisticated talks that presented mostly unpublished results. The attendance looked great, with several people sitting on the floor behind the rows of chairs!

Dr. Thesen’s results in particular were very interesting to me, as my own research tackles very similar subjects using similar methods. Check out my poster, Tuesday afternoon, where I’ll be happy to tell you more about this fascinating subject! 623.06/DD4. Phase tracking of visual speech in the human auditory cortex revealed by intracranial EEG.

