01 LanguageLogic.

In studying cortical dynamics of language function and dysfunction, our approach has been to investigate in parallel non-impaired subjects and subjects who have a deficit in a specific aspect of language processing which is essentially functional, with no clear-cut structural correlates. So, for reading, we compare fluent readers and dyslexic subjects, and for speech production fluent speakers and developmental stutterers. Comparison of cortical activation sequences between these subject groups may clearly inform us of those cortical areas and time windows which are particularly relevant or even critical for fluent reading and speech. We can then extend our studies to aphasic subjects, in whom the language impairment is caused by structural lesions, and several aspects of language processing tend to be disrupted simultaneously.

02 ExpParadigms.

For this talk, I have chosen a selection of reading tasks, both silent and overt reading. I will begin by comparing cortical activity in fluent readers and dyslexic subjects during word perception and comprehension. For reading words aloud, the comparison is between fluent speakers and stutterers. Finally, I will show an example of characterizing cortical dynamics of reading in an aphasic subject.

03 DyslexiaStimuli.

In this first study, we wanted to have an overall view of silent reading of single words in dyslexic and non-impaired subjects. The dyslexic and control subjects were matched for age and level of education. These dyslexics are all adults; they have eventually learned to read reasonably fluently, but they have remained clearly slow compared with normal readers. Finnish dyslexics typically do not have particular problems with nonwords, because of the one-to-one correspondence between phonemes and graphemes.

The subjects were shown 7-8 letter Finnish words, one word at a time, every 3 seconds. We also showed nonwords, but as the results relevant for this talk were the same for all word types, I will concentrate here on real words. To keep the subjects alert, they were instructed to say aloud the rarely occurring word 'giraffe', which was not included in the analysis. Responses to about 100 stimuli were averaged to obtain a good signal-to-noise ratio.
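
As a side note on the averaging step: with stimulus-locked epochs, averaging on the order of 100 trials suppresses uncorrelated background activity by roughly the square root of the number of trials. A minimal sketch in Python, where the sampling rate, channel count, and random data are purely illustrative stand-ins:

```python
import numpy as np

sfreq = 600.0                                        # sampling rate in Hz (illustrative)
raw = np.random.randn(306, int(60 * sfreq))          # stand-in for a continuous MEG recording
word_onsets = np.arange(int(2 * sfreq),              # one word every 3 s
                        raw.shape[1] - int(2 * sfreq), int(3 * sfreq))

# Cut an epoch from -0.2 s to +1.0 s around each word onset and average.
pre, post = int(0.2 * sfreq), int(1.0 * sfreq)
epochs = np.stack([raw[:, s - pre:s + post] for s in word_onsets])
evoked = epochs.mean(axis=0)                         # averaged evoked response
```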

04 VectorView.

And here is a view of our whole-head MEG system. The magnetic sensors are arranged in the shape of a helmet. The sensors must be kept at a very low temperature to remain superconducting, and that is why they are placed in this huge thermos bottle. The subject is seated in the chair, which is lifted to bring the head close to the sensors, and the measurement can begin.

05 WordsSensorsSources.

These curves show the variation of cerebral magnetic fields over a period of 1.2 s in a fluent reader. The measurement helmet is viewed from above and flattened onto a plane, with the nose pointing upwards. Time runs along the horizontal axis and the magnetic field amplitude along the vertical axis.

Our MEG system uses planar gradiometers which detect the maximum signal directly above the active cortical areas. In each measurement location, there are two sensors, with the upper ones most sensitive to longitudinally oriented currents and the lower ones to latitudinally oriented currents.
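
For readers unfamiliar with planar gradiometers: the two orthogonal sensors at each measurement site are commonly combined into a single amplitude that peaks directly above the active cortex regardless of current orientation. A minimal sketch of that combination, where the alternating channel ordering is an assumption made only for this illustration:

```python
import numpy as np

evoked = np.random.randn(204, 720)       # stand-in averaged response: 204 gradiometers, 720 samples
longitudinal = evoked[0::2]              # channels sensitive to longitudinally oriented currents
latitudinal = evoked[1::2]               # channels sensitive to latitudinally oriented currents

# Vector sum of the two orthogonal field gradients: one amplitude per
# measurement site, independent of the orientation of the underlying current.
site_amplitude = np.sqrt(longitudinal**2 + latitudinal**2)
```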

From these signals, we determine the active brain areas and their time behaviour. Here, these seven source areas accounted for most of the activity, also indicated by the goodness-of-fit value below. After this point we can forget about the original MEG sensor outputs and concentrate on the time behaviour of these 7 active areas in the brain. The dots indicate the centres of the active areas, and the tails show the orientation of current flow. The posterior areas were active first, one after the other, for 100-200 ms. The frontal areas started to participate later and remained active for several hundred milliseconds.
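
The goodness-of-fit value mentioned above simply expresses how much of the measured field variance the source model explains. A worked sketch of that quantity, with made-up numbers in place of real sensor data:

```python
import numpy as np

def goodness_of_fit(measured, modelled):
    """Fraction of the measured field variance explained by the source model:
    1 - sum((measured - modelled)**2) / sum(measured**2)."""
    residual = measured - modelled
    return 1.0 - np.sum(residual**2) / np.sum(measured**2)

measured = np.array([10.0, -4.0, 3.0, 0.5])   # measured field values (made up)
modelled = np.array([9.5, -3.8, 2.7, 0.6])    # field predicted by the fitted sources
print(goodness_of_fit(measured, modelled))    # close to 1 for a good model
```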

A similar analysis was performed for all 14 subjects individually. To compare activations across subjects, the source areas of each subject were colour coded according to the latency at which they showed maximum activation.
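
Operationally, the colour coding just requires the peak latency of each source waveform; a minimal sketch, assuming the 200-ms bins used on the next slide:

```python
import numpy as np

sfreq = 600.0                                           # Hz, illustrative
source_waveforms = np.abs(np.random.randn(7, 720))      # 7 source time courses, 1.2 s each (stand-in)

peak_latency_ms = source_waveforms.argmax(axis=1) / sfreq * 1000.0
colour_bin = (peak_latency_ms // 200).astype(int)       # 0-200 ms, 200-400 ms, 400-600 ms, ...
```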

06 WordsSequence.

This slide summarizes the cortical activation sequence in this word reading task in fluent readers. Each dot shows the centre of an active cortical area, and these dots have been collected from all our 8 fluent readers. The different colours indicate successive 200-ms time windows. Time 0 denotes the appearance of the word on the screen. During the first 200 ms, activity concentrated around the occipital midline and the inferior occipitotemporal cortices bilaterally. Thereafter, the left superior temporal cortex and the left and right sensorimotor cortices showed activation. The left temporal signals still persisted after 400 ms.

Of course, from this study alone, we were not able to assign specific functions to these responses. At the time of this study, intracranial recordings by Nobre and others had reported a letter-string specific response in the region of the inferior temporal sulcus and fusiform gyrus at about 200 ms after stimulus onset. It was very possible that we were recording the same response here. The subsequent lateralized left temporal lobe activation could have been associated with lexical-semantic or phonological processing or both.

These were normal fluent readers. What about dyslexic subjects?

07 WordsComparison.

Here, fluent readers are shown on top and dyslexic subjects below. There were two areas, indicated by these rectangles, where fluent readers showed activity but dyslexic subjects did not. The earliest difference was found in the LEFT inferior occipitotemporal cortex, where fluent readers had a strong transient activation at about 150 ms after word onset but the dyslexic subjects did not. No differences were seen in the corresponding area in the right hemisphere. Moreover, the subsequent left temporal activation in the fluent readers was not evident in the impaired readers. Instead, dyslexic subjects showed signals in the left inferior frontal cortex, approximately in Broca's area in this same time interval.

The areas and time windows indicated by the white rectangles may thus be critical for fluent reading. But what are the functional roles of these responses?

08 TheoryVisual.

Although there are various views of the exact subprocesses needed in reading, and of their mutual dependencies, there seems to be general agreement that the visual features must be processed first, and the analysis can then proceed to the content, apparently first at the level of letters and then as whole words, which further activate the semantic and phonological representations of the words. The first experiment suggested that dyslexic readers have problems within the first 200 ms after seeing a word. Behavioural data imply that 200 ms is pretty much the borderline between pre-lexical and lexical processing. We therefore decided to take a closer look at the early visual, pre-lexical processes.

09 PerceptionStimuli.

We assessed the perceptual processes of reading by showing the subjects words, syllables, and single letters, embedded in a noisy background, at four different noise levels. As a control, the sequences also contained symbol strings. One sequence was composed of plain noise stimuli. The stimuli varied along two major dimensions: The number of features to process increased with noise and with the number of items, whether letters or symbols. On the other hand, word-likeness was highest for complete words and low for symbols and noise.

10 PerceptionFunctional.

The data showed a clear dissociation between two processes, consistently across our fluent readers: Visual feature analysis occurred at about 100 ms after stimulus presentation, with the active areas around the occipital midline. This signal increased with increasing noise and with the number of items in the string, similarly for letters and symbols. At about 150 ms, the left inferior occipitotemporal cortex showed letter-string specific activation. It increased with the visibility of the letter strings. It was strongest for words, weaker for syllables, and still weaker for single letters. Crucially, the activation was significantly stronger for letter than symbol strings.

Interestingly enough, this area is the one which differentiated between fluent readers and dyslexic subjects in the single-word reading task.

11 PerceptionFeatureSpecific.

This slide compares the stage of visual feature analysis in fluent and dyslexic readers. The source areas in fluent readers are shown on the left and those in dyslexic subjects on the right. Responses reflecting visual feature analysis were evident in 9 out of 12 controls and in 7 out of 10 dyslexic subjects. The time behaviour of these activations is plotted in the middle, fluent readers on top and dyslexic subjects below. Red indicates words with no noise and green words at the highest noise level. Orange stands for symbol strings. This response peaked at about 100 ms after stimulus onset in both subject groups. The activation was strongest to noisy stimuli and smaller but equal to letter and symbol strings with no noise. The same pattern was evident both in fluent and dyslexic readers. There were no group differences in latencies, source locations, or activation strengths. Accordingly, this early processing stage seems to function normally in our dyslexic subjects.

12 PerceptionWordSpecific.

The activations reflecting the second processing stage, letter-string specific analysis, concentrated to the left inferior occipitotemporal border in 10 of the 12 fluent readers. However, only 2 of the 10 dyslexic subjects showed a response in this same region. The time behaviour of the activations is again plotted in the middle, fluent readers on top and the two dyslexic subjects below. In the fluent readers, the signal peaked at about 150 ms after word onset. Now, the response was strongest for words with no noise, red curve, and weaker for symbol strings, orange curve. The response was weaker and also delayed for the very noisy words, plotted with green. In the two dyslexic subjects, who showed a letter-string specific activation, the signals were overall much weaker than in the fluent readers. Also, the dissociation between letter and symbol strings was not nearly as clear as in the fluent readers. In the dyslexic subjects, reading thus appears to be disrupted at the level of letter-string specific processing, about 150 ms after word presentation.

13 TheorySemantic.

Here we have the flow chart again. The two preceding experiments indicated that differences between fluent and dyslexic readers start to emerge before semantic processing. In the third experiment, we focused on the lexico-semantic processing stage.

14 SentenceStimuli.

Ten fluent readers and 8 dyslexic subjects participated in this study. To identify the cortical dynamics of reading comprehension, we employed the well-established N400 paradigm, where the subjects are shown sentences which create a very high expectation for a certain final word, and one then plays with the appropriateness of that word in the sentence context. In our 400 sentences the final word was either expected, as in 'The piano was out of tune', or rare but semantically possible, as in 'When the power went out the house became quiet', where most people would presumably expect 'dark'. Then we had two types of semantically wrong endings: The word could be totally anomalous, as in 'The pizza was too hot to sing', or something we call phonological, as in 'The gambler had a streak of bad luggage', instead of luck, where the first letters looked and sounded the same as those in the expected word. Naturally, we used Finnish sentences. The sentences were shown one word at a time and we averaged responses to the different types of final words.

15 Sentence122.

These are the responses to the different types of sentence-ending words in one fluent reader. Time 0 indicates word presentation. The displayed time interval is from 100 ms before word onset to 800 ms after it.

The visual responses were the same for all types of final words, but here over the left temporal area, the responses showed the typical N400 behaviour, reflecting semantic processing: The totally wrong endings, red and orange, resulted in a prominent deflection, peaking about 400 ms after word onset. The signal was much smaller and shorter-lasting for the rare but possible final words, plotted in green, and basically flat for the expected words.

16 SentenceModel.

And this is the source analysis in the same subject. The occipital areas were active first, one after the other, with identical responses to all word types. Thereafter, the left hemisphere became active, the posterior end of the sylvian fissure and the middle superior temporal cortex, showing strong dependence on word type. Activation of the right superior temporal cortex started about 25 ms later. The responses of all subjects were decomposed in the same way, individually, to identify those source areas which generated the N400-type semantic response.

17 SentenceLat.

The most prominent cluster of sources generating the N400 response was in the left superior temporal cortex in both groups.

These curves give the average N400 response patterns for the unexpected sentence-ending words in the left superior temporal cortex. Fluent readers are plotted on top and dyslexic subjects below. In both groups, the response was strongest for the wrong words, red and orange, and smaller and shorter-lasting for the rare but possible words, plotted in green. The expected words showed no activation exceeding the noise level.

In fluent readers, the responses to anomalous words and to the wrong words beginning with the correct letters were equally strong, suggesting that these subjects read a word as a whole and saw immediately if it was wrong. In dyslexic subjects, however, the response to the 'phonological' words, that is, to semantically wrong words beginning with the correct-looking initial letters (the orange curve), was significantly weaker than that to the anomalous words. This difference suggests either that the responses to the phonological word type were quite variable in latency or, more likely, that the dyslexics occasionally mistook the phonological word for the expected one, thus reducing the averaged response. In any case, it seems that the dyslexic subjects did not take a word in as a whole but rather advanced in smaller units.

The signals were overall smaller in dyslexic than control subjects, indicating involvement of a smaller or less synchronous neuronal population. Most importantly, the timing was significantly different. The semantic activation started at about 200 ms after word onset in controls but only at about 300 ms in the dyslexic subjects, indicating again that the presemantic stages were already affected in the dyslexic subjects.

This source area in the left temporal cortex and its time behaviour agree with the left temporal activation in the single-word reading task, which was seen in fluent readers but not in dyslexic subjects. Semantic processing is thus at least one of the processes reflected in the left temporal activation.

18 ReadingSummary.

This slide summarizes the cortical dynamics of silent reading in fluent and dyslexic readers, as revealed by our MEG studies. Results of fluent readers are plotted in light blue and those of dyslexic subjects in orange. Visual feature analysis occurred around the occipital midline, about 100 ms after word onset. This early analysis stage was indistinguishable between the two groups.

Thereafter, about 150 ms after word onset, the left inferior occipitotemporal cortex showed letter-string specific responses in fluent readers. This activation may well be the gateway from visual to linguistic analysis. The letter-string specific activation was not evident in dyslexic subjects. Recent fMRI studies of Shaywitz, Pugh and others have shown a very similar imbalance of activation between fluent and dyslexic readers.

In fluent readers, the left middle superior temporal cortex was involved in reading comprehension 200 to 600 ms after word onset, and this activation was seen also for isolated words. In dyslexic subjects, this response was entirely missing for isolated words but was evident for connected text, although weaker and delayed.

19 CorticalCorrelates.

In conclusion, a deficit in the stage of letter-string specific processing at about 150 ms after seeing a word is likely to be the immediate reason for the manifest difficulties with written words in dyslexic subjects. Is this in disagreement with the phonological interpretation of dyslexia? We don't think so. Cortical specificity to letter strings can only develop with experience. We first learn to listen to spoken words and only much later make the connection between the symbolic written words and the original phonological code. Impaired phonological processing could thus impede the formation of a specific visual word recognition unit which would automatically set letter strings apart from other objects and facilitate fast reading.

20 StutteringParadigm.

Now let us add the step of reading aloud. This study was performed in Germany, so the stimuli were common German nouns, composed of 7-8 letters. Each word was presented for 300 ms. Thereafter, there was a blank interval of 500 ms. A question mark then appeared for 2 seconds, prompting the subject to read the word aloud. There was again a blank period of 2 s before a new word was shown. We used this delayed reading paradigm to postpone muscle artefacts associated with mouth movement and to focus on the preparatory phases in speech production.

We recorded electromyography, EMG, across the opposite corners of the mouth, and the microphone signal, to determine the lip-movement and speech onset latencies. The analysis is based both on evoked responses, phase-locked to word presentation and speech onset, and on event-related modulation of background rhythmic activity. The subjects were 9 adult developmental stutterers and 10 fluent speakers.
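
The lip-movement and speech onset latencies can be extracted from the rectified EMG and microphone traces with a simple baseline-plus-threshold rule. A minimal sketch under that assumption; the smoothing window, threshold factor, and simulated burst are illustrative choices, not the published analysis:

```python
import numpy as np

def onset_latency(signal, sfreq, baseline_s, threshold_factor=3.0, smooth_ms=20):
    """First time (s) at which the rectified, smoothed signal exceeds
    the baseline mean by threshold_factor standard deviations."""
    rectified = np.abs(signal)
    win = max(1, int(smooth_ms / 1000.0 * sfreq))
    smoothed = np.convolve(rectified, np.ones(win) / win, mode="same")
    baseline = smoothed[:int(baseline_s * sfreq)]
    threshold = baseline.mean() + threshold_factor * baseline.std()
    above = np.nonzero(smoothed > threshold)[0]
    return above[0] / sfreq if above.size else None

sfreq = 1000.0
emg = 0.1 * np.random.randn(int(3 * sfreq))        # 3 s of simulated mouth EMG
emg[int(2.3 * sfreq):] += 2.0                      # fake lip-movement burst at 2.3 s
print(onset_latency(emg, sfreq, baseline_s=1.0))   # ~2.3
```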

It is important to note that as the stimuli were isolated words, the stutterers read most of them as fluently as the controls. So, behaviourally the two groups did not differ from each other in this task. But let us see what happened in the brain.

21 StutteringSequence.

This slide illustrates the average cortical activation sequence in our 10 fluent speakers. The dots show the centres of the active cortical patches collected from the different subjects, and the curves the average time behaviour of activation in these areas. The first vertical line indicates the word presentation and the second vertical line the appearance of the question mark.

The occipital and the left and right inferior occipitotemporal cortices were active within the first 200 ms. The early response close to the occipital midline is likely to reflect the visual feature analysis, which is followed by letter-string specific processing in the left inferior occipitotemporal cortex. The largely visual nature of the responses around the occipital midline and in the right inferior occipitotemporal cortex is emphasized by the second response to the question mark.

Activation in the left superior temporal and inferior parietal cortices, starting about 200 to 300 ms after word onset, and reaching the maximum at about 400 ms, a typical N400 response, probably reflects semantic processing, as we saw earlier for silent reading. Activation of the left inferior frontal cortex, approximately Broca's area, however, was not seen during silent reading in our Finnish-speaking subjects. The activity started at about 200 ms after word onset, and is likely to reflect access to phonological representation of the word for articulation. Activation of this region seemed to be rather specific to vocalized reading. It could, in principle, also reflect differences between German- and Finnish-speaking subjects but that remains to be tested.

All these responses in the two left-most columns die out before the vocalization prompt. The signals depicted in the right-most column begin at about 200 to 300 ms after word onset and persist until actual vocalization and even beyond it. This makes a lot of sense as they arise from the left and right sensorimotor and premotor cortices and apparently from the supplementary motor area.

22 StutteringComparison.

This slide illustrates the differences between stutterers, plotted in orange, and fluent speakers, plotted in light blue. The earliest difference was detected in the left sensorimotor/premotor area, where stutterers showed activation already at 100 to 200 ms after word onset but fluent speakers did not. Later, at 200 to 300 ms, the left inferior frontal cortex started to be involved in fluent speakers but not yet in stutterers. During actual vocalization, fluent speakers showed strong activation of the right motor cortex, but stutterers did not. These three areas and these time windows may thus be particularly relevant for fluent speech.

Before interpretation of these findings, let us collect some more evidence from the background rhythmic activity, which also provides important information.

23 CorticalRhythms.

At rest, cortical neurons show spontaneous rhythmic activity. Here is an 8-s interval of MEG signals recorded over the left and right sensorimotor cortex and over the posterior visual areas. The subject was sitting relaxed, with his eyes closed. The spectra show the typical frequency distribution in a healthy adult subject. The parieto-occipital activity is mainly in the 10-Hz range, called alpha rhythm, whereas the sensorimotor activity has both 10- and 20-Hz components, and is known as mu rhythm. The sensorimotor rhythm is suppressed by moving the left or right hand, as shown by the white curve, and the posterior rhythm by opening the eyes, when these areas become involved in task performance.
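
The spectra on this slide are standard power spectral density estimates; here is a minimal sketch of how such a spectrum and the 10-Hz and 20-Hz band powers could be computed with Welch's method (sampling rate and data are placeholders):

```python
import numpy as np
from scipy.signal import welch

sfreq = 600.0                                   # Hz, illustrative
rolandic = np.random.randn(int(8 * sfreq))      # 8 s of one sensorimotor channel (stand-in)

freqs, psd = welch(rolandic, fs=sfreq, nperseg=1024)
alpha_power = psd[(freqs >= 8) & (freqs <= 12)].mean()    # ~10-Hz component
beta_power = psd[(freqs >= 16) & (freqs <= 24)].mean()    # ~20-Hz component
```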

The 20-Hz rhythm of the motor cortex is particularly interesting and useful as it shows somatotopic organization: Hand movements are associated with modulation of 20-Hz activity around the hand area, toe movements with modulation close to the vertex, and mouth movements with modulation around the more lateral face representation. And this is what we are going to make use of.

24 Stutter20Hz_signals.

Now back to our word-reading task. Here we have the whole-head view of one subject, showing the mean amplitude of 20-Hz oscillations from 1 second before word onset to 5 seconds after it. There is a clear local suppression of 20-Hz activity, which is generally taken to indicate that the cortex is doing heavy task-related computation. The suppression appears to concentrate approximately over the lateral mouth areas but also more medially, over the hand area, with different time behaviours.
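
The mean amplitude of 20-Hz oscillations as a function of time can be obtained by band-pass filtering each trial around 20 Hz, taking the amplitude envelope, and averaging across trials. A minimal sketch of that procedure; the passband, trial structure, and data are assumptions for illustration:

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

sfreq = 600.0
trials = np.random.randn(80, int(6 * sfreq))     # 80 trials, -1 to +5 s around word onset (stand-in)

# Band-pass around 20 Hz (here 15-25 Hz) and take the amplitude envelope.
b, a = butter(4, [15 / (sfreq / 2), 25 / (sfreq / 2)], btype="band")
envelope = np.abs(hilbert(filtfilt(b, a, trials, axis=1), axis=1))

# Averaging across trials gives the time course of 20-Hz amplitude around word onset.
amplitude_20hz = envelope.mean(axis=0)
```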

25 Stutter20Hz_sourceTSE.

The motor cortical 20-Hz activity was generated in the hand and mouth areas of the right and left hemispheres, as is shown here on one subject's MRI.

On the right, we have the mean modulation of 20-Hz activity in the hand and mouth areas, averaged over our 10 fluent speakers. The first vertical line indicates word presentation and the second line the vocalization prompt. In the mouth areas, plotted in red, the 20-Hz rhythm was strongly suppressed as soon as the word was shown, well before vocalization, whereas the weak hand area suppression started later, with the vocalization prompt.

Note that the cortical rhythms are modulated over periods of several seconds, whereas the phase-locked evoked responses, which we looked at first, concentrate within about one second from the trigger. Now we can estimate the amount of suppression of 20-Hz activity with respect to baselevel in each subject and...

26 StutteringTSEComparison.

...this slide compares the amount of 20-Hz suppression in fluent speakers, plotted in light blue, and stutterers, plotted in orange. The coloured bars indicate suppression in the mouth area and black bars suppression in the hand area. For each group, the left-hemisphere data is on the left and the right-hemisphere data on the right. The hemispheric balance was different in the two groups: In fluent speakers, the suppression was stronger, and also earlier, in the left hemisphere but in stutterers in the right hemisphere. Furthermore, in fluent speakers, the suppression was significantly more pronounced in the mouth than hand area, as one would expect for mouth movements. In stutterers, however, also the hand areas were markedly involved in speech production, with no significant difference between hand and mouth area suppressions.
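
The suppression plotted in these bars is just the relative decrease of 20-Hz amplitude with respect to the pre-stimulus baseline; a small sketch continuing from the amplitude time course above, with illustrative time windows:

```python
import numpy as np

sfreq = 600.0
amplitude_20hz = 1.0 + np.random.rand(int(6 * sfreq))   # 20-Hz amplitude, -1 to +5 s (stand-in)

baseline = amplitude_20hz[:int(1.0 * sfreq)].mean()                # the second before word onset
task = amplitude_20hz[int(1.5 * sfreq):int(4.0 * sfreq)].mean()    # window around speech preparation

suppression_percent = 100.0 * (baseline - task) / baseline         # per subject and source area
```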

27 StutteringLeftFrontal.

So, to summarize, stutterers were mainly fluent in this simple reading task, and yet their cortical dynamics differed from that of control subjects. During the first 400 ms after seeing the word, while preparing for vocalization, fluent speakers first activated Broca's area and then the left motor cortex. However, the sequence was reversed in stutterers, as if they were initiating motor programs before they knew how to say the word.

28 StutteringRightMotor.

During actual speech production, the right motor cortex, in particular, seemed to behave abnormally in our stutterers. The stutterers did not show clear phase-locked evoked responses in the mouth sensorimotor cortex. Still, they had very strong 20-Hz suppression in this area. This suggests that the neuronal populations were heavily involved in task performance but the functional connectivity within the area was not precise enough to generate salient phase-locked responses.

Moreover, 20-Hz activity was strongly suppressed both in the hand and mouth areas. There was no clear segregation, unlike in fluent speakers, suggesting incomplete specialization between the adjacent motor representations in these stutterers.

Interestingly, the extensive and prominent 20-Hz suppression in the right motor cortex goes rather nicely with the existing PET data which have shown exceptionally strong blood flow in the right frontal cortex of stutterers.

29 StutteringConcl.

So, even when stutterers were apparently fluent, their brain activity differed from that of fluent speakers. Both hemispheres were affected. There seemed to be abnormalities in an entire network activated specifically in speech production, and not just reading, including articulatory and motor preparation in the left hemisphere, and possibly grammatical and affective prosody, which have been suggested to be controlled by left inferior frontal and right motor/premotor cortices.

The next steps for us will be to eventually determine the exact functional roles of these responses, which are likely to be particularly relevant for fluent speech, and see if these same activation patterns play a role in overt stuttering.

30 HHDescr.

In the last part of my talk, I would like to describe one case study of an aphasic subject, where we made use of the studies of reading in healthy subjects. HH is a 43-year-old man, a former construction worker, who had a haemorrhage and brain infarction in 1985. He has a huge lesion in the left hemisphere, extending from the parietal to medial frontal areas. HH shows agrammatic Broca's aphasia. Detailed analysis of his reading skills has revealed the symptom complex of deep dyslexia. Subject HH can usually read nominative word forms, but he has serious difficulties with inflected forms, which are an essential part of the Finnish language.

We had two main questions: Deep dyslexia has been suggested to reflect a shift of language processing to the right hemisphere, so we wanted to check which hemisphere was mainly involved in language comprehension in HH. Secondly, we hoped to characterize the cortical correlates of his problems with case-inflected forms.

31 HHTasks.

First, HH was tested with the sentence reading task I showed earlier to see if his N400 responses, reflecting semantic processing, agreed with those in normal subjects.

Second, HH was shown nominative and case-inflected forms in a randomized sequence. In Finnish, words have a huge number of case-inflected forms. The Finnish word leiri means a camp. Instead of adding a preposition, as in 'in a camp', the ending -ssä is attached to the word, so we get leirissä. The word was shown for 300 ms, and a question mark appeared 1 second later, prompting him to read the word aloud.

32 HHSpont.

Subjects with lesions often have much more unstable cortical activity than healthy subjects, which complicates the analysis. This slide shows 6 seconds of HH's spontaneous cortical activity. While activity over the right hemisphere looks rather normal, HH has very strong slow oscillations, at about 1 Hz, over the posterior left hemisphere. These signals generate lots of annoying disturbances in the stimulus-locked responses.

In a healthy, waking subject, one never sees oscillations below about 6 Hz. But in HH, one can even localize the cortical sources of these slow oscillations. The signals originate mostly in the posterior wall of the lesioned area.

To find the source areas of the phase-locked evoked responses properly, we first have to remove the field pattern of these slow oscillations from the signals.
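
One way to do this removal, sketched below under the assumption that the slow activity has a dominant spatial pattern, is to estimate that pattern from low-pass-filtered data and project it out of the evoked responses, in the spirit of signal-space projection:

```python
import numpy as np
from scipy.signal import butter, filtfilt

sfreq = 600.0
raw = np.random.randn(306, int(60 * sfreq))      # stand-in for the patient's recording

# 1. Estimate the dominant spatial pattern of the ~1-Hz activity.
b, a = butter(4, 3.0 / (sfreq / 2), btype="low")
slow = filtfilt(b, a, raw, axis=1)
u, s, vt = np.linalg.svd(slow, full_matrices=False)
pattern = u[:, :1]                               # spatial pattern of the strongest slow component

# 2. Project that pattern out of the averaged evoked response.
evoked = np.random.randn(306, 720)               # stand-in evoked response
projector = np.eye(raw.shape[0]) - pattern @ pattern.T
evoked_clean = projector @ evoked
```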

33 HHSentence.

In sentence reading, the cortical responses were again averaged with respect to the four different types of sentence-ending words. We found activation of the left and right superior temporal cortices. As in healthy subjects, the strongest response to semantically inappropriate final words was seen in the left superior temporal cortex, peaking at about 400 ms. The responses are messy, largely because HH really had difficulties with reading. Still, there is no question about the significantly stronger response to anomalous than expected sentence endings.

34 HHWords.

This slide shows what happens when HH has seen the question mark and tries to say the word aloud. The responses to stimuli in the nominative form are plotted in light blue and responses to case-inflected forms in orange. The two superimposed curves of the same colour are responses averaged with respect to even-numbered and odd-numbered stimuli. The responses are reliable only when the two subaverages coincide.
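
The even/odd split is a simple split-half reliability check: the same epochs are divided into two interleaved halves, averaged separately, and a response is trusted only where the two subaverages agree. A minimal sketch, with a crude per-channel correlation as the agreement measure (both the data and the measure itself are illustrative):

```python
import numpy as np

epochs = np.random.randn(100, 306, 720)          # trials x channels x time (stand-in data)

evoked_even = epochs[0::2].mean(axis=0)          # average over even-numbered stimuli
evoked_odd = epochs[1::2].mean(axis=0)           # average over odd-numbered stimuli

# Per-channel correlation of the two subaverages: high values indicate reproducible responses.
agreement = np.array([np.corrcoef(even, odd)[0, 1]
                      for even, odd in zip(evoked_even, evoked_odd)])
```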

The occipital cortex is active first. The subsequent response in the left frontal cortex, source 3, is rather interesting, because it is nicely reproducible for the easy nominative form but quite variable for the difficult case-inflected form. Thereafter, activity spreads to the right and left temporal areas.

Most interestingly, the area generating the disturbing low-frequency activity in the posterior parietal cortex shows a reproducible, strong response when HH tries to read aloud a case-inflected word, which he rarely manages to do.

So, to summarize, based on the sentence-reading task, lexico-semantic processing was apparently still subserved by the damaged left hemisphere like in normal subjects. The morphological complexity of the words had a strong effect on speech production, involving the damaged cortex which showed pathological low-frequency rhythms.