Supplemental web site in support of the paper:

Predicting Human Brain Activity Associated with the Meanings of Nouns,
Tom M. Mitchell, Svetlana V. Shinkareva, Andrew Carlson, Kai-Min Chang, Vicente L. Malave, Robert A. Mason, Marcel Adam Just,
Science, May 30, 2008.


Science 2008 fMRI data

This page contains the data, software, and documentation for the fMRI data set used in the Science 2008 paper. The data were originally collected by Marcel Just and his colleagues at Carnegie Mellon University's CCBI.

The Experiment:

Participants: Nine right-handed adults (5 female, ages 18 to 32) from the Carnegie Mellon community participated and gave informed consent approved by the University of Pittsburgh and Carnegie Mellon Institutional Review Boards. Two additional participants were excluded from the analysis due to head motion greater than 2.5 mm.

Experimental paradigm: The stimuli were line drawings and noun labels of 60 concrete objects from 12 semantic categories, with 5 exemplars per category. Most of the line drawings were taken or adapted from the Snodgrass and Vanderwart (1980) set; the others were drawn in a similar style.

To ensure that each participant had a consistent set of properties to think about, participants were asked to generate and write down a set of properties for each exemplar (such as "cold, knights, stone" for castle) in a separate session prior to scanning. However, nothing was done to elicit consistency across participants.

The entire set of 60 stimuli was presented 6 times during the scanning session, in a different random order each time. Participants silently viewed the stimuli and were asked to think of the same item properties consistently across the 6 presentations. Each stimulus was presented for 3 s, followed by a 7 s rest period during which participants were instructed to fixate on an X displayed in the center of the screen. There were two additional fixation presentations, 31 s each, at the beginning and end of each session, to provide a baseline measure of activity.
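Simple arithmetic over the timing parameters above gives the total session length (a sketch with our own variable names, not the authors' code):

```python
# Session timing from the paradigm described above.
n_stimuli = 60          # distinct objects
n_presentations = 6     # repetitions of the full stimulus set
stim_s = 3              # stimulus duration (s)
rest_s = 7              # fixation rest after each stimulus (s)
extra_fixation_s = 31   # long fixation blocks (s), one at start, one at end

trial_s = stim_s + rest_s                       # 10 s per trial
task_s = n_stimuli * n_presentations * trial_s  # 3600 s of trials
total_s = task_s + 2 * extra_fixation_s         # 3662 s, roughly 61 minutes
print(total_s)
```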

Data acquisition: Functional images were acquired on a Siemens Allegra 3.0T scanner (Siemens, Erlangen, Germany) at the Brain Imaging Research Center of Carnegie Mellon University and the University of Pittsburgh, using a gradient echo EPI pulse sequence with TR = 1000 ms, TE = 30 ms, and a 60° flip angle. Seventeen 5-mm thick oblique-axial slices were imaged with a 1-mm gap between slices. The acquisition matrix was 64 x 64, with 3.125 x 3.125 x 5-mm voxels.
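As a sanity check on these acquisition parameters (simple arithmetic, our own, not from the paper):

```python
# Derived geometry from the acquisition parameters above.
matrix = 64          # acquisition matrix (64 x 64)
in_plane_mm = 3.125  # in-plane voxel size (mm)
slice_mm = 5         # slice thickness (mm)
gap_mm = 1           # inter-slice gap (mm)
n_slices = 17        # oblique-axial slices

fov_mm = matrix * in_plane_mm                             # 200 mm field of view
slab_mm = n_slices * slice_mm + (n_slices - 1) * gap_mm   # 101 mm slab coverage
print(fov_mm, slab_mm)
```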

Data processing and analysis: Data processing and statistical analysis were performed with Statistical Parametric Mapping software (SPM99, Wellcome Department of Cognitive Neurology, London, UK). The data were corrected for slice timing, motion, and linear trend, and were temporally smoothed with a high-pass filter using a 190-s cutoff. The data were spatially normalized to the MNI template brain image using a 12-parameter affine transformation and resampled to 3 x 3 x 6 mm voxels. The percent signal change (PSC) relative to the fixation condition was computed for each object presentation at each voxel. The mean of the four images (mean PSC) acquired within a 4-s window, offset 4 s from stimulus onset to account for the delay in hemodynamic response, provided the main input measure.
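The mean-PSC computation can be sketched as follows (a minimal illustration with synthetic numbers, not the authors' code; with TR = 1000 ms, the 4-s window offset 4 s from onset corresponds to the 5th through 8th images after stimulus onset):

```python
import numpy as np

def mean_psc(timeseries, onset_idx, baseline):
    """Mean percent signal change for one voxel and one presentation.

    timeseries : 1-D array of raw signal values for the voxel (one per TR)
    onset_idx  : image index of stimulus onset
    baseline   : mean fixation-condition signal for this voxel

    Averages the 4 images acquired 4-8 s after onset, accounting
    for the hemodynamic delay, then expresses the result as a
    percent change relative to the fixation baseline.
    """
    window = timeseries[onset_idx + 4 : onset_idx + 8]
    return 100.0 * (window.mean() - baseline) / baseline

# Synthetic example: baseline of 1000, response peaking around 1022.
signal = np.array([1000, 1000, 1000, 1000, 1005, 1018, 1022, 1021, 1019, 1010],
                  dtype=float)
print(mean_psc(signal, 0, 1000.0))  # mean of [1005, 1018, 1022, 1021] -> 1.65
```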

The Data:

The data consist of .mat files that can be loaded directly into Matlab version 6 or higher. The data file for each human subject is approximately 100 MB, so make sure you are on a high-speed connection before attempting to download these. You might want to begin by downloading the data for just one subject.

  1. data for subject P1
  2. data for subject P2
  3. data for subject P3
  4. data for subject P4
  5. data for subject P5
  6. data for subject P6
  7. data for subject P7
  8. data for subject P8
  9. data for subject P9
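Outside Matlab, .mat files of this vintage can also be read with SciPy. The round trip below shows the loading pattern; the variable name 'data' here is our own illustration, so check the documentation for the actual structure of the subject files:

```python
import numpy as np
from scipy.io import savemat, loadmat

# Write a small .mat file, then read it back; the subject files
# load the same way (loadmat returns a dict of name -> array).
savemat('example.mat', {'data': np.arange(6).reshape(2, 3)})
mat = loadmat('example.mat')
print(mat['data'].shape)  # (2, 3)
```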

The Documentation:

The documentation includes a description of the data structures used to represent the data, and a set of functions for converting voxel coordinates into MNI coordinates and AAL labels. For a demonstration of how to load the data and convert voxel coordinates, see the provided demo.m.
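Voxel-to-MNI conversion is, in general, an affine transform of homogeneous voxel coordinates. The sketch below illustrates the idea only; the 4x4 matrix is hypothetical, and the real mapping for this data set comes from the conversion functions in the documentation:

```python
import numpy as np

# Hypothetical 4x4 affine mapping voxel indices (i, j, k) to MNI
# millimetre coordinates. This matrix is made up for illustration;
# use the documentation's conversion functions for the actual data.
affine = np.array([
    [3.0, 0.0, 0.0,  -78.0],
    [0.0, 3.0, 0.0, -112.0],
    [0.0, 0.0, 6.0,  -50.0],
    [0.0, 0.0, 0.0,    1.0],
])

def voxel_to_mni(i, j, k):
    """Apply the affine to homogeneous voxel coordinates (i, j, k, 1)."""
    x, y, z, _ = affine @ np.array([i, j, k, 1.0])
    return x, y, z

print(voxel_to_mni(0, 0, 0))  # -> (-78.0, -112.0, -50.0)
```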

Analyses of these data are described in the paper cited at the top of this page.

Questions or comments: contact Tom Mitchell or Kai-min Kevin Chang
Last modified on Mon Sep 21 10:27:35 EDT 2009 by Kai-min