The "Narratives" fMRI dataset for evaluating models of naturalistic language comprehension. 2021

Samuel A Nastase, Yun-Fei Liu, Hanna Hillman, Asieh Zadbood, Liat Hasenfratz, Neggin Keshavarzian, Janice Chen, Christopher J Honey, Yaara Yeshurun, Mor Regev, Mai Nguyen, Claire H C Chang, Christopher Baldassano, Olga Lositsky, Erez Simony, Michael A Chow, Yuan Chang Leong, Paula P Brooks, Emily Micciche, Gina Choe, Ariel Goldstein, Tamara Vanderwal, Yaroslav O Halchenko, Kenneth A Norman, and Uri Hasson
Princeton Neuroscience Institute and Department of Psychology, Princeton University, Princeton, NJ, USA. sam.nastase@gmail.com.

The "Narratives" collection aggregates a variety of functional MRI datasets collected while human subjects listened to naturalistic spoken stories. The current release includes 345 subjects, 891 functional scans, and 27 diverse stories of varying duration totaling ~4.6 hours of unique stimuli (~43,000 words). This data collection is well-suited for naturalistic neuroimaging analysis, and is intended to serve as a benchmark for models of language and narrative comprehension. We provide standardized MRI data accompanied by rich metadata, preprocessed versions of the data ready for immediate use, and the spoken story stimuli with time-stamped phoneme- and word-level transcripts. All code and data are publicly available with full provenance in keeping with current best practices in transparent and reproducible neuroimaging.

MeSH Terms

D007802 Language
  Description: A verbal or nonverbal means of communicating ideas or feelings.
  Entry terms: Dialect; Dialects; Languages

D008279 Magnetic Resonance Imaging
  Description: Non-invasive method of demonstrating internal anatomy based on the principle that atomic nuclei in a strong magnetic field absorb pulses of radiofrequency energy and emit them as radiowaves which can be reconstructed into computerized images. The concept includes proton spin tomographic techniques.
  Entry terms: Chemical Shift Imaging; MR Tomography; MRI Scans; MRI, Functional; Magnetic Resonance Image; Magnetic Resonance Imaging, Functional; Magnetization Transfer Contrast Imaging; NMR Imaging; NMR Tomography; Tomography, NMR; Tomography, Proton Spin; fMRI; Functional Magnetic Resonance Imaging; Imaging, Chemical Shift; Proton Spin Tomography; Spin Echo Imaging; Steady-State Free Precession MRI; Tomography, MR; Zeugmatography; Chemical Shift Imagings; Echo Imaging, Spin; Echo Imagings, Spin; Functional MRI; Functional MRIs; Image, Magnetic Resonance; Imaging, Magnetic Resonance; Imaging, NMR; Imaging, Spin Echo; Imagings, Chemical Shift; Imagings, Spin Echo; MRI Scan; MRIs, Functional; Magnetic Resonance Images; Resonance Image, Magnetic; Scan, MRI; Scans, MRI; Shift Imaging, Chemical; Shift Imagings, Chemical; Spin Echo Imagings; Steady State Free Precession MRI

D008297 Male
  Entry terms: Males

D008875 Middle Aged
  Description: An adult aged 45-64 years.
  Entry terms: Middle Age

D001931 Brain Mapping
  Description: Imaging techniques used to colocalize sites of brain functions or physiological activity with brain structures.
  Entry terms: Brain Electrical Activity Mapping; Functional Cerebral Localization; Topographic Brain Mapping; Brain Mapping, Topographic; Functional Cerebral Localizations; Mapping, Brain; Mapping, Topographic Brain

D005260 Female
  Entry terms: Females

D006801 Humans
  Description: Members of the species Homo sapiens.
  Entry terms: Homo sapiens; Man (Taxonomy); Human; Man, Modern; Modern Man

D000293 Adolescent
  Description: A person 13 to 18 years of age.
  Entry terms: Adolescence; Youth; Adolescents; Adolescents, Female; Adolescents, Male; Teenagers; Teens; Adolescent, Female; Adolescent, Male; Female Adolescent; Female Adolescents; Male Adolescent; Male Adolescents; Teen; Teenager; Youths

D000328 Adult
  Description: A person having attained full growth or maturity. Adults are of 19 through 44 years of age. For a person between 19 and 24 years of age, YOUNG ADULT is available.
  Entry terms: Adults

D001330 Electronic Data Processing
  Description: Applications that store and process large quantities of data.
  Entry terms: Automatic Data Processing; Bar Codes; Computer Data Processing; Data Processing, Automatic; Information Processing, Automatic; Optical Readers; Information Processing; Automatic Information Processing; Bar Code; Codes, Bar; Data Processing, Computer; Data Processing, Electronic; Optical Reader; Processing, Automatic Data; Processing, Automatic Information; Processing, Computer Data; Processing, Electronic Data; Processing, Information

Related Publications

February 2020, Neuropsychologia
August 2023, Scientific Data
September 2010, Brain Research Reviews
January 2019, Frontiers in Psychology
May 2024, Scientific Data
January 2020, Topics in Cognitive Science
August 2023, Scientific Data
June 2022, Scientific Data
January 1981, Journal of Psycholinguistic Research
October 2011, Brain and Language