
How to conduct small sample studies in an era of Big Data?

April 15, 2024
4:00 pm
San Francesco Complex - Sagrestia

2nd seminar

 

Significant concerns about the replicability of neuroimaging findings are pushing the field towards ever larger sample sizes through data pooling efforts (e.g., ENIGMA https://enigma.ini.usc.edu/) or multi-site studies (e.g., Adolescent Brain Cognitive Development https://abcdstudy.org/, Healthy Brain and Child Development https://hbcdstudy.org/). Although many may welcome this trend, likening it to the changes that occurred in genetics research, it raises the question of how individual researchers with limited resources should conduct their research, especially when their focus is on clinical samples, which typically cannot be recruited in the hundreds or thousands. This presentation will discuss a number of strategies to consider. One is to adopt the neuroimaging protocol of a larger study (e.g., ABCD) for the new, smaller study; having the "normative" data of the larger study can benefit the newer study in many ways. Another approach employs "meta-matching," wherein a classifier trained on the larger dataset can be usefully deployed on the smaller dataset even when the outcome measures differ between the two. Other strategies include maximizing the yield of the new, smaller study with, for example, experimental manipulations that give larger effect sizes, task probes with improved reliability, deeper phenotyping and denser temporal sampling with repeated scans, adaptive task-design algorithms that optimize data yield, and improved data analysis that extracts more reliable and robust measures from the neuroimaging data.
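The following is a minimal, hedged sketch of the "meta-matching" idea mentioned above, not the speaker's actual implementation: a model trained to predict many phenotypes in a large dataset is reused as a feature extractor for a small study whose outcome measure was never seen during training. All array sizes and the random data are placeholders, and simple ridge regression stands in for whatever model the large study would actually use.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Large "normative" dataset: imaging-derived features and many phenotypes.
X_big = rng.standard_normal((5000, 400))   # 5000 subjects x 400 features
Y_big = rng.standard_normal((5000, 30))    # 30 behavioural phenotypes

# Small clinical study with a *different* outcome measure.
X_small = rng.standard_normal((80, 400))   # 80 subjects
y_small = rng.standard_normal(80)          # new clinical outcome

# 1) Train a multi-output model on the large dataset.
base_model = Ridge(alpha=10.0).fit(X_big, Y_big)

# 2) Use its predicted phenotypes as low-dimensional "meta-features"
#    for the small study.
Z_small = base_model.predict(X_small)      # shape: 80 x 30

# 3) Fit a simple model on the small dataset and evaluate it with
#    cross-validation.
scores = cross_val_score(Ridge(alpha=1.0), Z_small, y_small,
                         cv=5, scoring="r2")
print("cross-validated R^2:", scores.mean())
```

With real data, the benefit comes from step 2: the large study's model compresses the high-dimensional imaging features into a handful of behaviourally meaningful dimensions, which a small clinical sample can then relate to its own outcome without overfitting.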

 

Join at: imt.lu/sagrestia

Speaker:
Hugh Patrick Garavan, University of Vermont
Units: 
MOMILAB