Machine Learning Streamlines Neuroimaging Data Analysis

Expressway: A new technique maps image processing pipelines likely to yield accurate results (red) against those that are less promising (blue).

A new computer-guided method can direct researchers to the most promising techniques for processing their neuroimaging data. The approach could help solve the problem of low reproducibility in autism brain imaging studies, according to a new paper.

So-called “pipelines” for analyzing brain imaging data can differ in how they remove noise, divide the brain into sections, and define connectivity, among other variables. “A lot of these design analysis decisions can actually change the results in all sorts of ways. So we end up with this really fragmented literature,” says lead researcher Robert Leech, professor of neuroimaging analysis at King’s College London in the UK.

But when researchers collect new brain images for analysis, it’s not always immediately clear which pipeline is most appropriate. Some researchers stick to the same pipeline no matter what. Others run many different pipelines, but this approach requires a lot of computational power, can lead to statistical issues from analyzing the same data multiple times, and can yield results that are “over-fit” to a small amount of data and do not generalize.

Leech and his colleagues developed a two-step computational process to plot the way out of this quagmire.

In the first step, for a given set of brain images, a well-established algorithm creates a map of all the different analysis pipelines that could be used. Pipelines that tend to produce similar results are grouped together on the map, regardless of the techniques they employ.
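To make the first step concrete, here is a minimal sketch of how such a pipeline map could be built: summarize each pipeline’s output as a vector, then embed the pipelines in two dimensions so that those producing similar results sit close together. The random stand-in data and the choice of multidimensional scaling (MDS) are assumptions for illustration, not necessarily the exact embedding the authors used.

```python
# Sketch of step 1: place candidate pipelines on a 2-D "map" based on
# how similar their outputs are. Illustrative only; the pipeline outputs
# below are random stand-ins and MDS is an assumed embedding method.
import numpy as np
from sklearn.manifold import MDS

rng = np.random.default_rng(0)

n_pipelines = 544   # number of candidate analysis pipelines
n_features = 100    # summary of each pipeline's output (e.g., flattened connectivity values)

# One row per pipeline: a stand-in for that pipeline's output on the dataset.
pipeline_outputs = rng.normal(size=(n_pipelines, n_features))

# Embed pipelines in two dimensions; similar outputs end up near each other.
embedding = MDS(n_components=2, random_state=0).fit_transform(pipeline_outputs)

print(embedding.shape)  # (544, 2): one (x, y) map coordinate per pipeline
```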

Next, a machine-learning algorithm tests the performance of a few pipelines from different regions of the map on a subset of the data to home in on the most promising ones. Based on this sampling, the algorithm estimates the performance of other nearby pipelines and identifies the most accurate clusters.
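The sketch below illustrates one way this second step could work, building on the map coordinates from the previous example: evaluate a handful of pipelines on a data subset, fit a surrogate model over the map, and use its predictions to decide which pipeline to test next. The Gaussian process surrogate and the upper-confidence-bound selection rule are assumptions for illustration; the study’s actual active-learning setup may differ.

```python
# Sketch of step 2: estimate how unsampled pipelines would perform from a
# few evaluated ones, using their positions on the 2-D map. Illustrative
# only; the surrogate model and selection rule are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(1)

# 2-D map coordinates from the first step (random stand-ins here).
embedding = rng.normal(size=(544, 2))

# Suppose 20 pipelines scattered across the map were run on a data subset,
# each yielding a performance score (e.g., prediction accuracy).
sampled_idx = rng.choice(len(embedding), size=20, replace=False)
sampled_scores = rng.uniform(0.5, 0.9, size=20)  # stand-in accuracies

# Fit a surrogate on the sampled pipelines' map coordinates.
gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), alpha=1e-3)
gp.fit(embedding[sampled_idx], sampled_scores)

# Predict performance (with uncertainty) for every pipeline on the map.
mean, std = gp.predict(embedding, return_std=True)

# Choose the next pipeline to evaluate: high predicted score, high uncertainty.
acquisition = mean + 1.0 * std
next_pipeline = int(np.argmax(acquisition))
print(f"Evaluate pipeline {next_pipeline} next "
      f"(predicted score {mean[next_pipeline]:.2f} +/- {std[next_pipeline]:.2f})")
```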

“You need a dataset, and it searches through many possible analyses in a small section of it, and then it can take the consequence of that analysis to inform an analysis of the larger” portion of the data, says Jonathan Smallwood, a member of the study team and professor of psychology at Queen’s University in Kingston, Ontario, Canada. “This approach is very smart because it just means you never have to compromise.”

The work was published in June in Nature Communications, and the researchers made the code freely available online.

Depending on the goal of a study, researchers can use the method either in an exploratory mode, to map the pipeline possibilities in depth, or in a targeted mode, to identify the best-performing pipeline from the fewest possible samples. The basic approach could be used to answer a variety of different functional and structural imaging questions, the researchers say.
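One way to picture the difference between the two modes, continuing the surrogate sketch above, is as two different selection rules: exploratory use samples wherever the model is most uncertain, so the whole map gets filled in, while targeted use goes straight for the highest predicted performers. The rules below are illustrative assumptions, not the paper’s exact acquisition functions.

```python
# Two selection rules over the surrogate's predictions, loosely mirroring
# the exploratory vs. targeted uses described above (illustrative only).
import numpy as np

def next_pipeline(mean: np.ndarray, std: np.ndarray, mode: str = "exploratory") -> int:
    """Return the index of the next pipeline to evaluate on the full data."""
    if mode == "exploratory":
        # Map the space: sample where the surrogate is most uncertain.
        return int(np.argmax(std))
    # Targeted: evaluate the pipeline with the highest predicted performance.
    return int(np.argmax(mean))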

The method is an “elegant” solution to the problem of choosing a processing pipeline, says Annika Linke, assistant research professor at the Brain Development Imaging Laboratories at San Diego State University in California, who was not involved in the study. “It is a principled and transparent approach to identifying how different analysis choices relate and impact results that, given the openly shared code, could easily be applied to future studies.”

To demonstrate its capabilities, Leech and his team analyzed two existing functional connectivity datasets.

In an analysis of 520 scans of 298 people between the ages of 14 and 26, the researchers used their method to sift through 544 different pipelines and identify which best predicted a person’s age. In another analysis, they fed data from 406 autistic and 476 neurotypical people from the Autism Brain Imaging Data Exchange through 384 different analysis pipelines to identify which best distinguished the two groups.

The purpose of the exercise was to show how the method works and how it can be used, not to identify the best pipeline in each case. (There is no such thing, say the researchers.)

Yet the analysis revealed some intriguing patterns. For the autism dataset, for example, the way a pipeline split the brain into smaller regions didn’t significantly affect the results, but the definitions of connectivity did.

There are still many reasons, such as age differences between cohorts and the heterogeneity of autism, why autism brain imaging studies could yield disparate results that this method cannot address, Linke cautions. But, she says, the approach is promising enough that she’s considering applying it to one of her lab’s datasets.

Cite this article: https://doi.org/10.53053/VYIR9955