Research News

New Machine Learning Algorithm Performs Medical Image Registration 1,000 Times Faster

MIT researchers create a new algorithm for MRI scan registration that uses a convolutional neural network to learn from each alignment

By Meeri Kim, Contributor
Mon, 07/16/2018


When a patient receives MRI or CT scans before and after a given treatment, physicians need a way to accurately compare the two sets of images. Due to differences in body orientation and position in these scans, a process known as image registration is needed to align the images using the same coordinate system for a composite view of the patient.
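To make the idea concrete, here is a minimal, illustrative sketch of what registration means: find the spatial transformation that best lines a "moving" image up with a "fixed" one. Real medical registration uses far richer, non-rigid transformations over millions of voxels; this toy version (not from the paper) just searches integer translations of a tiny 2D image, minimizing the sum of squared differences.

```python
import numpy as np

def register_translation(fixed, moving, max_shift=3):
    """Toy rigid registration: exhaustively try integer (dy, dx) shifts
    of `moving` and return the shift minimizing the sum of squared
    differences against `fixed`."""
    best_shift, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(moving, dy, axis=0), dx, axis=1)
            err = np.sum((fixed - shifted) ** 2)
            if err < best_err:
                best_err, best_shift = err, (dy, dx)
    return best_shift

# A tiny synthetic "scan": a bright square, and the same square displaced,
# standing in for before/after images of the same anatomy.
fixed = np.zeros((16, 16))
fixed[4:8, 4:8] = 1.0
moving = np.roll(np.roll(fixed, 2, axis=0), -1, axis=1)

print(register_translation(fixed, moving))  # recovers (-2, 1)
```

Even this brute-force search over a handful of shifts hints at why full 3D, deformable registration of million-voxel scans takes hours: the space of plausible transformations is vastly larger.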

Image registration for medical imaging can often take two hours or more, as computer algorithms try to match up to a million pixels in the scans. Researchers from the Massachusetts Institute of Technology have developed a new algorithm for medical image registration that employs machine learning and runs thousands of times faster than the current method, while being just as accurate. The results were presented at the Conference on Computer Vision and Pattern Recognition (CVPR) held in Salt Lake City in June.

“Traditional algorithms for medical image registration are prohibitively slow, making it unlikely that they will be used in many clinical settings,” said first author Guha Balakrishnan, a graduate student in MIT’s Computer Science and Artificial Intelligence Laboratory. “With a recent increase in available medical image data, this runtime is also slowing down clinical research.”

Balakrishnan and his colleagues worked with clinicians at Massachusetts General Hospital on a project that required intensive processing of thousands of MRI scans in order to gain insight into cerebrovascular disease. To streamline the analysis, they created a machine learning algorithm called VoxelMorph that takes advantage of this large volume of data and yields very fast image registration.

Traditional image registration algorithms discard all image information after each alignment and never learn over time. VoxelMorph, on the other hand, is powered by a convolutional neural network that continually learns from the voxels it processes and optimizes its parameters for future scans. The researchers initially trained the algorithm on 7,000 publicly available MRI scans and then tested it on 250 novel images.
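The core operation in this class of networks is warping: the network predicts a per-voxel displacement field, and a differentiable resampling layer applies it to the moving image. The snippet below is a minimal 2D, nearest-neighbor illustration of that warping step only; it is not VoxelMorph's implementation (which predicts the field with a CNN and uses smooth interpolation in 3D), and the constant hand-made flow field here is purely for demonstration.

```python
import numpy as np

def warp(moving, flow):
    """Warp a 2D image by a per-pixel displacement field (nearest-neighbor).
    `flow` has shape (2, H, W): flow[0] gives the row offset and flow[1]
    the column offset from which each output pixel samples."""
    h, w = moving.shape
    rows, cols = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    src_r = np.clip(np.round(rows + flow[0]).astype(int), 0, h - 1)
    src_c = np.clip(np.round(cols + flow[1]).astype(int), 0, w - 1)
    return moving[src_r, src_c]

# A constant flow of (+1 row, +2 columns) makes every output pixel sample
# from one row down and two columns right, shifting content up and left.
img = np.zeros((6, 6))
img[3, 4] = 1.0
flow = np.stack([np.ones((6, 6)), 2 * np.ones((6, 6))])
out = warp(img, flow)
print(np.argwhere(out == 1.0))  # the bright pixel lands at (2, 2)
```

In a learning-based registration setup, a network would output `flow` from the fixed and moving images, and training would minimize the dissimilarity between `warp(moving, flow)` and the fixed image, so each alignment improves the parameters used on future scans.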

VoxelMorph demonstrated registration accuracy comparable to state-of-the-art image registration techniques, while completing the tasks over a thousand times faster. The significantly shortened runtime can dramatically impact analysis, and potentially assist at the point of care for clinicians.

“It is often useful to align medical images of a particular patient taken before and after a surgery or treatment, to assess the effect of the procedure,” said Balakrishnan. “Whereas previous methods were prohibitively slow, the short VoxelMorph runtime promises to enable this comparison as soon as the scan is acquired. This is a direction of future work for us.”

Jorge Cardoso, a senior lecturer in the Department of Imaging and Biomedical Engineering at King's College London, finds the results intriguing and expects VoxelMorph to have a positive impact on clinical applications. He notes that the current work builds on earlier research by several other groups, in which image registration is posed as a multidimensional regression problem, but that additional developments are still needed before such methods can be put into practice.

“It’s a very interesting paper, with great results, and that can significantly speed up the process of image registration,” said Cardoso. “The impact of this class of methods [which includes VoxelMorph] will be significant mostly in the domain of image-guided interventions, as it is the primary area of application where speed is paramount, but several other scientific and technical developments will be necessary to allow these methods to be used accurately and robustly in clinical applications.”