Estimating 3-dimensional liver motion using deep learning and 2-dimensional ultrasound images (2020)

Shiho Yagasaki, Norihiro Koizumi, Yu Nishiyama, Ryosuke Kondo, Tsubasa Imaizumi, Naoki Matsumoto, Masahiro Ogawa, Kazushi Numata
The University of Electro-Communications, Chofu, Japan.

OBJECTIVE: The main purpose of this study is to construct a system that tracks the tumor position during radiofrequency ablation (RFA) treatment. Existing tumor tracking systems are designed to track a tumor in a two-dimensional (2D) ultrasound (US) image; as a result, the three-dimensional (3D) motion of the organs cannot be accommodated and the ablation area may be lost. In this study, we propose a method for estimating the 3D movement of the liver as a preliminary system for tumor tracking. In addition, in current 3D movement estimation systems, the motion of different structures during RFA can reduce tumor visibility in US images, so we also aim to improve the estimation of the 3D movement of the liver by improving the liver segmentation. We propose a novel approach that estimates the relative 6-axis movement (x, y, z, roll, pitch, and yaw) between the liver and the US probe in order to estimate the overall movement of the liver.

METHODS: We used a convolutional neural network (CNN) to estimate the 3D displacement from 2D US images. To improve the accuracy of the estimation, we introduced a segmentation map of the liver region as an input to the regression network. Specifically, we improved the extraction accuracy of the liver region by using a bi-directional convolutional LSTM U-Net with densely connected convolutions (BCDU-Net).

RESULTS: With BCDU-Net, the segmentation accuracy was dramatically improved and, as a result, the accuracy of the movement estimation was also improved. The mean absolute error for the out-of-plane direction was 0.0645 mm/frame.

CONCLUSIONS: The experimental results show the effectiveness of our novel method for identifying the movement of the liver with BCDU-Net and a CNN. Precise segmentation of the liver by BCDU-Net also contributes to enhancing the performance of the liver movement estimation.
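As a rough illustration of the pipeline described in the METHODS and RESULTS above, the sketch below shows a regression CNN that takes a pair of 2D US frames together with their liver segmentation maps and outputs the relative 6-axis liver-probe motion, plus the per-axis mean absolute error used for evaluation. This is a minimal sketch assuming PyTorch: the small encoder, channel counts, layer sizes, and the two-frame input convention are illustrative assumptions, the segmentation maps are treated as given rather than produced by an actual BCDU-Net, and none of this is the authors' implementation.

```python
# Minimal sketch (assumptions: PyTorch; a generic small encoder stands in for the
# paper's regression CNN; segmentation maps are assumed to come from a separate
# segmentation model such as BCDU-Net, which is not implemented here).
import torch
import torch.nn as nn

class MotionRegressionCNN(nn.Module):
    """Regress the relative 6-axis liver-probe motion (x, y, z, roll, pitch, yaw)
    from two consecutive 2D US frames plus their liver segmentation maps."""
    def __init__(self):
        super().__init__()
        # 4 input channels: frame t, frame t+1, and the two segmentation maps.
        self.features = nn.Sequential(
            nn.Conv2d(4, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(64, 128, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.regressor = nn.Linear(128, 6)  # one 6-axis displacement per frame pair

    def forward(self, frames, seg_maps):
        x = torch.cat([frames, seg_maps], dim=1)  # (B, 4, H, W)
        x = self.features(x).flatten(1)           # (B, 128)
        return self.regressor(x)                  # (B, 6)

def per_axis_mae(pred, target):
    """Mean absolute error per axis over a batch of frame pairs."""
    return (pred - target).abs().mean(dim=0)      # shape (6,)

# Illustrative usage with random tensors standing in for real data.
model = MotionRegressionCNN()
frames = torch.rand(8, 2, 256, 256)    # two consecutive US frames per sample
seg_maps = torch.rand(8, 2, 256, 256)  # liver masks (placeholder for BCDU-Net output)
pred = model(frames, seg_maps)
mae = per_axis_mae(pred, torch.zeros_like(pred))
```

In such a setup, the out-of-plane component of the reported result (0.0645 mm/frame) would correspond to one element of the per-axis error vector, assuming displacements are expressed in millimetres per frame.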

MeSH Terms

D007091 Image Processing, Computer-Assisted
Description: A technique of inputting two-dimensional or three-dimensional images into a computer and then enhancing or analyzing the imagery into a form that is more useful to the human observer.
Entry terms: Biomedical Image Processing; Computer-Assisted Image Processing; Digital Image Processing; Image Analysis, Computer-Assisted; Image Reconstruction; Medical Image Processing; Analysis, Computer-Assisted Image; Computer-Assisted Image Analysis; Computer Assisted Image Analysis; Computer Assisted Image Processing; Computer-Assisted Image Analyses; Image Analyses, Computer-Assisted; Image Analysis, Computer Assisted; Image Processing, Biomedical; Image Processing, Computer Assisted; Image Processing, Digital; Image Processing, Medical; Image Processings, Medical; Image Reconstructions; Medical Image Processings; Processing, Biomedical Image; Processing, Digital Image; Processing, Medical Image; Processings, Digital Image; Processings, Medical Image; Reconstruction, Image; Reconstructions, Image

D008099 Liver
Description: A large lobed glandular organ in the abdomen of vertebrates that is responsible for detoxification, metabolism, synthesis and storage of various substances.
Entry terms: Livers

D008113 Liver Neoplasms
Description: Tumors or cancer of the LIVER.
Entry terms: Cancer of Liver; Hepatic Cancer; Liver Cancer; Cancer of the Liver; Cancer, Hepatocellular; Hepatic Neoplasms; Hepatocellular Cancer; Neoplasms, Hepatic; Neoplasms, Liver; Cancer, Hepatic; Cancer, Liver; Cancers, Hepatic; Cancers, Hepatocellular; Cancers, Liver; Hepatic Cancers; Hepatic Neoplasm; Hepatocellular Cancers; Liver Cancers; Liver Neoplasm; Neoplasm, Hepatic; Neoplasm, Liver

D006801 Humans
Description: Members of the species Homo sapiens.
Entry terms: Homo sapiens; Man (Taxonomy); Human; Man, Modern; Modern Man

D000075530 Organ Motion
Description: Movement of internal organs due to physiological processes.

D000077321 Deep Learning
Description: Supervised or unsupervised machine learning methods that use multiple layers of data representations generated by nonlinear transformations, instead of individual task-specific ALGORITHMS, to build and train neural network models.
Entry terms: Hierarchical Learning; Learning, Deep; Learning, Hierarchical

D000078703 Radiofrequency Ablation
Description: Removal of tissue using heat generated from electrodes delivering an alternating electrical current in the frequency of RADIO WAVES.
Entry terms: Radio Frequency Ablation; Radio-Frequency Ablation; Ablation, Radio Frequency; Ablation, Radio-Frequency; Ablation, Radiofrequency

D014463 Ultrasonography
Description: The visualization of deep structures of the body by recording the reflections or echoes of ultrasonic pulses directed into the tissues. Use of ultrasound for imaging or diagnostic purposes employs frequencies ranging from 1.6 to 10 megahertz.
Entry terms: Echography; Echotomography; Echotomography, Computer; Sonography, Medical; Tomography, Ultrasonic; Ultrasonic Diagnosis; Ultrasonic Imaging; Ultrasonographic Imaging; Computer Echotomography; Diagnosis, Ultrasonic; Diagnostic Ultrasound; Ultrasonic Tomography; Ultrasound Imaging; Diagnoses, Ultrasonic; Diagnostic Ultrasounds; Imaging, Ultrasonic; Imaging, Ultrasonographic; Imaging, Ultrasound; Imagings, Ultrasonographic; Imagings, Ultrasound; Medical Sonography; Ultrasonic Diagnoses; Ultrasonographic Imagings; Ultrasound, Diagnostic; Ultrasounds, Diagnostic

D016571 Neural Networks, Computer
Description: A computer architecture, implementable in either hardware or software, modeled after biological neural networks. Like the biological system in which the processing capability is a result of the interconnection strengths between arrays of nonlinear processing nodes, computerized neural networks, often called perceptrons or multilayer connectionist models, consist of neuron-like units. A homogeneous group of units makes up a layer. These networks are good at pattern recognition. They are adaptive, performing tasks by example, and thus are better for decision-making than are linear learning machines or cluster analysis. They do not require explicit programming.
Entry terms: Computational Neural Networks; Connectionist Models; Models, Neural Network; Neural Network Models; Neural Networks (Computer); Perceptrons; Computational Neural Network; Computer Neural Network; Computer Neural Networks; Connectionist Model; Model, Connectionist; Model, Neural Network; Models, Connectionist; Network Model, Neural; Network Models, Neural; Network, Computational Neural; Network, Computer Neural; Network, Neural (Computer); Networks, Computational Neural; Networks, Computer Neural; Networks, Neural (Computer); Neural Network (Computer); Neural Network Model; Neural Network, Computational; Neural Network, Computer; Neural Networks, Computational; Perceptron
