
News

Professor Sung-Hoon Kwon developed a ‘cancer network’ using AI technology for the development of next-generation cancer diagnostic indicators. (Munhwa Ilbo, 2023.01.04)

January 4, 2023




- Research by Sung-Hoon Kwon of the SNU Dept. of Electrical and Computer Engineering and Kyung-Cheol Moon of the SNU College of Medicine

Interactions among cancerous tissue cells visualized into a map.

Patients’ 5-year survival rates accurately predicted using graph-based deep learning.

“Can also be applied to MRIs and X-rays…
will be a great help to the field of medical imaging data analysis.”

 

Artificial Intelligence (AI) matches or even surpasses human capabilities in two representative fields: Natural Language Processing (NLP) and image processing. NLP enables computers to comprehend everyday human conversation. It is the technology behind AI assistants such as Siri, Bixby, and Alexa, which carry out commands from smart-device owners, and behind chatbots that handle citizen inquiries and counseling calls on behalf of banks and government offices. Voice interfaces that control machines through speech, Text-to-Speech (TTS), which reads text aloud in human-like voices, and Speech-to-Text (STT), which converts spoken language into written text, are also widely deployed, including for cross-lingual communication and translation.

AlphaGo's 2016 victory over the world champion of Go astounded the world and was a revolutionary development in the field of image processing. By converting the 19-by-19 grid of a Go board into the coordinates of its 361 intersections, the machine was trained on images of winning game records. The Go AI learned the patterns that lead to victory from countless game records and became skilled enough to outperform human players. Image processing technology has since gained significant attention in medicine in particular: with vast numbers of images accumulated as medical data from devices such as X-ray, CT, MRI, ultrasound, and endoscopy, the technology is becoming invaluable in the medical field.

Korean engineers, in collaboration with medical professionals, developed a next-generation cancer diagnostic indicator that uses AI to identify "cancer cell networks" within cancer tissue images. This approach predicts a patient's 5-year survival rate with significantly improved accuracy. A "cancer cell network" is a map-like visual representation of the interactions among cells within cancerous tissue. The engineers applied graph theory, with individual cells serving as nodes and their relationships as links, to implement graph deep learning, a term for training AI on graphs. Until now, deep learning technology focused only on learning and interpreting the appearance of localized cancer cells. For more precise prediction of cancer patient survival rates, however, understanding the relationships between different types of cells, such as cancer cells, immune cells, and vascular cells, is essential.
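The cells-as-nodes, relationships-as-links idea can be illustrated with a minimal sketch. The function name, the toy centroids, and the distance threshold below are all illustrative assumptions, not the research team's actual code: cells become graph nodes, and an edge links any two cells closer than a chosen distance.

```python
import math

def build_cell_network(centroids, max_dist=3.0):
    """Build a toy 'cell network': nodes are cell centroids, and an
    edge links two cells whose Euclidean distance is <= max_dist.
    Returns an adjacency list {node_index: set(neighbor_indices)}."""
    adj = {i: set() for i in range(len(centroids))}
    for i, (xi, yi) in enumerate(centroids):
        for j in range(i + 1, len(centroids)):
            xj, yj = centroids[j]
            if math.hypot(xi - xj, yi - yj) <= max_dist:
                adj[i].add(j)
                adj[j].add(i)
    return adj

# Hypothetical centroids: three nearby cells and one distant cell.
cells = [(0.0, 0.0), (1.0, 1.0), (2.0, 0.5), (10.0, 10.0)]
network = build_cell_network(cells)
print(network[0])  # neighbors of cell 0
```

In a real system the nodes would carry cell-type features (cancer, immune, vascular) and the graph would be fed to a graph neural network; this sketch only shows the graph-construction idea itself.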

The interactions among cells within cancerous tissue are referred to as the cancer microenvironment. Immunotherapy, the forefront of cancer treatment, is so deeply affected by the cancer microenvironment that its success is determined by this environment. As such, the network itself becomes a diagnostic indicator for predicting survival. Deriving the cancer microenvironment in a medically interpretable manner requires extensive data-based validation, but having medical professionals perform this directly is impractical at such scale, so the introduction of AI deep learning technology is crucial. Yet because earlier models focused on the appearance of localized cancer cells and could not reflect the cancer microenvironment, AI diagnoses of cancer tissue were inconsistent with medical professionals' diagnostic methods.

Professor Sung-Hoon Kwon of the Department of Electrical and Computer Engineering at Seoul National University, in joint research with Professors Kyung-Cheol Moon and Jung-Hwan Park of Seoul National University College of Medicine, has revealed a novel diagnostic indicator. In this study, the team represented cancer tissue images as a graph of cellular interactions, the "cancer cell network," and employed graph-based deep learning that medical professionals can interpret. The joint research team created a cancer cell network that captures not only the shape of cancer cells within cancerous tissue but also the interactions between cells, and developed, for the first time in the world, graph-based deep learning capable of simultaneously learning and interpreting those interactions. Notably, by proposing interpretable graph-based deep learning, they elucidated the cancer microenvironment that influences patient survival rates. This work helps improve the accuracy of real-world medical professionals' diagnoses by shedding light on the intricate interactions within the cancer microenvironment.
Through collaboration with Seoul National University Hospital, the research team created an AI capable of predicting cancer patients' survival rates. They showed that vascular formation within cancerous tissue and the interactions among cancer cells and immune cells can serve as diagnostic indicators for survival rates when interpreted by AI. To achieve this, they first drew the entire network map using groups of cells (clusters) with similar shapes and close distances within cancer tissue images, each cluster forming a node. Next, using patients' known 5-year survival rates as labels, they trained the AI on network patterns, so that it could identify networks associated with lower survival rates. Finally, by dividing tissue images into 100,000 small patches and clustering nodes with low survival-rate values using AI, the researchers were able to visualize the cancer cell network.
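The first step above, grouping nearby cells into clusters that become graph nodes, can be sketched as follows. This greedy single-link grouping by distance is an assumption made for illustration only; the team's actual clustering also considers cell shape and operates on far larger images.

```python
import math

def cluster_cells(points, link_dist=2.0):
    """Greedily group 2D points: a point joins the first existing
    cluster containing any member within link_dist, else it starts
    a new cluster. Each resulting cluster would become one graph node."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.hypot(p[0] - q[0], p[1] - q[1]) <= link_dist
                   for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

# Hypothetical cell centroids: three close together, two far away.
cells = [(0.0, 0.0), (1.0, 0.0), (0.5, 1.0), (8.0, 8.0), (9.0, 8.0)]
clusters = cluster_cells(cells)
print(len(clusters))  # → 2
```

Each cluster would then be a node in the network, labeled with patient survival data for training, mirroring the cluster-then-learn pipeline described above.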
Dr. Yong-Joo Lee, the first author of the paper, together with Ph.D. candidate Kyung-Seop Shin, said, "The methodology developed for creating cancer cell networks and the graph deep learning technology are groundbreaking approaches applicable not only to cancer tissues but also to various medical imaging data such as MRI and X-rays. These methods will help illuminate crucial interactions in medical imaging data from diverse domains." Professors Jung-Hwan Park and So-Hee Oh of Seoul National University Boramae Medical Center, co-first authors of the study, added, "While interpretable deep learning models for medical professionals have been proposed before, our research is the first to suggest diagnostic indicators that incorporate intricate intercellular interactions. Microenvironments within cancerous tissue, including intercellular interactions, play a significant role in assessing cancer risk. A model capable of presenting such interactions is expected to contribute significantly to the discovery of new diagnostic indicators."
This study has been published in the internationally renowned academic journal 'Nature Biomedical Engineering.'

 

Source: https://ee.snu.ac.kr/community/news?bm=v&bbsidx=53555&page=3

Translated by: Do-Hyung Kim, English Editor of the Department of Electrical and Computer Engineering, kimdohyung@snu.ac.kr