CLOP-hERG: The Contrastive Learning Optimized Pre-trained Model for Representation Learning in Predicting Drug-Induced hERG Channel Blockers
DOI:
https://doi.org/10.47852/bonviewMEDIN42022049
Keywords:
hERG, contrastive learning, RoBERTa, representation learning
Abstract
During drug development, ensuring that drug molecules do not block the hERG (human Ether-à-go-go-Related Gene) channel is critical: blockade of this channel can cause serious cardiovascular disorders. However, traditional experimental detection methods are expensive and time-consuming. In this work, we proposed CLOP-hERG, a novel deep learning framework that combines contrastive learning with the RoBERTa pre-trained model to predict whether drug molecules will block the hERG channel. We employed data augmentation techniques on molecular structures so that the model can capture the multifaceted information of the molecules. In addition, we used a contrastive learning strategy that enables the model to learn meaningful molecular features from large unlabeled datasets. The RoBERTa pre-trained model played a pivotal role in this process, giving our model a robust representation learning capability. The model obtained through contrastive learning was then fine-tuned for high-precision prediction of hERG channel blockers. Through a series of experiments, we demonstrated the effectiveness of CLOP-hERG. This work provides a novel and effective strategy for predicting hERG blockers and offers insights for other similar pharmaceutical tasks.
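The abstract does not specify the contrastive objective used during pre-training. As a purely illustrative sketch, assuming the widely used SimCLR-style NT-Xent loss (a common choice when contrasting two augmented views of the same molecule, e.g. two randomized SMILES of one compound), the core computation might look like:

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """SimCLR-style NT-Xent contrastive loss (illustrative, not the paper's exact objective).

    z1, z2: (N, d) embeddings of two augmented views of the same N molecules.
    Each embedding's positive is its counterpart view; all other 2N - 2
    embeddings in the batch act as negatives.
    """
    z = np.concatenate([z1, z2], axis=0)                 # (2N, d)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)     # unit-normalize rows
    sim = z @ z.T / temperature                          # scaled cosine similarities
    n = z1.shape[0]
    np.fill_diagonal(sim, -np.inf)                       # exclude self-similarity
    # The positive partner of row i is row i+N (and vice versa).
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(0, n)])
    logsumexp = np.log(np.exp(sim).sum(axis=1))
    loss = -(sim[np.arange(2 * n), pos] - logsumexp)     # cross-entropy per row
    return loss.mean()
```

Under this objective, embeddings of two views of the same molecule are pulled together while all other molecules in the batch are pushed apart; the loss therefore drops when paired views are nearly identical and stays near the random baseline otherwise.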
Received: 13 November 2023 | Revised: 21 December 2023 | Accepted: 29 December 2023
Conflicts of Interest
The authors declare that they have no conflicts of interest to this work.
Data Availability Statement
The CLOP-hERG data that support the findings of this study are openly available at https://github.com/heshida01/CLOP-hERG. The Therapeutics Data Commons data that support the findings of this study are openly available at https://tdcommons.ai/, reference number [21].
License
Copyright (c) 2024 Authors
This work is licensed under a Creative Commons Attribution 4.0 International License.
Funding data
- Japan Society for the Promotion of Science, grant numbers JP23H03411 and JP22K12144
- Japan Science and Technology Corporation, grant number JPMJPF2017
- Japan Science and Technology Agency, grant number JPMJSP2124