DeepACLSTM: deep asymmetric convolutional long short-term memory neural models for protein secondary structure prediction. 2019

Yanbu Guo, Weihua Li, Bingyi Wang, Huiqing Liu, Dongming Zhou
School of Information Science and Engineering, Yunnan University, Kunming, 650091, China.

BACKGROUND: Protein secondary structure (PSS) is critical for predicting tertiary structure, understanding protein function, and designing drugs. However, experimental techniques for determining PSS are time consuming and expensive, so efficient computational approaches that predict PSS from sequence information alone are urgently needed. Moreover, the feature matrix of a protein has two dimensions: the amino-acid residue dimension and the feature vector dimension. Existing deep learning methods have achieved remarkable performance in PSS prediction, but they typically exploit only features along the amino-acid residue dimension. Thus, there is still room to improve computational methods for PSS prediction.

RESULTS: We propose a novel deep neural network method, called DeepACLSTM, to predict 8-category PSS from protein sequence features and profile features. Our method combines asymmetric convolutional neural networks (ACNNs) with bidirectional long short-term memory (BLSTM) neural networks, leveraging the feature vector dimension of the protein feature matrix. In DeepACLSTM, the ACNNs extract complex local contexts of amino acids, while the BLSTM networks capture long-distance interdependencies between amino acids. A prediction module then assigns a category to each amino-acid residue based on both the local contexts and the long-distance interdependencies. To evaluate DeepACLSTM, we conduct experiments on three publicly available datasets: CB513, CASP10 and CASP12. Results indicate that our method outperforms state-of-the-art baselines on all three datasets.

CONCLUSIONS: Experiments demonstrate that DeepACLSTM is an efficient method for predicting 8-category PSS and can extract more complex sequence-structure relationships between amino-acid residues. They also indicate that the feature vector dimension contains useful information for improving PSS prediction.
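The abstract describes a pipeline of asymmetric convolutions over the residue-by-feature matrix, followed by a BLSTM and a per-residue 8-way classifier. The sketch below illustrates that pipeline in Keras; it is a minimal illustration, not the authors' exact configuration. The sequence length (700), the 42 input features per residue, the kernel sizes, filter counts, and the 256-unit BLSTM are all assumed values chosen for illustration.

import tensorflow as tf
from tensorflow.keras import layers, models

SEQ_LEN, N_FEAT, N_CLASSES = 700, 42, 8  # assumed: padded length, sequence+profile features, 8 PSS states

inputs = layers.Input(shape=(SEQ_LEN, N_FEAT))     # residue x feature matrix
x = layers.Reshape((SEQ_LEN, N_FEAT, 1))(inputs)   # view it as a 2-D map with one channel

# Asymmetric convolutions: a 1 x k kernel scans along the feature-vector dimension,
# and a k x 1 kernel scans along the amino-acid residue dimension, instead of a
# single square k x k kernel (kernel sizes here are illustrative).
x = layers.Conv2D(64, kernel_size=(1, 3), padding="same", activation="relu")(x)
x = layers.Conv2D(64, kernel_size=(3, 1), padding="same", activation="relu")(x)

# Collapse the feature axis so each residue carries one local-context vector.
x = layers.Reshape((SEQ_LEN, 64 * N_FEAT))(x)

# Bidirectional LSTM captures long-distance interdependencies between residues.
x = layers.Bidirectional(layers.LSTM(256, return_sequences=True))(x)

# Prediction module: an 8-category softmax applied independently at each residue.
outputs = layers.TimeDistributed(layers.Dense(N_CLASSES, activation="softmax"))(x)

model = models.Model(inputs, outputs)
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()

The per-residue softmax is the reason the BLSTM must return the full sequence (return_sequences=True): secondary structure prediction assigns one of the 8 states to every position, not a single label to the whole protein.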

MeSH Terms (UI, term, description, entry terms)

D008962 Models, Theoretical
Theoretical representations that simulate the behavior or activity of systems, processes, or phenomena. They include the use of mathematical equations, computers, and other electronic equipment.
Entry terms: Experimental Model; Experimental Models; Mathematical Model; Model, Experimental; Models (Theoretical); Models, Experimental; Models, Theoretic; Theoretical Study; Mathematical Models; Model (Theoretical); Model, Mathematical; Model, Theoretical; Models, Mathematical; Studies, Theoretical; Study, Theoretical; Theoretical Model; Theoretical Models; Theoretical Studies

D011506 Proteins
Linear POLYPEPTIDES that are synthesized on RIBOSOMES and may be further modified, crosslinked, cleaved, or assembled into complex proteins with several subunits. The specific sequence of AMINO ACIDS determines the shape the polypeptide will take, during PROTEIN FOLDING, and the function of the protein.
Entry terms: Gene Products, Protein; Gene Proteins; Protein; Protein Gene Products; Proteins, Gene

D000072417 Protein Domains
Discrete protein structural units that may fold independently of the rest of the protein and have their own functions.
Entry terms: Peptide Domain; Protein Domain; Domain, Peptide; Domain, Protein; Domains, Peptide; Domains, Protein; Peptide Domains

D000077321 Deep Learning
Supervised or unsupervised machine learning methods that use multiple layers of data representations generated by nonlinear transformations, instead of individual task-specific ALGORITHMS, to build and train neural network models.
Entry terms: Hierarchical Learning; Learning, Deep; Learning, Hierarchical

D000465 Algorithms
A procedure consisting of a sequence of algebraic formulas and/or logical steps to calculate or determine a given task.
Entry terms: Algorithm

D016571 Neural Networks, Computer
A computer architecture, implementable in either hardware or software, modeled after biological neural networks. Like the biological system in which the processing capability is a result of the interconnection strengths between arrays of nonlinear processing nodes, computerized neural networks, often called perceptrons or multilayer connectionist models, consist of neuron-like units. A homogeneous group of units makes up a layer. These networks are good at pattern recognition. They are adaptive, performing tasks by example, and thus are better for decision-making than are linear learning machines or cluster analysis. They do not require explicit programming.
Entry terms: Computational Neural Networks; Connectionist Models; Models, Neural Network; Neural Network Models; Neural Networks (Computer); Perceptrons; Computational Neural Network; Computer Neural Network; Computer Neural Networks; Connectionist Model; Model, Connectionist; Model, Neural Network; Models, Connectionist; Network Model, Neural; Network Models, Neural; Network, Computational Neural; Network, Computer Neural; Network, Neural (Computer); Networks, Computational Neural; Networks, Computer Neural; Networks, Neural (Computer); Neural Network (Computer); Neural Network Model; Neural Network, Computational; Neural Network, Computer; Neural Networks, Computational; Perceptron

D017433 Protein Structure, Secondary
The level of protein structure in which regular hydrogen-bond interactions within contiguous stretches of polypeptide chain give rise to ALPHA-HELICES; BETA-STRANDS (which align to form BETA-SHEETS), or other types of coils. This is the first folding level of protein conformation.
Entry terms: Secondary Protein Structure; Protein Structures, Secondary; Secondary Protein Structures; Structure, Secondary Protein; Structures, Secondary Protein
