
The use of raw amino acid sequences as input to deep learning models for protein function prediction has gained popularity in recent years. This scheme requires handling proteins of different lengths, while deep learning models require same-shape input. To accomplish this, zeros are usually added to each sequence up to an established common length in a process called zero-padding. However, the effect of different padding strategies on model performance and data structure is still unknown. We propose and implement four novel ways of padding amino acid sequences. We then analysed the impact of these padding strategies on a hierarchical Enzyme Commission number prediction problem. Results show that padding affects model performance even when convolutional layers are involved. In contrast to most deep learning work, which focuses mainly on architectures, this study highlights the relevance of the often-overlooked padding step and raises awareness of the need to refine it for better performance. The code of this analysis is publicly available at https://github.com/b2slab/padding_benchmark .
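As a minimal sketch of the zero-padding process described above: each amino acid is mapped to an integer index (with 0 reserved for padding), and zeros are added until every sequence reaches a common length. The abstract does not detail the four proposed strategies, so the "pre" and "post" modes below are illustrative only, and the `encode`/`pad` helpers are hypothetical names, not code from the paper's repository.

```python
# Illustrative zero-padding of integer-encoded amino acid sequences.
# "pre"/"post" are example strategies; the paper's four variants may differ.

def encode(seq, alphabet="ACDEFGHIKLMNPQRSTVWY"):
    """Map each amino acid to a 1-based integer index (0 is reserved for padding)."""
    return [alphabet.index(aa) + 1 for aa in seq]

def pad(encoded, length, mode="post"):
    """Pad an integer-encoded sequence with zeros up to `length`."""
    zeros = [0] * (length - len(encoded))
    return encoded + zeros if mode == "post" else zeros + encoded

seqs = ["MKT", "MKTAY"]
max_len = max(len(s) for s in seqs)
batch = [pad(encode(s), max_len) for s in seqs]
# All rows now share the same length, as required by a fixed-shape model input.
```

The choice of where the zeros go (before, after, or elsewhere in the sequence) is exactly the kind of strategy whose effect on model performance the study evaluates.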

Citation

Angela Lopez-Del Rio, Maria Martin, Alexandre Perera-Lluna, Rabie Saidi. Effect of sequence padding on the performance of deep learning models in archaeal protein functional prediction. Scientific Reports. 2020 Sep 3;10(1):14634.


PMID: 32884053
