BetaTPred2: Prediction of β-turns in proteins using neural networks and multiple alignment

Bioinformatics Centre, IMTECH, Chandigarh


The BetaTPred2 server predicts β-turns in proteins from a given amino acid sequence using a neural network and multiple alignment. For β-turn prediction it uses position-specific scoring matrices generated by PSI-BLAST and secondary structure predicted by PSIPRED. The network was trained and tested on a set of 426 non-homologous protein chains with 7-fold cross-validation, and predicts β-turns with a residue accuracy of 75.5% and an MCC of 0.43.
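The two performance measures quoted above are standard for binary turn/non-turn prediction and can be computed from the four outcome counts. A minimal sketch (the formulas are the standard definitions, not code from the server):

```python
import math

def accuracy(tp, tn, fp, fn):
    """Residue accuracy: fraction of residues correctly labelled turn or non-turn."""
    return (tp + tn) / (tp + tn + fp + fn)

def mcc(tp, tn, fp, fn):
    """Matthews correlation coefficient; ranges from -1 to +1, with 0 for a
    random predictor, so it is less biased than accuracy when turns are rare."""
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return (tp * tn - fp * fn) / denom if denom else 0.0
```

MCC is preferred alongside accuracy because only about a quarter of residues are turns, so always predicting "non-turn" already scores high on accuracy alone.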

Two feed-forward neural networks, each with a single hidden layer, are used. The first (sequence-to-structure) network is trained on the multiple alignment in the form of PSI-BLAST-generated position-specific scoring matrices. The initial predictions from the first network, together with PSIPRED-predicted secondary structure, are fed into the second (structure-to-structure) network, which refines the predictions of the first. The learning algorithm is standard backpropagation, and a linear activation function is used.
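The two-stage pipeline can be sketched with NumPy as below. The window size, hidden-layer width, activations, and random (untrained) weights are illustrative assumptions, not the server's actual parameters; the point is the data flow: PSSM windows into net 1, then net-1 scores concatenated with PSIPRED states into net 2.

```python
import numpy as np

WIN = 9   # sliding-window size around each residue (assumed, for illustration)
HID = 10  # hidden units per network (assumed)

rng = np.random.default_rng(0)

def layer(n_in, n_out):
    # random untrained weights; in the real server these come from backpropagation
    return rng.normal(0.0, 0.1, (n_in, n_out)), np.zeros(n_out)

def forward(x, params):
    # single hidden layer; activations here are illustrative choices
    (w1, b1), (w2, b2) = params
    h = np.tanh(x @ w1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))   # per-residue turn score in (0, 1)

def windows(feats, win=WIN):
    # zero-pad the ends and gather one flattened window per residue
    pad = np.zeros((win // 2, feats.shape[1]))
    padded = np.vstack([pad, feats, pad])
    return np.array([padded[i:i + win].ravel() for i in range(len(feats))])

# net 1: window of PSSM columns (20 PSI-BLAST scores per residue)
net1 = (layer(WIN * 20, HID), layer(HID, 1))
# net 2: window of net-1 scores plus 3-state PSIPRED predictions (1 + 3 = 4 per residue)
net2 = (layer(WIN * 4, HID), layer(HID, 1))

L = 30                              # toy sequence length
pssm = rng.normal(size=(L, 20))     # stand-in for a PSI-BLAST PSSM
ss = rng.random((L, 3))             # stand-in for PSIPRED helix/strand/coil scores

p1 = forward(windows(pssm), net1)                   # initial turn scores
p2 = forward(windows(np.hstack([p1, ss])), net2)    # refined turn scores
pred = np.where(p2.ravel() > 0.5, "t", "n")         # final turn/non-turn calls
```

With trained weights, thresholding the refined scores yields the 't'/'n' string the server reports.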

The input is an amino acid sequence in single-letter code, in either FASTA or free format. Residues in the query sequence predicted as turns are marked 't' and non-turn residues are marked 'n'. The PSIPRED-predicted secondary structure is displayed alongside the predicted turn/non-turn results.
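A small sketch of how such aligned output can be assembled; the 60-column block layout and the helper name are assumptions for illustration, not the server's exact formatting:

```python
def format_prediction(seq, turns, ss, width=60):
    """Interleave the query sequence, the 't'/'n' turn calls, and the PSIPRED
    secondary-structure string into aligned blocks of `width` columns."""
    assert len(seq) == len(turns) == len(ss)
    lines = []
    for i in range(0, len(seq), width):
        lines += [seq[i:i + width], turns[i:i + width], ss[i:i + width], ""]
    return "\n".join(lines)

# toy example: a 10-residue fragment with hypothetical predictions
out = format_prediction("MKTAYIAKQR", "nnnttttnnn", "CCEEECCCHH")
```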

Please cite the following paper if you are using BetaTPred2 for your research:

Kaur, H. and Raghava, G.P.S. (2003) Prediction of beta-turns in proteins from multiple alignment using neural network. Protein Science, 12:627-634.

[Home] [Submit your sequence] [About server] [Help] [References] [Who are we?]