Abstract
We present a framework for discriminative sequence classification in which linear classifiers work directly in the explicit high-dimensional predictor space of all subsequences in the training set (as opposed to kernel-induced spaces). This is made feasible by a gradient-bounded coordinate-descent algorithm that efficiently selects discriminative subsequences without expanding the whole space. Our framework applies to a wide range of loss functions, including the binomial log-likelihood loss of logistic regression and the squared hinge loss of support vector machines. When applied to protein remote homology detection and remote fold recognition, our framework achieves performance comparable to the state of the art (e.g., kernel support vector machines). In contrast to state-of-the-art sequence classifiers, our models are simply lists of weighted discriminative subsequences and can thus be interpreted and related to the biological problem, a crucial requirement for the bioinformatics and medical communities.
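The core idea can be sketched as greedy coordinate descent for logistic regression over explicit substring-indicator features. This is an illustrative toy only: the paper's gradient-bounded search prunes the subsequence space, whereas this sketch enumerates it outright, and all function names here are hypothetical.

```python
import math

def substrings(s, max_len):
    # All contiguous subsequences (substrings) of s up to length max_len.
    return {s[i:i + l] for l in range(1, max_len + 1)
            for i in range(len(s) - l + 1)}

def greedy_cd(seqs, labels, max_len=3, iters=20, step=0.5):
    """Toy greedy coordinate descent on the binomial log-likelihood
    (logistic) loss over an explicitly enumerated substring feature
    space. Labels are 0/1. Returns only features with nonzero weight,
    i.e., a list of weighted discriminative subsequences."""
    feats = sorted(set().union(*(substrings(s, max_len) for s in seqs)))
    X = [[1.0 if f in s else 0.0 for f in feats] for s in seqs]
    w = [0.0] * len(feats)
    for _ in range(iters):
        # Predicted probabilities under the current linear model.
        margins = [sum(wj * xj for wj, xj in zip(w, x)) for x in X]
        p = [1.0 / (1.0 + math.exp(-m)) for m in margins]
        # Gradient of the logistic loss for every coordinate.
        grad = [sum((pi - y) * x[j] for pi, y, x in zip(p, labels, X))
                for j in range(len(feats))]
        # Greedy step: update only the coordinate with the largest
        # absolute gradient (the most discriminative subsequence).
        j = max(range(len(feats)), key=lambda j: abs(grad[j]))
        w[j] -= step * grad[j]
    return {f: wj for f, wj in zip(feats, w) if wj != 0.0}
```

For example, `greedy_cd(["abc", "abd", "xyz", "xyw"], [1, 1, 0, 0])` yields a readable model: positive weights on substrings found in the positive class and negative weights on those found in the negatives. The resulting weight dictionary is exactly the kind of interpretable subsequence list the abstract describes.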
Original language | English |
---|---|
Title of host publication | Proceedings of the 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD'11 |
Number of pages | 9 |
Publication date | 16 Sept 2011 |
Pages | 708-716 |
ISBN (Print) | 9781450308137 |
DOIs | |
Publication status | Published - 16 Sept 2011 |
Externally published | Yes |
Event | 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD'11, San Diego, CA, United States; 21 Aug 2011 → 24 Aug 2011 |
Conference
Conference | 17th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD'11 |
---|---|
Country/Territory | United States |
City | San Diego, CA |
Period | 21/08/2011 → 24/08/2011 |
Sponsor | ACM Spec. Interest Group Knowl. Discov. Data (SIGKDD), ACM SIGMOD |
Keywords
- Greedy coordinate-descent
- Logistic regression
- Sequence classification
- String classification
- Support vector machines