Document Type

Conference Proceeding

Publication Date

2000

Abstract

The concept of jumping emerging patterns (JEPs) has been proposed to describe discriminating features that occur in the positive training instances but never in the negative class; JEPs have been used to construct classifiers that generally achieve better accuracy than state-of-the-art classifiers such as C4.5. This paper presents algorithms for maintaining the space of jumping emerging patterns (JEP space). We prove that JEP spaces satisfy the property of convexity, so a JEP space can be concisely represented by two bounds, consisting respectively of its most general elements and its most specific elements. In response to the insertion of new training instances, a JEP space is modified by operating on its boundary elements and on the boundary elements of the JEP spaces associated with the new instances. This strategy avoids rebuilding the JEP space from scratch. In addition, our maintenance algorithms handle related cases such as deletion of instances, insertion of new attributes, and deletion of attributes.
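To make the ideas in the abstract concrete, the following minimal sketch (not from the paper; the function names `jep_space` and `borders` and the brute-force strategy are illustrative assumptions, not the paper's maintenance algorithms) enumerates the JEPs of a tiny dataset and extracts the two bounds of the convex JEP space: the most general (minimal) and most specific (maximal) elements.

```python
from itertools import combinations

def jep_space(positives, negatives):
    """Brute-force enumeration (for illustration only): collect every itemset
    that is contained in at least one positive instance but in no negative
    instance -- the defining property of a jumping emerging pattern."""
    items = set().union(*positives)
    jeps = []
    for r in range(1, len(items) + 1):
        for cand in combinations(sorted(items), r):
            s = frozenset(cand)
            in_pos = any(s <= p for p in positives)   # occurs in the positive class
            in_neg = any(s <= n for n in negatives)   # occurs in the negative class
            if in_pos and not in_neg:
                jeps.append(s)
    return jeps

def borders(jeps):
    """Return the two bounds of the convex JEP space: the minimal (most
    general) elements and the maximal (most specific) elements."""
    left = [s for s in jeps if not any(t < s for t in jeps)]
    right = [s for s in jeps if not any(s < t for t in jeps)]
    return left, right

positives = [{'a', 'b', 'c'}, {'b', 'c', 'd'}]
negatives = [{'a', 'd'}, {'b', 'd'}]
left, right = borders(jep_space(positives, negatives))
# Convexity means every itemset lying between a left-bound element and a
# right-bound element that contains it is itself a JEP, so the two bounds
# represent the whole space.
```

On this toy dataset the left bound is {c} and {a, b}, and the right bound is the two positive instances themselves; any itemset sandwiched between them (e.g. {a, c}) is also a JEP, which is what the border representation exploits.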

Comments

Presented at the Seventeenth International Conference on Machine Learning (ICML), Stanford, CA, June 29-July 2, 2000.
