k-nearest neighbour using ensemble clustering based on feature selection approach to learning relational data



Bibliographic Details
Main Authors: Alfred, Rayner, Shin, Kung Ke, Sainin, Mohd Shamrie, On, Chin Kim, Pandiyan, Paulraj Murugesa, Ag Ibrahim, Ag Asri
Other Authors: Akagi, Masato
Format: Book Section
Published: Springer International Publishing 2016
Subjects:
Online Access: http://repo.uum.edu.my/20483/
http://doi.org/10.1007/978-3-319-49073-1_35
Item Description
Summary: Due to the growing amount of data generated and stored in relational databases, relational learning has attracted the interest of researchers in recent years. Many approaches have been developed to learn relational data. One such approach is Dynamic Aggregation of Relational Attributes (DARA), an algorithm designed to summarize relational data with one-to-many relations. However, DARA suffers from a major drawback when attribute cardinalities are very high, because the size of the vector space representation depends on the number of unique values that exist across all attributes in the dataset. A feature selection process can be introduced to overcome this problem, and the selected features can be further optimized to achieve a good classification result. Several clustering runs can also be performed for different values of k to yield an ensemble of clustering results. This paper proposes a two-layered genetic algorithm-based feature selection method to improve the classification performance of learning from relational databases using a k-NN ensemble classifier. The proposed method omits less relevant features while retaining the diversity of the classifiers, so as to improve the performance of the k-NN ensemble. The results show that the proposed k-NN ensemble improves on the performance of traditional k-NN classifiers.
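The sketch below illustrates the general idea of a k-NN ensemble whose members are trained on different feature subsets of a vector-space representation, which is the kind of classifier diversity the abstract describes. It is not the authors' implementation: the data, the feature-subset sizes, and the random subset choice (standing in for the paper's two-layered genetic algorithm-based feature selection) are all illustrative assumptions.

```python
# Minimal sketch of a feature-subset k-NN ensemble with majority voting.
# Assumptions: toy random data in place of a DARA vector-space
# representation, and random feature subsets in place of GA-selected ones.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Toy stand-in for a DARA-style vector-space representation of records.
X_train = rng.random((100, 50))
y_train = rng.integers(0, 2, 100)
X_test = rng.random((20, 50))

n_members = 5      # number of k-NN classifiers in the ensemble
subset_size = 20   # features kept per member (hypothetical value)

members = []
for _ in range(n_members):
    # Hypothetical feature subset; the paper selects features with a
    # two-layered genetic algorithm rather than at random.
    features = rng.choice(X_train.shape[1], size=subset_size, replace=False)
    clf = KNeighborsClassifier(n_neighbors=3).fit(X_train[:, features], y_train)
    members.append((features, clf))

# Combine the members' predictions by simple majority vote.
votes = np.array([clf.predict(X_test[:, feats]) for feats, clf in members])
y_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print(y_pred)
```

Training each member on a different feature subset is what gives the ensemble its diversity; the paper's contribution is in choosing those subsets so that less relevant features are dropped without collapsing that diversity.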