
Feature Selection for Global Redundancy Minimization Using Regularized Trees

Journal: International Journal of Science and Research (IJSR) (Vol.5, No. 2)

Publication Date:

Authors : ; ;

Page : 998-1002

Keywords : Feature Selection; Feature Ranking; Redundancy Minimization; Regularized Trees

Source : Download | Find it from : Google Scholar

Abstract

Feature selection has been a critical research topic in data mining, because real-world data sets often have high-dimensional features, as in bioinformatics and text mining applications. Many existing filter feature selection methods rank features by optimizing certain feature ranking criteria, so that correlated features often receive similar rankings. These correlated features are redundant and contribute little additional mutual information to the mining task. Therefore, when we select a limited number of features, we aim to choose the top non-redundant features so that the useful mutual information is maximized. In earlier work, Ding et al. recognized this important issue and proposed the minimum Redundancy Maximum Relevance feature selection (mRMR) model to minimize the redundancy between sequentially selected features. However, that method uses a greedy search, so global feature redundancy is not considered and the results are not optimal. In this paper, we propose a new feature selection framework that globally minimizes feature redundancy while maximizing the given feature ranking scores, which can come from any supervised or unsupervised method. Our new model has no parameters, which makes it especially suitable for practical data mining applications. We also propose a tree regularization framework that enables many tree models to perform feature selection efficiently. The key idea of the regularization framework is to penalize selecting a new feature for splitting when its gain (e.g., information gain) is similar to that of the features used in previous splits. The regularization framework is applied here to random forests and boosted trees, and can easily be applied to other tree models. Experimental studies show that the regularized trees can select high-quality feature subsets with respect to both strong and weak classifiers.
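The greedy mRMR scheme the abstract refers to can be illustrated with a minimal sketch: at each step, add the feature whose relevance to the target, minus its mean redundancy with the already-selected features, is largest. Function names and the use of empirical mutual information over discrete values are illustrative here, not taken from the paper.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete sequences."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    return sum(
        (c / n) * math.log((c / n) / ((px[a] / n) * (py[b] / n)))
        for (a, b), c in pxy.items()
    )

def mrmr(features, target, k):
    """Greedy mRMR: repeatedly add the feature with the best
    relevance-minus-mean-redundancy score."""
    selected, remaining = [], set(features)
    while remaining and len(selected) < k:
        def score(f):
            relevance = mutual_information(features[f], target)
            if not selected:
                return relevance
            redundancy = sum(
                mutual_information(features[f], features[s]) for s in selected
            ) / len(selected)
            return relevance - redundancy
        best = max(sorted(remaining), key=score)  # sorted() makes ties deterministic
        selected.append(best)
        remaining.remove(best)
    return selected
```

On a toy data set where feature "b" is an exact duplicate of feature "a", the duplicate's redundancy term cancels its relevance once "a" is selected, so a weaker but independent feature is chosen instead.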
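The abstract does not state the paper's global optimization model, but the difference between greedy and global redundancy minimization can be illustrated with a toy exhaustive search: score every size-k subset by its total ranking score minus its total pairwise redundancy. This brute force is for illustration only (it is exponential in k) and the scoring form is an assumption, not the paper's formulation.

```python
from itertools import combinations

def global_select(scores, redundancy, k):
    """Pick the size-k subset maximizing total ranking score minus
    total pairwise redundancy (exhaustive search; illustration only).

    scores: dict feature -> ranking score
    redundancy: dict frozenset({f1, f2}) -> pairwise redundancy
    """
    best, best_val = None, float("-inf")
    for subset in combinations(sorted(scores), k):
        val = sum(scores[f] for f in subset)
        val -= sum(redundancy[frozenset(p)] for p in combinations(subset, 2))
        if val > best_val:
            best, best_val = subset, val
    return set(best)
```

With two highly ranked but mutually redundant features, ranking by score alone keeps both, while the global objective trades one of them for a lower-ranked, non-redundant feature.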
Since tree models naturally handle categorical and numerical variables, missing values, different scales between variables, interactions, nonlinearities, and so on, the tree regularization framework provides an effective and efficient feature selection solution for many practical problems.
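The gain-penalty rule described in the abstract can be sketched as follows: when choosing a split feature at a node, a feature not used in any earlier split only receives a fraction of its raw gain, so a new feature enters the model only when its gain substantially exceeds that of the features already in use. The penalty value 0.7 and the function names are illustrative assumptions, not taken from the paper.

```python
def penalized_gain(feature, gain, used, penalty=0.7):
    """Regularized-tree rule: discount the gain of any feature
    that has not been used in a previous split."""
    return gain if feature in used else penalty * gain

def select_split(gains, used, penalty=0.7):
    """Choose the split feature at a node under the gain penalty.

    gains: dict feature -> raw gain (e.g., information gain) at this node
    used:  set of features already used in earlier splits
    """
    return max(sorted(gains), key=lambda f: penalized_gain(f, gains[f], used, penalty))
```

For example, if "x1" was used in an earlier split, a node where "x2" has raw gain 0.38 and "x1" only 0.30 still reuses "x1" (0.38 * 0.7 = 0.266 < 0.30); "x2" is only added once its gain advantage is large enough to outweigh the penalty.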

Last modified: 2021-07-01 14:31:22