A Survey on Properties of Adaptive Boosting with Different Classifiers
Journal: International Journal of Science and Research (IJSR), Vol. 6, No. 4
Publication Date: 2017-04-05
Authors: Sumayya P; Shahad P
Pages: 1514-1516
Keywords: AdaBoost; AdaBoostSVM; RBFSVM; Naive Bayes; Decision trees; Decision stump
Abstract
AdaBoost, or Adaptive Boosting, is a boosting technique that has recently come into wide use in machine learning. In general, boosting is used to improve the performance of various learning algorithms and is mainly applied in conjunction with different classifiers. This paper surveys the properties exhibited by AdaBoost with different base classifiers, namely support vector machines, naive Bayes, decision trees, and decision stumps. Boosting these classifiers with AdaBoost yields differences in properties such as accuracy, error rate, and overall performance. Adaptive boosting is widely used in various fields of data mining and image recognition. Nowadays it is also helpful in the prediction and diagnosis of particular disease patterns with a high degree of accuracy, by converting a weak learner into a strong learner.
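To make the weak-to-strong conversion mentioned in the abstract concrete, the sketch below shows discrete AdaBoost for binary labels in {-1, +1}, using a depth-1 decision tree (a decision stump, one of the base classifiers surveyed) as the weak learner. This is an illustrative sketch, not code from the paper; the synthetic dataset, the number of rounds T, and the helper names adaboost_fit and adaboost_predict are assumptions.

# Minimal discrete AdaBoost sketch with a decision stump as the weak learner.
# Dataset, round count T, and function names are illustrative assumptions.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.datasets import make_classification

def adaboost_fit(X, y, T=50):
    """Train T decision stumps on reweighted samples; return (stumps, alphas)."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with uniform sample weights
    stumps, alphas = [], []
    for _ in range(T):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err >= 0.5:                   # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified samples
        w /= w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def adaboost_predict(X, stumps, alphas):
    """Weighted vote of the weak learners."""
    agg = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(agg)

X, y = make_classification(n_samples=500, random_state=0)
y = 2 * y - 1                            # map {0, 1} labels to {-1, +1}
stumps, alphas = adaboost_fit(X, y)
print("training accuracy:", np.mean(adaboost_predict(X, stumps, alphas) == y))

Swapping the stump for a deeper decision tree, a naive Bayes model, or an SVM with an RBF kernel (as in AdaBoostSVM/RBFSVM) only changes the weak learner passed to the boosting loop, which is precisely the comparison the survey examines.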