
METHOD FOR HYPERPARAMETER TUNING IN MACHINE LEARNING TASKS FOR STOCHASTIC OBJECTS CLASSIFICATION

Journal: Scientific and Technical Journal of Information Technologies, Mechanics and Optics (Vol.20, No. 5)

Pages : 667-676

Keywords : hyperparameter tuning; machine learning; multiclass gradient boosting classifier; multiclass SVM classifier; SV-regression; gradient boosting regression; Nelder-Mead method;


Abstract

Subject of Research. The paper presents a simple and practically effective solution to hyperparameter tuning in classification problems solved by machine learning methods. The proposed method is applicable to any real-valued hyperparameters whose values lie within a known parametric compact. Method. A random sample (trial grid) of small size is generated within the parametric compact, and the efficiency of hyperparameter tuning is computed for each of its elements according to a special criterion. The efficiency is estimated by a real scalar value that does not depend on the classification threshold. A regression sample is thus formed: its regressors are random sets of hyperparameters from the parametric compact, and its regression values are the corresponding values of the classification efficiency indicator. A nonparametric approximation of this regression is constructed from the formed data set. At the next stage, the minimum of the constructed approximation of the regression function over the parametric compact is found by the Nelder-Mead optimization method. The arguments of this minimum constitute an approximate solution to the problem. Main Results. Unlike traditional approaches, the proposed approach is based on a nonparametric approximation of the regression function linking a set of hyperparameters to the value of the classification efficiency index. Particular attention is paid to the choice of the classification quality criterion. The use of this type of approximation makes it possible to study the behavior of the performance indicator away from the trial grid values ("between" its nodes). Experiments carried out on various databases show that the proposed approach provides a significant increase in the efficiency of hyperparameter tuning compared with the baseline variants, while maintaining acceptable performance even for small trial grid sizes.
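The pipeline described above (trial grid → criterion values → nonparametric surrogate → Nelder-Mead minimization) can be sketched in a few lines. This is an illustrative sketch, not the authors' implementation: the quadratic `loss` is a toy stand-in for a real threshold-independent classification criterion (e.g., 1 − ROC AUC of a classifier trained with hyperparameters `x`, estimated by cross-validation), and the Nadaraya-Watson kernel regression, its bandwidth `h`, and the grid size of 30 are assumed choices.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Toy stand-in for the classification-quality criterion (lower is better).
# In the paper's setting this would be a threshold-independent metric of a
# classifier trained with hyperparameters x.
def loss(x):
    return (x[0] - 0.3) ** 2 + (x[1] - 0.7) ** 2

# 1. Trial grid: a small random sample inside the parametric compact [0, 1]^2.
X = rng.uniform(0.0, 1.0, size=(30, 2))
y = np.array([loss(x) for x in X])

# 2. Nonparametric approximation of the regression
#    "hyperparameters -> criterion value" (Nadaraya-Watson kernel smoother;
#    an assumed choice of nonparametric estimator).
def surrogate(x, h=0.15):
    w = np.exp(-np.sum((X - x) ** 2, axis=1) / (2.0 * h ** 2))
    return float(np.dot(w, y) / (w.sum() + 1e-12))

# 3. Minimize the surrogate with Nelder-Mead, starting from the best trial point.
x0 = X[np.argmin(y)]
res = minimize(surrogate, x0, method="Nelder-Mead")
best = np.clip(res.x, 0.0, 1.0)  # project the result back onto the compact
print(best)
```

Because the surrogate is defined everywhere on the compact, Nelder-Mead can probe points "between" the trial grid nodes, which is the key advantage the abstract attributes to the nonparametric approximation.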
The novelty of the approach lies in the combined use of a nonparametric approximation of the regression function that links hyperparameter values to the corresponding values of the quality criterion, the choice of the classification quality criterion, and the method of searching for the global extremum of this function. Practical Relevance. The proposed hyperparameter tuning algorithm can be used in any system built on machine learning principles, for example, in process control, biometric, and machine vision systems.

Last modified: 2020-10-26 20:12:51