Intensional method for recognizing objects described in a high-dimensional feature space
Keywords:
intensional method, family of recognition algorithms, interconnectedness of features, parameterization, extremal algorithm

Abstract
The paper proposes a method belonging to the group of intensional pattern recognition methods, the essence of which is the transition from the system of initial features to a reduced system of models of interconnectedness between features. It is shown that three options for using feature interconnection models serve as the basis for developing three families of recognition algorithms: a family based on feature interconnection models; a family based on preferred models of feature interconnection; and a family based on identifying preferred combinations of patterns of feature interconnection. The stages of specifying the proposed families of algorithms are presented, and the problem of their parameterization is formulated. For these families, procedures are developed that ensure the construction of extremal algorithms for solving applied recognition problems under high dimensionality of the source data. A computer experiment was conducted to compare the effectiveness of the developed algorithms with that of alternative, well-known recognition methods. The experiment showed that, compared with the alternatives, the developed algorithms achieve higher accuracy and speed when recognizing objects in a space of correlated features.
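The abstract describes the approach only at a high level. As a rough illustration of the core idea, replacing the raw feature system with per-class models of pairwise feature interconnection and scoring objects by their agreement with those models, the following Python sketch may help. The class name `InterconnectionRecognizer`, the correlation threshold `corr_threshold`, the use of simple linear pairwise fits, and the residual-based scoring are all assumptions introduced for illustration; they are not the paper's actual algorithm.

```python
import numpy as np

class InterconnectionRecognizer:
    """Hypothetical sketch of recognition via feature interconnection models.

    For each class, pairs of strongly correlated features are kept and a
    simple linear model (slope, intercept) is fitted per pair; a new object
    is assigned to the class whose pair models it disagrees with least.
    All modeling choices here are illustrative assumptions.
    """

    def __init__(self, corr_threshold=0.7):
        self.corr_threshold = corr_threshold  # keep only strongly linked pairs
        self.models_ = {}  # class label -> list of (i, j, slope, intercept)

    def fit(self, X, y):
        n_features = X.shape[1]
        for label in np.unique(y):
            Xc = X[y == label]  # training objects of this class
            corr = np.corrcoef(Xc, rowvar=False)  # feature correlation matrix
            models = []
            for i in range(n_features):
                for j in range(i + 1, n_features):
                    if abs(corr[i, j]) >= self.corr_threshold:
                        # linear interconnection model: x_j ~ slope * x_i + intercept
                        slope, intercept = np.polyfit(Xc[:, i], Xc[:, j], 1)
                        models.append((i, j, slope, intercept))
            self.models_[label] = models
        return self

    def predict(self, X):
        labels = list(self.models_)
        scores = np.zeros((X.shape[0], len(labels)))
        for k, label in enumerate(labels):
            models = self.models_[label]
            if not models:
                scores[:, k] = np.inf  # no interconnection model survived the threshold
                continue
            for i, j, slope, intercept in models:
                resid = X[:, j] - (slope * X[:, i] + intercept)
                scores[:, k] += resid ** 2  # disagreement with this class's models
            scores[:, k] /= len(models)  # average disagreement per pair
        return np.array(labels)[np.argmin(scores, axis=1)]
```

Under these assumptions, at most O(n^2) pair models are considered and only the strongly linked pairs survive the threshold, which is what shrinks the effective description of a high-dimensional, correlated feature space.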