Izvestiya of Saratov University. Mathematics. Mechanics. Informatics

ISSN 1816-9791 (Print)
ISSN 2541-9005 (Online)


For citation:

Gudkov A. A., Sidorov S. P., Spiridonov K. A. Dual active-set algorithm for optimal 3-monotone regression. Izvestiya of Saratov University. Mathematics. Mechanics. Informatics, 2022, vol. 22, iss. 2, pp. 216–223. DOI: 10.18500/1816-9791-2022-22-2-216-223, EDN: MEHLKW

This is an open access article distributed under the terms of the Creative Commons Attribution 4.0 International License (CC BY 4.0).
Published online: 
31.05.2022
Language: 
English
Article type: 
Article
UDC: 
519.85
EDN: 
MEHLKW

Dual active-set algorithm for optimal 3-monotone regression

Authors: 
Gudkov Alexandr A., Saratov State University
Sidorov Sergei Petrovich, Saratov State University
Spiridonov Kirill A., Saratov State University
Abstract: 

The paper considers the shape-constrained optimization problem of constructing monotone regression, which has gained much attention in recent years. It presents results on constructing nonlinear regression under $3$-monotonicity constraints. Monotone regression of higher orders can be applied in many fields, including non-parametric mathematical statistics and empirical data smoothing. An iterative algorithm is proposed for constructing a sparse $3$-monotone regression, i.e., for finding a $3$-monotone vector with the lowest squared error of approximation to a given (not necessarily $3$-monotone) vector. The problem can be written as a convex programming problem with linear constraints. It is proved that the proposed dual active-set algorithm has polynomial complexity and finds the optimal solution.
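
A minimal sketch of the underlying optimization problem, assuming the common convention that a vector is $3$-monotone when all of its third-order finite differences are nonnegative (the paper's sign and indexing conventions may differ): given a data vector $y \in \mathbb{R}^n$, the fitted vector $x$ solves

$$ \min_{x \in \mathbb{R}^n} \; \tfrac{1}{2}\,\lVert x - y \rVert_2^2 \quad \text{s.t.} \quad \Delta^3 x_i = x_{i+3} - 3x_{i+2} + 3x_{i+1} - x_i \ge 0, \qquad i = 1, \dots, n-3, $$

which is the convex quadratic program with linear inequality constraints referred to in the abstract.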

Acknowledgments: 
This work was supported by the Ministry of Science and Education of the Russian Federation within the framework of the basic part of the state task for scientific research (project FSRR-2020-0006).
References: 
  1. Chen Y. Aspects of Shape-constrained Estimation in Statistics. Ph. D. thesis, University of Cambridge, 2013. 143 p.
  2. Burdakov O., Sysoev O. A dual active-set algorithm for regularized monotonic regression. Journal of Optimization Theory and Applications, 2017, vol. 172, no. 3, pp. 929–949. https://doi.org/10.1007/s10957-017-1060-0
  3. Robertson T., Wright F., Dykstra R. Order Restricted Statistical Inference. New York, John Wiley & Sons, 1988. 488 p.
  4. Dykstra R. An isotonic regression algorithm. Journal of Statistical Planning and Inference, 1981, vol. 5, iss. 4, pp. 355–363. https://doi.org/10.1016/0378-3758(81)90036-7
  5. Bach F. R. Efficient algorithms for non-convex isotonic regression through submodular optimization. In: S. Bengio, H. M. Wallach, H. Larochelle, K. Grauman, N. Cesa-Bianchi, R. Garnett, eds. Advances in Neural Information Processing Systems: 32nd Annual Conference on Neural Information Processing Systems 2018 (NeurIPS 2018). December 3–8, 2018. Montreal, Canada, 2018, pp. 1–10.
  6. Hastie T., Tibshirani R., Wainwright M. Statistical Learning with Sparsity: The Lasso and Generalizations. New York, USA, Chapman and Hall/CRC, 2015. 367 p. https://doi.org/10.1201/b18401  
  7. Altmann D., Grycko E., Hochstättler W., Klutzke G. Monotone smoothing of noisy data. Diskrete Mathematik und Optimierung. Technical Report feu-dmo034.15. FernUniversität in Hagen, Fakultät für Mathematik und Informatik, 2014. 6 p.
  8. Diggle P., Morris S., Morton-Jones T. Case-control isotonic regression for investigation of elevation in risk around a point source. Statistics in Medicine, 1999, vol. 18, iss. 13, pp. 1605–1613. https://doi.org/10.1002/(sici)1097-0258(19990715)18:13<1605::aid-sim146>3.0.co;2-v
  9. Cai Y., Judd K. L. Chapter 8 – Advances in numerical dynamic programming and new applications. Handbook of Computational Economics, 2014, vol. 3, pp. 479–516. https://doi.org/10.1016/b978-0-444-52980-0.00008-6
  10. Shevaldin V. T. Approksimatsiya lokal’nymi splainami [Approximation by Local Splines]. Ekaterinburg, UMC UPI Publ., 2014. 198 p. (in Russian).
  11. Boytsov D. I., Sidorov S. P. Linear approximation method preserving k-monotonicity. Siberian Electronic Mathematical Reports, 2015, vol. 12, pp. 21–27.
  12. Milovanovic I. Z., Milovanovic E. I. Some properties of lkp-convex sequences. Bulletin of the International Mathematical Virtual Institute, 2015, vol. 5, no. 1, pp. 33–36.
  13. Niezgoda M. Inequalities for convex sequences and nondecreasing convex functions. Aequationes Mathematicae, 2017, vol. 91, no. 1, pp. 1–20. https://doi.org/10.1007/s00010-016-0444-9
  14. Latreuch Z., Belaidi B. New inequalities for convex sequences with applications. International Journal of Open Problems in Computer Science and Mathematics, 2012, vol. 5, no. 3, pp. 15–27. https://doi.org/10.12816/0006115
  15. Marshall A. W., Olkin I., Arnold B. C. Inequalities: Theory of Majorization and Its Applications. New York, USA, Springer, 2011. 909 p. https://doi.org/10.1007/978-0-387-68276-1
  16. Gudkov A., Mironov S. V., Sidorov S. P., Tyshkevich S. V. A dual active set algorithm for optimal sparse convex regression. Journal of Samara State Technical University, Ser. Physical and Mathematical Sciences, 2019, vol. 23, no. 1, pp. 113–130. https://doi.org/10.14498/vsgtu1673
  17. Sidorov S. P., Faizliev A. R., Gudkov A. A., Mironov S. V. Algorithms for sparse k-monotone regression. In: W. J. van Hoeve, ed. Integration of Constraint Programming, Artificial Intelligence, and Operations Research. CPAIOR 2018. Lecture Notes in Computer Science, vol. 10848. Springer, Cham, 2018, pp. 546–566. https://doi.org/10.1007/978-3-319-93031-2_39
Received: 
03.12.2021
Accepted: 
15.01.2022
Published: 
31.05.2022