- Beck, A., Teboulle, M.: A fast iterative shrinkage-thresholding algorithm for linear inverse problems.
- Nesterov, Yu.: Lectures on Convex Optimization, 2nd edn.
- Schmidt, M., Roux, N., Bach, F.: Convergence rates of inexact proximal-gradient methods for convex optimization. In: Advances in Neural Information Processing Systems 24.
- Li, G., Pong, T.K.: Calculus of the exponent of Kurdyka-Łojasiewicz inequality and its applications to linear convergence of first-order methods.
- Luo, Z.-Q., Tseng, P.: On the linear convergence of descent methods for convex essentially smooth minimization.
- Necoara, I., Nesterov, Yu., Glineur, F.: Linear convergence of first order methods for non-strongly convex optimization.
- Tseng, P.: Approximation accuracy, gradient methods, and error bound for structured convex optimization.
- Ye, J.J., Yuan, X., Zeng, S., Zhang, J.: Variational analysis perspective on linear convergence of some first order methods for nonsmooth convex optimization problems.
- Lee, J.D., Sun, Y., Saunders, M.A.: Proximal Newton-type methods for minimizing composite functions. 24, 1420–1443 (2014)
- Fukushima, M., Mine, H.: A generalized proximal point algorithm for certain non-convex minimization problems.
- Rockafellar, R.T.: Monotone operators and the proximal point algorithm. 14, 877–898 (1976)
- Facchinei, F., Pang, J.-S.: Finite-Dimensional Variational Inequalities and Complementarity Problems.
- Izmailov, A.F., Solodov, M.V.: Newton-Type Methods for Optimization and Variational Problems. Springer, New York (2014)
- Friedman, J., Hastie, T., Höfling, H., Tibshirani, R.: Pathwise coordinate optimization.
- Yuan, G.-X., Ho, C.-H., Lin, C.-J.: An improved GLMNET for L1-regularized logistic regression.