A preconditioned second-order convex splitting algorithm with extrapolation
arXiv:2512.14468v1 Announce Type: cross
Abstract: Nonconvex optimization problems are widespread in modern machine learning and data science. We introduce an extrapolation strategy into a class of preconditioned second-order convex splitting algorithms for nonconvex optimization problems. The proposed algorithms combine second-order backward differentiation formulas (BDF2) with an extrapolation method, while the implicit-explicit scheme, together with preconditioning, simplifies each subproblem. As a result, our approach solves nonconvex problems efficiently without significant computational overhead. Theoretical analysis establishes global convergence of the algorithms using the Kurdyka-Łojasiewicz property. Numerical experiments include a benchmark problem, a least squares problem with SCAD regularization, and an image segmentation problem. The results demonstrate that our algorithms are highly efficient, achieving reduced solution times with competitive performance.
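To make the scheme concrete, here is a minimal Python sketch of one plausible form of such an iteration: a BDF2 implicit-explicit step with extrapolation and a diagonal preconditioner, applied to a composite objective f + g where f is smooth and g has an easy proximal map. The L1-regularized least squares instance, the step size tau, the extrapolation weight beta, and the diagonal preconditioner are all illustrative assumptions, not the paper's exact algorithm or its SCAD setting.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal map of t * ||.||_1 (elementwise soft-thresholding);
    # t may be a vector when the preconditioner is diagonal.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def bdf2_extrapolated(A, b, lam, tau=0.5, beta=0.5, iters=500):
    """One plausible BDF2 IMEX iteration with extrapolation for
    min_x 0.5*||Ax - b||^2 + lam*||x||_1 (an illustrative stand-in
    for the paper's nonconvex splitting)."""
    n = A.shape[1]
    # Diagonal preconditioner: squared column norms of A, a cheap
    # stand-in for the paper's preconditioned process.
    M = np.maximum(np.sum(A * A, axis=0), 1e-8)
    x_prev = np.zeros(n)
    x = np.zeros(n)
    for _ in range(iters):
        # Extrapolation point (the explicit part of the scheme).
        y = x + beta * (x - x_prev)
        grad = A.T @ (A @ y - b)
        # BDF2 combination (3x^{k+1} - 4x^k + x^{k-1})/(2*tau),
        # resolved via a preconditioned proximal step on g.
        z = (4.0 * x - x_prev) / 3.0 - (2.0 * tau / 3.0) * grad / M
        x_prev, x = x, soft_threshold(z, (2.0 * tau / 3.0) * lam / M)
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((100, 50))
    x_true = np.zeros(50)
    x_true[:5] = 1.0
    b = A @ x_true + 0.01 * rng.standard_normal(100)
    x = bdf2_extrapolated(A, b, lam=0.1)
    print("recovered support:", np.flatnonzero(np.abs(x) > 0.1))
```

The design point the sketch tries to convey is the one the abstract emphasizes: the nonsmooth term is handled implicitly through a (preconditioned) proximal map, the smooth gradient is evaluated explicitly at an extrapolated point, and the BDF2 weights (4x^k - x^{k-1})/3 and 2*tau/3 replace the usual one-step Euler combination, all without making the per-iteration subproblem harder than a standard proximal step.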