In this paper, we report data and experiments related to the research article entitled “An adaptive truncation criterion, for linesearch-based truncated Newton methods in large scale nonconvex optimization” by Caliciotti et al. [1]. In particular, in [1], large scale unconstrained optimization...
Nonlinear Optimization
-
-
The paper is concerned with multiobjective sparse optimization problems, i.e., the problem of simultaneously optimizing several objective functions, where one of these functions is the number of non-zero components (the ℓ₀-norm) of the solution. We propose to deal with the ℓ₀-norm by means...
-
This paper includes a twofold result for the Nonlinear Conjugate Gradient (NCG) method, in large scale unconstrained optimization. First we consider a theoretical analysis, where preconditioning is embedded in a strong convergence framework of an NCG method from the literature. Mild conditions to...
-
We introduce a class of positive definite preconditioners for the solution of large symmetric indefinite linear systems or sequences of such systems, in optimization frameworks. The preconditioners are iteratively constructed by collecting information on a reduced eigenspace of the indefinite...
-
In this paper, we deal with matrix-free preconditioners for nonlinear conjugate gradient (NCG) methods. In particular, we review proposals based on quasi-Newton updates, and either satisfying the secant equation or a secant-like equation at some of the previous iterates. Conditions are given...
-
In this work, we deal with Truncated Newton methods for solving large scale (possibly nonconvex) unconstrained optimization problems. In particular, we consider the use of a modified Bunch and Kaufman factorization for solving the Newton equation, at each (outer) iteration of the method. The Bunch...
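As a conceptual illustration of why a modified factorization helps in the nonconvex case, the sketch below uses an eigenvalue decomposition as a stand-in for the modified Bunch and Kaufman factorization of the paper: each eigenvalue of the (possibly indefinite) Hessian is replaced by `max(|λ|, δ)`, so the resulting Newton-like direction is always a descent direction. The function name and the eigenvalue-based modification are illustrative assumptions, not the paper's actual algorithm.

```python
import numpy as np

def modified_newton_direction(H, g, delta=1e-8):
    """Solve a modified Newton equation H_mod d = -g, where H_mod is a
    positive definite modification of the (possibly indefinite) Hessian H.
    An eigenvalue decomposition stands in here for the modified
    Bunch-Kaufman factorization: each eigenvalue lambda is replaced by
    max(|lambda|, delta), making H_mod positive definite."""
    lam, Q = np.linalg.eigh(H)
    lam_mod = np.maximum(np.abs(lam), delta)
    d = -Q @ ((Q.T @ g) / lam_mod)
    return d

# Indefinite Hessian with one negative-curvature direction.
H = np.array([[2.0, 0.0], [0.0, -1.0]])
g = np.array([1.0, 1.0])
d = modified_newton_direction(H, g)
assert g @ d < 0  # descent direction despite indefiniteness
```

In production codes a factorization is used precisely to avoid the cost of a full eigendecomposition; the eigenvalue route is shown only because it makes the "flip and floor the curvature" idea explicit.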
-
In this paper we propose the use of damped techniques within Nonlinear Conjugate Gradient (NCG) methods. Damped techniques were introduced by Powell and recently reproposed by Al-Baali and, until now, have been applied only in the framework of quasi-Newton methods. We extend their use to NCG methods in large...
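For reference, Powell's damping in its original quasi-Newton setting replaces the difference of gradients y by a convex combination of y and Bs whenever the curvature sᵀy is too small, so that the update stays well defined. A minimal sketch (the threshold μ = 0.2 is Powell's classical choice; how damping is carried over to NCG in the paper is not shown here):

```python
import numpy as np

def powell_damped_y(B, s, y, mu=0.2):
    """Powell's damping: replace y by a convex combination of y and B s
    so that the damped curvature s^T y_d is at least mu * s^T B s,
    keeping the quasi-Newton update well defined (s^T y_d > 0)."""
    sBs = s @ B @ s
    sy = s @ y
    if sy >= mu * sBs:
        theta = 1.0
    else:
        theta = (1.0 - mu) * sBs / (sBs - sy)
    return theta * y + (1.0 - theta) * (B @ s)

B = np.eye(2)
s = np.array([1.0, 0.0])
y = np.array([-0.5, 0.0])          # s^T y < 0: the plain update would fail
yd = powell_damped_y(B, s, y)
assert s @ yd > 0                  # damped curvature is safely positive
```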
-
Starting from the paper by Nash and Sofer (1990), we propose a heuristic adaptive truncation criterion for the inner iterations within linesearch-based truncated Newton methods. Our aim is to possibly avoid “over-solving” of the Newton equation, based on a comparison between the predicted...
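The inner loop in question can be sketched as conjugate gradient iterations on the Newton equation H d = −g, truncated either by the classical residual test ‖r‖ ≤ η‖g‖ or by monitoring the decrease of the quadratic model, in the spirit of Nash and Sofer (1990). The specific averaged-decrease test below is only one plausible form, not the adaptive criterion proposed in the paper:

```python
import numpy as np

def truncated_cg(H, g, max_it=50, eta=0.5, tol_q=1e-3):
    """Inner CG iterations for the Newton equation H d = -g, truncated
    by the residual test ||r|| <= eta ||g|| or by a quadratic-model test
    a la Nash-Sofer: stop when one more iteration barely decreases
    q(d) = 0.5 d^T H d + g^T d (illustrative sketch only)."""
    d = np.zeros_like(g)
    r = -g.copy()                  # residual of H d = -g at d = 0
    p = r.copy()
    q_old = 0.0
    for i in range(1, max_it + 1):
        Hp = H @ p
        pHp = p @ Hp
        if pHp <= 0:               # negative curvature: stop (nonconvex case)
            break
        alpha = (r @ r) / pHp
        d = d + alpha * p
        r_new = r - alpha * Hp
        q = 0.5 * d @ H @ d + g @ d
        if np.linalg.norm(r_new) <= eta * np.linalg.norm(g):
            break                  # classical residual-based truncation
        if i > 1 and i * (q_old - q) <= tol_q * abs(q):
            break                  # model decrease per iteration is negligible
        q_old = q
        beta = (r_new @ r_new) / (r @ r)
        p = r_new + beta * p
        r = r_new
    return d

H = np.array([[4.0, 1.0], [1.0, 3.0]])
g = np.array([1.0, 2.0])
d = truncated_cg(H, g)
assert g @ d < 0                   # truncated step is still a descent direction
```

The point of an adaptive criterion is to tighten or relax η (or the model-based tolerance) across outer iterations, instead of over-solving every Newton system to the same accuracy.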
-
In this paper we study new preconditioners to be used within the nonlinear conjugate gradient (NCG) method, for large scale unconstrained optimization. The rationale behind our proposal draws inspiration from quasi-Newton updates, and its aim is to possibly approximate in some sense the inverse of...
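A common building block for quasi-Newton-inspired NCG preconditioners is the memoryless BFGS approximation of the inverse Hessian, which can be applied matrix-free from one gradient-difference pair (s, y). The sketch below shows this generic building block, not the specific preconditioner constructed in the paper:

```python
import numpy as np

def memoryless_bfgs_apply(s, y, v):
    """Apply the memoryless BFGS inverse-Hessian approximation
    P = (I - rho s y^T)(I - rho y s^T) + rho s s^T,  rho = 1/(y^T s),
    to a vector v without ever forming P (matrix-free, O(n) cost)."""
    rho = 1.0 / (y @ s)
    w = v - rho * (s @ v) * y          # (I - rho y s^T) v
    return w - rho * (y @ w) * s + rho * (s @ v) * s

# Within NCG, the preconditioned direction would combine P g with the
# previous direction, e.g. d = -P g + beta * d_prev for some beta rule.
s = np.array([1.0, 2.0])
y = np.array([0.5, 1.5])               # y^T s > 0, so P is positive definite
g = np.array([1.0, -1.0])
Pg = memoryless_bfgs_apply(s, y, g)
assert g @ Pg > 0                      # -P g is a descent direction
```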
-
In this paper, mixed-integer nonsmooth constrained optimization problems are considered, where objective/constraint functions are available only as the output of a black-box zeroth-order oracle that does not provide derivative information. A new derivative-free linesearch-based algorithmic...
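To fix ideas, a derivative-free linesearch on a black-box objective typically accepts a step only under a sufficient-decrease condition that involves no derivatives, contracting the step otherwise. The sketch below illustrates this generic idea with a quadratic decrease term; the constants and the acceptance rule are illustrative assumptions, not the algorithmic framework of the paper (which also handles integer variables and constraints):

```python
def df_linesearch(f, x, d, alpha0=1.0, gamma=1e-3, delta=0.5):
    """Derivative-free linesearch sketch: accept a step alpha along d if
    f(x + alpha d) <= f(x) - gamma * alpha**2 (sufficient decrease with
    no derivative information), otherwise contract alpha by delta.
    Returns 0 if no acceptable step is found."""
    fx = f(x)
    alpha = alpha0
    while alpha > 1e-12:
        trial = [xi + alpha * di for xi, di in zip(x, d)]
        if f(trial) <= fx - gamma * alpha ** 2:
            return alpha
        alpha *= delta
    return 0.0

# Nonsmooth black-box objective queried only through function values.
f = lambda z: (z[0] - 1.0) ** 2 + abs(z[1])
alpha = df_linesearch(f, [0.0, 0.5], [1.0, -0.5])
assert alpha > 0                      # a decreasing step was accepted
```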