CONOPT

From NEOS


CONOPT solves general nonlinear programming models with sparse nonlinear constraints.

The basic algorithm in CONOPT is the generalized reduced gradient (GRG) algorithm with 25 years of enhancements. All matrix operations are implemented with sparse matrix techniques to allow very large models. Without compromising the reliability of the GRG approach, the overhead of the GRG algorithm is minimized by, for example, using dynamic feasibility tolerances, reusing Jacobians whenever possible, and using an efficient reinversion routine. The algorithm sets many tolerances dynamically and can therefore, in most cases, be run with default parameters.
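The core idea of a reduced gradient method can be illustrated on a toy problem (this is only a minimal sketch of the principle, not CONOPT's implementation): the equality constraints are used to eliminate "basic" variables, and descent is performed on the remaining variables.

```python
# Reduced-gradient sketch: minimize x^2 + y^2 subject to x + y = 1.
# The constraint eliminates the basic variable y = 1 - x, and plain
# gradient descent runs on the remaining (superbasic) variable x.
# Real GRG codes do this with sparse basis factorizations and line
# searches; the fixed step below is purely illustrative.

def reduced_objective(x):
    y = 1.0 - x              # basic variable solved from the constraint
    return x**2 + y**2

def reduced_gradient(x):
    # d/dx [x^2 + (1 - x)^2] = 2x - 2(1 - x) = 4x - 2
    return 4.0 * x - 2.0

x = 0.0                      # feasible start: (0, 1) satisfies x + y = 1
for _ in range(100):
    g = reduced_gradient(x)
    if abs(g) < 1e-10:       # stationary on the constraint manifold
        break
    x -= 0.1 * g             # fixed step; real codes use a line search

print(x, 1.0 - x)            # converges toward (0.5, 0.5)
```

Every iterate stays feasible because y is always recomputed from the constraint, which is the defining property of the GRG family.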

CONOPT is available as an NLP solver for most modeling systems, e.g., AIMMS, AMPL, GAMS, LINDO/LINGO, MPL, and TOMLAB. CONOPT is also available as a subroutine library or DLL, but the modeling system versions are recommended for all but the most sophisticated power users.

The system is continuously being updated, mainly to improve reliability and efficiency on large models and on special classes of models. There are sub-components for very large square systems of nonlinear equations (over one million variables and equations), a Sequential Linear Programming component for almost linear models (also useful while finding a feasible solution), a Sequential Quadratic Programming component for models with many degrees of freedom (which can use different types of second-derivative information), and a steepest-edge component for very difficult models. The choice between the components is in most cases made dynamically based on performance statistics.
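The Sequential Linear Programming idea mentioned above can be sketched in a few lines (a hypothetical illustration under simplifying assumptions, not CONOPT's actual component): the objective is repeatedly linearized and the linear model is minimized over a trust region, which shrinks when the linear prediction fails.

```python
# Trust-region SLP sketch: minimize f(x) = (x - 3)^2 by minimizing the
# linearization f(x_k) + f'(x_k)*d over |d| <= delta at each step.
# A linear model over an interval is minimized at a boundary point,
# so the step is always +/- delta.

def f(x):
    return (x - 3.0) ** 2

def fprime(x):
    return 2.0 * (x - 3.0)

x, delta = 0.3, 1.0
for _ in range(200):
    g = fprime(x)
    if abs(g) < 1e-8:
        break
    step = -delta if g > 0 else delta   # boundary of the trust region
    if f(x + step) < f(x):              # predicted descent held: accept
        x += step
    else:                               # prediction failed: shrink region
        delta *= 0.5

print(x)   # approaches the minimizer x = 3
```

Linearization works well when the model is nearly linear, which is why an SLP component pays off on that class of problems; the trust region supplies the curvature control that the linear model lacks.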

Need more info? Contact the suppliers of one of the modeling systems mentioned above, or

ARKI Consulting and Development A/S
Bagsvaerdvej 246 A
DK-2880 Bagsvaerd
Denmark
Phone: (+45) 44 49 03 23
Fax: (+45) 44 49 03 33
info@arki.dk

Reference: A. S. Drud, CONOPT -- A Large Scale GRG Code, ORSA Journal on Computing 6 (1994), pp. 207-216.
