A New Interpolation Approach for Linearly Constrained Convex Optimization

  • Francisco Espinoza

Student thesis: Master's Thesis

Abstract

In this thesis we propose a new class of methods for linearly constrained convex optimization based on a generalization of Shepard's interpolation formula. We prove properties of the resulting surface, such as the interpolation property at the boundary of the feasible region and the convergence of the gradient to the null space of the constraints at the boundary. We explore several descent techniques: steepest descent, two quasi-Newton methods, and Newton's method. Moreover, we implement several versions of the method in the Matlab language, particularly for the case of quadratic programming with bounded variables. Finally, we carry out performance tests against the Matlab Optimization Toolbox methods for convex optimization and against implementations of the standard log-barrier and active-set methods. We conclude that the steepest descent technique appears to be the best choice so far for our method and that it is competitive with other standard methods both in performance and in empirical growth order.
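For reference, the sketch below shows the classical inverse-distance-weighting form of Shepard's interpolation in Python; it is not the generalized formula developed in the thesis, and the function name shepard_interpolate and the power parameter p are illustrative assumptions.

  import numpy as np

  def shepard_interpolate(x, nodes, values, p=2):
      # Classical Shepard (inverse-distance-weighting) interpolation.
      # x      : query point, shape (d,)
      # nodes  : data sites, shape (n, d)
      # values : data values at the sites, shape (n,)
      # p      : power parameter of the weights (illustrative default)
      dist = np.linalg.norm(nodes - x, axis=1)
      # Interpolation property: at a data site, return its value exactly.
      hit = np.where(dist == 0)[0]
      if hit.size > 0:
          return values[hit[0]]
      w = dist ** (-p)
      return np.dot(w, values) / np.sum(w)

Near a data site the weights concentrate all mass on that site, which is the source of the interpolation property; in the thesis the formula is generalized so that this property holds on the boundary of the feasible region rather than at discrete points.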
Date of Award: Aug 2012
Original language: English (US)
Awarding Institution
  • Computer, Electrical and Mathematical Sciences and Engineering
Supervisor: Alyn Rockwood

Keywords

  • Optimization
  • Convex
  • Interpolation
  • Shepard's method
  • Linear constraints
  • Quadratic programming
