# The Gradient Subspace Approximation for Scalar Optimization

### Track

Set Oriented Numerics

Evolutionary algorithms are nowadays widely used to approximate solutions to optimization problems. So-called memetic algorithms hybridize evolutionary algorithms with mathematical programming techniques in order to improve the convergence rate. A drawback of these methods, however, is that in some cases the cost (in terms of function calls) of the mathematical programming technique is higher than that of the evolutionary mechanism. Furthermore, in some optimization problems the explicit gradient information that some of these techniques require is not available. In order to reduce the cost of the mathematical programming step, we propose a method that exploits the data already computed by the evolutionary algorithm. Given a candidate solution, the Gradient Subspace Approximation (GSA) aims to compute the most greedy search direction by exploiting the neighborhood information of that solution. Since GSA reuses information that has already been calculated, its cost in terms of function evaluations can be competitive with the mechanisms of the evolutionary algorithm. In this study we perform an analysis of the GSA method and of its behavior as the local search technique within a memetic algorithm. A first memetic algorithm obtained by combining Differential Evolution (DE) and GSA has shown promising results.
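The core idea described above — estimating the most greedy (steepest-descent) direction from function values that the evolutionary algorithm has already computed at neighboring points — can be sketched as a least-squares fit of finite-difference directional derivatives. This is a minimal illustrative sketch, not the authors' implementation; all names and the test function are assumptions:

```python
import numpy as np

def gsa_direction(x0, f0, neighbors, f_neighbors):
    """Approximate the greedy descent direction at x0 from
    already-evaluated neighboring points (no extra function calls).

    Each neighbor xi yields a finite-difference estimate of the
    directional derivative of f along (xi - x0); a least-squares
    fit of these values recovers the gradient restricted to the
    subspace spanned by the sample directions.
    """
    V, d = [], []
    for xi, fi in zip(neighbors, f_neighbors):
        diff = np.asarray(xi) - np.asarray(x0)
        dist = np.linalg.norm(diff)
        V.append(diff / dist)               # unit sample direction
        d.append((fi - f0) / dist)          # directional derivative estimate
    # Solve V g ~= d in the least-squares sense for the gradient estimate g.
    g, *_ = np.linalg.lstsq(np.array(V), np.array(d), rcond=None)
    return -g  # negative gradient estimate: the greedy descent direction
```

For example, for f(x) = ||x||^2 at x0 = (1, 1) the method recovers a direction close to the exact steepest descent -∇f(x0) = (-2, -2), up to finite-difference error of order h.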


### Related items

- Parameter free Optimization algorithm for Garch models
- Pareto Explorer for the Local Exploration of Many Objective Optimization Problems
- Archivers for the Set of Approximate Solutions on Multi-objective Optimization
- Hypervolume Newton method as a Local Searcher for Indicator based Evolutionary Algorithm

- Speaker: Sergio Alvarado