Second-order adjoint sensitivities
In this chapter, the authors present two different approaches for efficiently estimating the second-order derivatives (the Hessian matrix) of a given objective function. The cost of evaluating the Hessian with a classical finite-difference approach is O(n²) simulations, where n is the number of parameters. The first adjoint approach reduces the cost of estimating all components of the Hessian matrix to only 2n extra simulations; it is simple and builds on the algorithms developed in the previous chapters. A second approach for estimating the complete Hessian is also presented. It is more complex than the first and requires additional memory, but it needs only n + 1 extra simulations per Hessian evaluation, so its computational cost is approximately half that of the first adjoint approach. This saving comes at the price of a more complex algorithm and more extensive storage.
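The cost accounting behind the first approach can be sketched as follows: each column of the Hessian is obtained by finite-differencing the gradient, and with an adjoint method each gradient costs one forward plus one adjoint run, giving the 2n extra simulations quoted above. The sketch below is a minimal illustration under stated assumptions, not the chapter's algorithm: the Rosenbrock function and its analytic gradient are hypothetical stand-ins for the simulator objective and its adjoint-computed gradient.

```python
import numpy as np

def objective(x):
    # Rosenbrock test function: a hypothetical stand-in for the
    # simulator-based objective discussed in the text.
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def gradient(x):
    # Analytic gradient, playing the role of a gradient obtained from
    # one forward plus one adjoint simulation.
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def hessian_from_gradients(x, h=1e-6):
    """Approximate the Hessian column by column from perturbed gradients.

    Each of the n columns needs one extra gradient evaluation; with an
    adjoint gradient (one forward run + one adjoint run) this gives the
    2n extra simulations mentioned in the text.
    """
    n = x.size
    g0 = gradient(x)
    H = np.empty((n, n))
    for i in range(n):
        xp = x.copy()
        xp[i] += h
        H[:, i] = (gradient(xp) - g0) / h
    # Symmetrize: finite differencing breaks exact symmetry slightly.
    return 0.5 * (H + H.T)

x = np.array([1.2, 1.0])
H = hessian_from_gradients(x)
```

Note that the n + 1 count of the second approach corresponds to forming the same n Hessian columns from n Hessian-vector products that each reuse a single stored forward solution, which is where the extra memory requirement comes from.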