WolfeEvolutionVerifierMetrics.java

package org.drip.function.rdtor1descent;

/*
 * -*- mode: java; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*-
 */

/*!
 * Copyright (C) 2020 Lakshmi Krishnamurthy
 * Copyright (C) 2019 Lakshmi Krishnamurthy
 * Copyright (C) 2018 Lakshmi Krishnamurthy
 * Copyright (C) 2017 Lakshmi Krishnamurthy
 * Copyright (C) 2016 Lakshmi Krishnamurthy
 *
 *  This file is part of DROP, an open-source library targeting analytics/risk, transaction cost analytics,
 *      asset liability management analytics, capital, exposure, and margin analytics, valuation adjustment
 *      analytics, and portfolio construction analytics within and across fixed income, credit, commodity,
 *      equity, FX, and structured products. It also includes auxiliary libraries for algorithm support,
 *      numerical analysis, numerical optimization, spline builder, model validation, statistical learning,
 *      and computational support.
 *  
 *      https://lakshmidrip.github.io/DROP/
 *  
 *  DROP is composed of three modules:
 *  
 *  - DROP Product Core - https://lakshmidrip.github.io/DROP-Product-Core/
 *  - DROP Portfolio Core - https://lakshmidrip.github.io/DROP-Portfolio-Core/
 *  - DROP Computational Core - https://lakshmidrip.github.io/DROP-Computational-Core/
 *
 *  DROP Product Core implements libraries for the following:
 *  - Fixed Income Analytics
 *  - Loan Analytics
 *  - Transaction Cost Analytics
 *
 *  DROP Portfolio Core implements libraries for the following:
 *  - Asset Allocation Analytics
 *  - Asset Liability Management Analytics
 *  - Capital Estimation Analytics
 *  - Exposure Analytics
 *  - Margin Analytics
 *  - XVA Analytics
 *
 *  DROP Computational Core implements libraries for the following:
 *  - Algorithm Support
 *  - Computation Support
 *  - Function Analysis
 *  - Model Validation
 *  - Numerical Analysis
 *  - Numerical Optimizer
 *  - Spline Builder
 *  - Statistical Learning
 *
 *  Documentation for DROP is Spread Over:
 *
 *  - Main                     => https://lakshmidrip.github.io/DROP/
 *  - Wiki                     => https://github.com/lakshmiDRIP/DROP/wiki
 *  - GitHub                   => https://github.com/lakshmiDRIP/DROP
 *  - Repo Layout Taxonomy     => https://github.com/lakshmiDRIP/DROP/blob/master/Taxonomy.md
 *  - Javadoc                  => https://lakshmidrip.github.io/DROP/Javadoc/index.html
 *  - Technical Specifications => https://github.com/lakshmiDRIP/DROP/tree/master/Docs/Internal
 *  - Release Versions         => https://lakshmidrip.github.io/DROP/version.html
 *  - Community Credits        => https://lakshmidrip.github.io/DROP/credits.html
 *  - Issues Catalog           => https://github.com/lakshmiDRIP/DROP/issues
 *  - JUnit                    => https://lakshmidrip.github.io/DROP/junit/index.html
 *  - Jacoco                   => https://lakshmidrip.github.io/DROP/jacoco/index.html
 *
 *  Licensed under the Apache License, Version 2.0 (the "License");
 *      you may not use this file except in compliance with the License.
 *  
 *  You may obtain a copy of the License at
 *      http://www.apache.org/licenses/LICENSE-2.0
 *  
 *  Unless required by applicable law or agreed to in writing, software
 *      distributed under the License is distributed on an "AS IS" BASIS,
 *      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 *  
 *  See the License for the specific language governing permissions and
 *      limitations under the License.
 */

/**
 * <i>WolfeEvolutionVerifierMetrics</i> implements the Wolfe Criterion used for the Inexact Line Search
 * Increment Generation. The References are:
 * <br><br>
 *  <ul>
 *      <li>
 *          Armijo, L. (1966): Minimization of Functions having Lipschitz-Continuous First Partial
 *              Derivatives <i>Pacific Journal of Mathematics</i> <b>16 (1)</b> 1-3
 *      </li>
 *      <li>
 *          Nocedal, J., and S. Wright (1999): <i>Numerical Optimization</i> <b>Wiley</b>
 *      </li>
 *      <li>
 *          Wolfe, P. (1969): Convergence Conditions for Ascent Methods <i>SIAM Review</i> <b>11 (2)</b>
 *              226-235
 *      </li>
 *      <li>
 *          Wolfe, P. (1971): Convergence Conditions for Ascent Methods; II: Some Corrections <i>SIAM
 *              Review</i> <b>13 (2)</b> 185-188
 *      </li>
 *  </ul>
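 *
 * <br><br>
 *  In the Minimizer Setting, with Step Length h, Target Direction d, Current Variate x, Armijo Parameter
 *      c<sub>1</sub>, and Curvature Parameter c<sub>2</sub>, the Criterion verified is the standard pair
 *      of Wolfe Conditions (sketched here for orientation; the Maximizer Check flips the Direction of the
 *      Sufficient Decrease Inequality, and the "Strong" Variant compares Absolute Values in the Curvature
 *      Inequality):
 *  <ul>
 *      <li>
 *          Sufficient Decrease (Armijo): f(x + h d) &le; f(x) + c<sub>1</sub> h (d &middot; &nabla;f(x))
 *      </li>
 *      <li>
 *          Curvature: d &middot; &nabla;f(x + h d) &ge; c<sub>2</sub> (d &middot; &nabla;f(x))
 *      </li>
 *  </ul>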
 *
 *  <br><br>
 *  <ul>
 *      <li><b>Module </b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/ComputationalCore.md">Computational Core Module</a></li>
 *      <li><b>Library</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/NumericalAnalysisLibrary.md">Numerical Analysis Library</a></li>
 *      <li><b>Project</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/src/main/java/org/drip/function/README.md">R<sup>d</sup> To R<sup>d</sup> Function Analysis</a></li>
 *      <li><b>Package</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/src/main/java/org/drip/function/rdtor1descent/README.md">R<sup>d</sup> To R<sup>1</sup> Gradient Descent Techniques</a></li>
 *  </ul>
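 *
 *  <br><br>
 *  A minimal usage sketch (the Parameter Values below are illustrative; the targetDirection Unit Vector,
 *      the Variate Array, the Step Length, and the Function Values/Jacobians are assumed to be supplied
 *      by the enclosing Line Search):
 *
 * <pre>{@code
 *     WolfeEvolutionVerifierMetrics wolfeEvolutionVerifierMetrics = new WolfeEvolutionVerifierMetrics (
 *         0.0001,                         // Armijo Parameter c1
 *         false,                          // Check for the Minimizer
 *         0.9,                            // Curvature Parameter c2
 *         false,                          // "Weak" Curvature Criterion
 *         targetDirection,                // org.drip.function.definition.UnitVector
 *         currentVariateArray,
 *         stepLength,
 *         currentVariateFunctionValue,
 *         nextVariateFunctionValue,
 *         currentVariateFunctionJacobian,
 *         nextVariateFunctionJacobian
 *     );                                  // Throws java.lang.Exception on Invalid Inputs
 *
 *     boolean wolfeCriterionMet = wolfeEvolutionVerifierMetrics.verify();
 * }</pre>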
 *
 * @author Lakshmi Krishnamurthy
 */

public class WolfeEvolutionVerifierMetrics
    extends org.drip.function.rdtor1descent.LineEvolutionVerifierMetrics
{
    private boolean _maximizerCheck = false;
    private boolean _strongCurvatureCriterion = false;
    private double[] _nextVariateFunctionJacobian = null;
    private double _armijoParameter = java.lang.Double.NaN;
    private double _curvatureParameter = java.lang.Double.NaN;
    private double _nextVariateFunctionValue = java.lang.Double.NaN;
    private double _currentVariateFunctionValue = java.lang.Double.NaN;

    /**
     * WolfeEvolutionVerifierMetrics Constructor
     *
     * @param armijoParameter The Armijo Criterion Parameter
     * @param maximizerCheck TRUE - Perform a Check for the Function Maxima
     * @param curvatureParameter The Curvature Criterion Parameter
     * @param strongCurvatureCriterion TRUE - Apply the "Strong" Curvature Criterion
     * @param targetDirection The Target Direction Unit Vector
     * @param currentVariateArray Array of the Current Variate
     * @param stepLength The Incremental Step Length
     * @param currentVariateFunctionValue The Function Value at the Current Variate
     * @param nextVariateFunctionValue The Function Value at the Next Variate
     * @param currentVariateFunctionJacobian The Function Jacobian at the Current Variate
     * @param nextVariateFunctionJacobian The Function Jacobian at the Next Variate
     *
     * @throws java.lang.Exception Thrown if the Inputs are Invalid
     */

    public WolfeEvolutionVerifierMetrics (
        final double armijoParameter,
        final boolean maximizerCheck,
        final double curvatureParameter,
        final boolean strongCurvatureCriterion,
        final org.drip.function.definition.UnitVector targetDirection,
        final double[] currentVariateArray,
        final double stepLength,
        final double currentVariateFunctionValue,
        final double nextVariateFunctionValue,
        final double[] currentVariateFunctionJacobian,
        final double[] nextVariateFunctionJacobian)
        throws java.lang.Exception
    {
        super (
            targetDirection,
            currentVariateArray,
            stepLength,
            currentVariateFunctionJacobian
        );

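        // Validate the Wolfe Parameters, the Function Values, and the Next Variate Jacobian, whose
        //  Length must match that of the Current Variate Array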
        if (!org.drip.numerical.common.NumberUtil.IsValid (_armijoParameter = armijoParameter) ||
            !org.drip.numerical.common.NumberUtil.IsValid (_curvatureParameter = curvatureParameter) ||
            null == (_nextVariateFunctionJacobian = nextVariateFunctionJacobian) ||
            !org.drip.numerical.common.NumberUtil.IsValid (_currentVariateFunctionValue =
                currentVariateFunctionValue) ||
            !org.drip.numerical.common.NumberUtil.IsValid (_nextVariateFunctionValue =
                nextVariateFunctionValue) ||
            currentVariateArray.length != _nextVariateFunctionJacobian.length)
        {
            throw new java.lang.Exception ("WolfeEvolutionVerifierMetrics Constructor => Invalid Inputs");
        }

        _maximizerCheck = maximizerCheck;
        _strongCurvatureCriterion = strongCurvatureCriterion;
    }

    /**
     * Retrieve the Armijo Parameter
     *
     * @return The Armijo Parameter
     */

    public double armijoParameter()
    {
        return _armijoParameter;
    }

    /**
     * Indicate whether the Check is for a Maximizer (as opposed to a Minimizer)
     *
     * @return TRUE - The Check is for a Maximizer
     */

    public boolean maximizerCheck()
    {
        return _maximizerCheck;
    }

    /**
     * Retrieve the Curvature Parameter
     *
     * @return The Curvature Parameter
     */

    public double curvatureParameter()
    {
        return _curvatureParameter;
    }

    /**
     * Indicate whether the "Strong" Curvature Criterion needs to be met
     *
     * @return TRUE - The "Strong" Curvature Criterion needs to be met
     */

    public boolean strongCurvatureCriterion()
    {
        return _strongCurvatureCriterion;
    }

    /**
     * Retrieve the Function Value at the Current Variate
     *
     * @return The Function Value at the Current Variate
     */

    public double currentVariateFunctionValue()
    {
        return _currentVariateFunctionValue;
    }

    /**
     * Retrieve the Function Value at the Next Variate
     *
     * @return The Function Value at the Next Variate
     */

    public double nextVariateFunctionValue()
    {
        return _nextVariateFunctionValue;
    }

    /**
     * Retrieve the Function Jacobian at the Next Variate
     *
     * @return The Function Jacobian at the Next Variate
     */

    public double[] nextVariateFunctionJacobian()
    {
        return _nextVariateFunctionJacobian;
    }

    /**
     * Indicate if the Wolfe Criterion has been met
     *
     * @return TRUE - The Wolfe Criterion has been met
     */

    public boolean verify()
    {
        double[] targetDirectionVector = targetDirection().component();

        double[] currentVariateFunctionJacobian = currentVariateFunctionJacobian();

        try
        {
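            // Sufficient Decrease (Armijo) Threshold: f(x) + c1 * stepLength * (d . gradient f(x))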
            double gradientUpdatedFunctionValue = _currentVariateFunctionValue +
                _armijoParameter * stepLength() * org.drip.numerical.linearalgebra.Matrix.DotProduct (
                    targetDirectionVector,
                    currentVariateFunctionJacobian
                );

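            // Reject the Step if the Next Function Value fails the Sufficient Decrease (Minimizer) /
            //  Sufficient Increase (Maximizer) Test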
            if ((_maximizerCheck && _nextVariateFunctionValue < gradientUpdatedFunctionValue) ||
                (!_maximizerCheck && _nextVariateFunctionValue > gradientUpdatedFunctionValue))
            {
                return false;
            }

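            // Directional Derivative d . gradient f(x + stepLength * d) at the Next Variate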
            double nextFunctionIncrement = org.drip.numerical.linearalgebra.Matrix.DotProduct (
                targetDirectionVector,
                _nextVariateFunctionJacobian
            );

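            // Curvature-scaled Directional Derivative c2 * (d . gradient f(x)) at the Current Variate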
            double parametrizedCurrentFunctionIncrement = _curvatureParameter *
                org.drip.numerical.linearalgebra.Matrix.DotProduct (
                    targetDirectionVector,
                    currentVariateFunctionJacobian
                );

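            // Curvature Check - "Strong": |Next| <= |c2 * Current|; "Weak": Next >= c2 * Current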
            return _strongCurvatureCriterion ?
                java.lang.Math.abs (
                    nextFunctionIncrement
                ) <= java.lang.Math.abs (
                    parametrizedCurrentFunctionIncrement
                ) : nextFunctionIncrement >= parametrizedCurrentFunctionIncrement;
        }
        catch (java.lang.Exception e)
        {
            e.printStackTrace();
        }

        return false;
    }
}