RegularizerR1ContinuousToR1Continuous.java

package org.drip.learning.regularization;

/*
 * -*- mode: java; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*-
 */

/*!
 * Copyright (C) 2020 Lakshmi Krishnamurthy
 * Copyright (C) 2019 Lakshmi Krishnamurthy
 * Copyright (C) 2018 Lakshmi Krishnamurthy
 * Copyright (C) 2017 Lakshmi Krishnamurthy
 * Copyright (C) 2016 Lakshmi Krishnamurthy
 * Copyright (C) 2015 Lakshmi Krishnamurthy
 *
 *  This file is part of DROP, an open-source library targeting analytics/risk, transaction cost analytics,
 *      asset liability management analytics, capital, exposure, and margin analytics, valuation adjustment
 *      analytics, and portfolio construction analytics within and across fixed income, credit, commodity,
 *      equity, FX, and structured products. It also includes auxiliary libraries for algorithm support,
 *      numerical analysis, numerical optimization, spline builder, model validation, statistical learning,
 *      and computational support.
 *  
 *      https://lakshmidrip.github.io/DROP/
 *  
 *  DROP is composed of three modules:
 *  
 *  - DROP Product Core - https://lakshmidrip.github.io/DROP-Product-Core/
 *  - DROP Portfolio Core - https://lakshmidrip.github.io/DROP-Portfolio-Core/
 *  - DROP Computational Core - https://lakshmidrip.github.io/DROP-Computational-Core/
 *
 *  DROP Product Core implements libraries for the following:
 *  - Fixed Income Analytics
 *  - Loan Analytics
 *  - Transaction Cost Analytics
 *
 *  DROP Portfolio Core implements libraries for the following:
 *  - Asset Allocation Analytics
 *  - Asset Liability Management Analytics
 *  - Capital Estimation Analytics
 *  - Exposure Analytics
 *  - Margin Analytics
 *  - XVA Analytics
 *
 *  DROP Computational Core implements libraries for the following:
 *  - Algorithm Support
 *  - Computation Support
 *  - Function Analysis
 *  - Model Validation
 *  - Numerical Analysis
 *  - Numerical Optimizer
 *  - Spline Builder
 *  - Statistical Learning
 *
 *  Documentation for DROP is Spread Over:
 *
 *  - Main                     => https://lakshmidrip.github.io/DROP/
 *  - Wiki                     => https://github.com/lakshmiDRIP/DROP/wiki
 *  - GitHub                   => https://github.com/lakshmiDRIP/DROP
 *  - Repo Layout Taxonomy     => https://github.com/lakshmiDRIP/DROP/blob/master/Taxonomy.md
 *  - Javadoc                  => https://lakshmidrip.github.io/DROP/Javadoc/index.html
 *  - Technical Specifications => https://github.com/lakshmiDRIP/DROP/tree/master/Docs/Internal
 *  - Release Versions         => https://lakshmidrip.github.io/DROP/version.html
 *  - Community Credits        => https://lakshmidrip.github.io/DROP/credits.html
 *  - Issues Catalog           => https://github.com/lakshmiDRIP/DROP/issues
 *  - JUnit                    => https://lakshmidrip.github.io/DROP/junit/index.html
 *  - Jacoco                   => https://lakshmidrip.github.io/DROP/jacoco/index.html
 *
 *  Licensed under the Apache License, Version 2.0 (the "License");
 *      you may not use this file except in compliance with the License.
 *  
 *  You may obtain a copy of the License at
 *      http://www.apache.org/licenses/LICENSE-2.0
 *  
 *  Unless required by applicable law or agreed to in writing, software
 *      distributed under the License is distributed on an "AS IS" BASIS,
 *      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 *  
 *  See the License for the specific language governing permissions and
 *      limitations under the License.
 */

/**
 * <i>RegularizerR1ContinuousToR1Continuous</i> computes the Structural Loss and Risk for the specified
 * Normed R<sup>1</sup> Continuous To Normed R<sup>1</sup> Continuous Learning Function.
 *  
 * <br><br>
 * <ul>
 *  <li>
 *      Alon, N., S. Ben-David, N. Cesa-Bianchi, and D. Haussler (1997): Scale-sensitive Dimensions, Uniform
 *          Convergence, and Learnability <i>Journal of the Association for Computing Machinery</i> <b>44
 *          (4)</b> 615-631
 *  </li>
 *  <li>
 *      Anthony, M., and P. L. Bartlett (1999): <i>Artificial Neural Network Learning - Theoretical
 *          Foundations</i> <b>Cambridge University Press</b> Cambridge, UK
 *  </li>
 *  <li>
 *      Kearns, M. J., R. E. Schapire, and L. M. Sellie (1994): Towards Efficient Agnostic Learning
 *          <i>Machine Learning</i> <b>17 (2)</b> 115-141
 *  </li>
 *  <li>
 *      Lee, W. S., P. L. Bartlett, and R. C. Williamson (1998): The Importance of Convexity in Learning with
 *          Squared Loss <i>IEEE Transactions on Information Theory</i> <b>44</b> 1974-1980
 *  </li>
 *  <li>
 *      Vapnik, V. N. (1998): <i>Statistical Learning Theory</i> <b>Wiley</b> New York
 *  </li>
 * </ul>
 *
 *  <br><br>
 *  <ul>
 *      <li><b>Module </b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/ComputationalCore.md">Computational Core Module</a></li>
 *      <li><b>Library</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/StatisticalLearningLibrary.md">Statistical Learning</a></li>
 *      <li><b>Project</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/src/main/java/org/drip/learning">Agnostic Learning Bounds under Empirical Loss Minimization Schemes</a></li>
 *      <li><b>Package</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/src/main/java/org/drip/learning/regularization">Statistical Learning Empirical Loss Regularizer</a></li>
 *  </ul>
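 *
 * <br><br>
 * Concretely, for Sample Nodes x<sub>1</sub>, ..., x<sub>n</sub>, a Learner Function f, a Regularizer
 *  Function R, and a finite Output p-Norm, the Structural Loss below is computed as (1 / p) &Sigma;
 *  |R (x<sub>i</sub>) f (x<sub>i</sub>)|<sup>p</sup>; under the Supremum Norm (a p-Norm of
 *  Integer.MAX_VALUE) it is max<sub>i</sub> |R (x<sub>i</sub>) f (x<sub>i</sub>)|. The Structural Risk
 *  weights each Node by the Joint Density at (x<sub>i</sub>, y<sub>i</sub>): for finite p it normalizes
 *  the weighted Loss Sum by the Sum of the Densities, and under the Supremum Norm it returns the Node
 *  Value whose Density-weighted Magnitude is the largest.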
 *
 * @author Lakshmi Krishnamurthy
 */

public class RegularizerR1ContinuousToR1Continuous extends
    org.drip.spaces.rxtor1.NormedR1ContinuousToR1Continuous implements
        org.drip.learning.regularization.RegularizerR1ToR1 {
    private double _dblLambda = java.lang.Double.NaN;

    /**
     * RegularizerR1ContinuousToR1Continuous Function Space Constructor
     *
     * @param funcRegularizerR1ToR1 The R^1 To R^1 Regularizer Function
     * @param r1ContinuousInput The Continuous R^1 Input Metric Vector Space
     * @param r1ContinuousOutput The Continuous R^1 Output Metric Vector Space
     * @param dblLambda The Regularization Lambda
     *
     * @throws java.lang.Exception Thrown if the Inputs are Invalid
     */

    public RegularizerR1ContinuousToR1Continuous (
        final org.drip.function.definition.R1ToR1 funcRegularizerR1ToR1,
        final org.drip.spaces.metric.R1Continuous r1ContinuousInput,
        final org.drip.spaces.metric.R1Continuous r1ContinuousOutput,
        final double dblLambda)
        throws java.lang.Exception
    {
        super (r1ContinuousInput, r1ContinuousOutput, funcRegularizerR1ToR1);

        // The Regularization Lambda must be a valid, non-negative Number

        if (!org.drip.numerical.common.NumberUtil.IsValid (_dblLambda = dblLambda) || 0 > _dblLambda)
            throw new java.lang.Exception
                ("RegularizerR1ContinuousToR1Continuous Constructor => Invalid Inputs");
    }

    @Override public double lambda()
    {
        return _dblLambda;
    }

    @Override public double structuralLoss (
        final org.drip.function.definition.R1ToR1 funcR1ToR1,
        final double[] adblX)
        throws java.lang.Exception
    {
        if (null == funcR1ToR1 || null == adblX)
            throw new java.lang.Exception
                ("RegularizerR1ContinuousToR1Continuous::structuralLoss => Invalid Inputs");

        double dblLoss = 0.;
        int iNumSample = adblX.length;

        if (0 == iNumSample)
            throw new java.lang.Exception
                ("RegularizerR1ContinuousToR1Continuous::structuralLoss => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcRegularizerR1ToR1 = function();

        int iPNorm = outputMetricVectorSpace().pNorm();

        // Supremum Norm: the Loss is the largest |Regularizer x Learner| Node Magnitude

        if (java.lang.Integer.MAX_VALUE == iPNorm) {
            double dblSupremum = 0.;

            for (int i = 0; i < iNumSample; ++i) {
                double dblNodeValue = java.lang.Math.abs (funcRegularizerR1ToR1.evaluate (adblX[i]) *
                    funcR1ToR1.evaluate (adblX[i]));

                if (dblSupremum < dblNodeValue) dblSupremum = dblNodeValue;
            }

            return dblSupremum;
        }

        // Finite p-Norm: accumulate |Regularizer x Learner|^p across the Sample Nodes

        for (int i = 0; i < iNumSample; ++i)
            dblLoss += java.lang.Math.pow (java.lang.Math.abs (funcRegularizerR1ToR1.evaluate (adblX[i]) *
                funcR1ToR1.evaluate (adblX[i])), iPNorm);

        return dblLoss / iPNorm;
    }
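    /*
     * Worked illustration of the structuralLoss arithmetic above (a sketch, not part of the original
     *  source): with a Unit Regularizer R (x) = 1, a Learner f (x) = x, Sample Nodes {1., 2.}, and an
     *  Output p-Norm of 2, the Loss is (|1. * 1.|^2 + |1. * 2.|^2) / 2 = 2.5; under the Supremum Norm
     *  (a p-Norm of Integer.MAX_VALUE) it is max (1., 2.) = 2.
     */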

    @Override public double structuralRisk (
        final org.drip.measure.continuous.R1R1 distR1R1,
        final org.drip.function.definition.R1ToR1 funcR1ToR1,
        final double[] adblX,
        final double[] adblY)
        throws java.lang.Exception
    {
        if (null == distR1R1 || null == funcR1ToR1 || null == adblX || null == adblY)
            throw new java.lang.Exception
                ("RegularizerR1ContinuousToR1Continuous::structuralRisk => Invalid Inputs");

        double dblLoss = 0.;
        double dblNormalizer = 0.;
        int iNumSample = adblX.length;

        if (0 == iNumSample || iNumSample != adblY.length)
            throw new java.lang.Exception
                ("RegularizerR1ContinuousToR1Continuous::structuralRisk => Invalid Inputs");

        int iPNorm = outputMetricVectorSpace().pNorm();

        org.drip.function.definition.R1ToR1 funcRegularizerR1ToR1 = function();

        // Supremum Norm: return the Node Value whose Density-weighted Magnitude is the largest

        if (java.lang.Integer.MAX_VALUE == iPNorm) {
            double dblWeightedSupremum = 0.;
            double dblSupremumNodeValue = 0.;

            for (int i = 0; i < iNumSample; ++i) {
                double dblNodeValue = java.lang.Math.abs (funcRegularizerR1ToR1.evaluate (adblX[i]) *
                    funcR1ToR1.evaluate (adblX[i]));

                double dblWeightedNodeValue = distR1R1.density (adblX[i], adblY[i]) * dblNodeValue;

                if (dblWeightedNodeValue > dblWeightedSupremum) {
                    dblSupremumNodeValue = dblNodeValue;
                    dblWeightedSupremum = dblWeightedNodeValue;
                }
            }

            return dblSupremumNodeValue;
        }

        // Finite p-Norm: accumulate the Density-weighted |Regularizer x Learner|^p and normalize by the
        //  Density Sum

        for (int i = 0; i < iNumSample; ++i) {
            double dblDensity = distR1R1.density (adblX[i], adblY[i]);

            dblNormalizer += dblDensity;

            dblLoss += dblDensity * java.lang.Math.pow (java.lang.Math.abs (funcRegularizerR1ToR1.evaluate
                (adblX[i]) * funcR1ToR1.evaluate (adblX[i])), iPNorm);
        }

        return dblLoss / iPNorm / dblNormalizer;
    }
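    /*
     * Worked illustration of the structuralRisk arithmetic above (a sketch, not part of the original
     *  source): with R (x) = 1, f (x) = x, Nodes {(1., 1.), (2., 2.)}, Joint Densities {0.75, 0.25} at
     *  those Nodes, and an Output p-Norm of 2, the Risk is
     *  (0.75 * 1. + 0.25 * 4.) / 2 / (0.75 + 0.25) = 0.875.
     */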
}
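/*
 * Minimal usage sketch (illustrative only - the Construction of the Metric Vector Spaces and of the two
 *  R^1 -> R^1 Functions below is elided, since their Factories are not part of this file; only the
 *  Constructor and Method Signatures shown above are relied upon):
 *
 *  org.drip.spaces.metric.R1Continuous r1ContinuousInput = ...;  // Continuous R^1 Input Space
 *  org.drip.spaces.metric.R1Continuous r1ContinuousOutput = ...; // Continuous R^1 Output Space
 *  org.drip.function.definition.R1ToR1 funcRegularizer = ...;    // The Regularizer Function
 *  org.drip.function.definition.R1ToR1 funcLearner = ...;        // The Learner Function
 *
 *  RegularizerR1ContinuousToR1Continuous regularizer = new RegularizerR1ContinuousToR1Continuous (
 *      funcRegularizer,
 *      r1ContinuousInput,
 *      r1ContinuousOutput,
 *      0.5  // Non-negative Regularization Lambda
 *  );
 *
 *  double dblStructuralLoss = regularizer.structuralLoss (funcLearner, new double[] {1., 2., 3.});
 */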