GeneralizedLearner.java

package org.drip.learning.rxtor1;

/*
 * -*- mode: java; tab-width: 4; indent-tabs-mode: nil; c-basic-offset: 4 -*-
 */

/*!
 * Copyright (C) 2020 Lakshmi Krishnamurthy
 * Copyright (C) 2019 Lakshmi Krishnamurthy
 * Copyright (C) 2018 Lakshmi Krishnamurthy
 * Copyright (C) 2017 Lakshmi Krishnamurthy
 * Copyright (C) 2016 Lakshmi Krishnamurthy
 * Copyright (C) 2015 Lakshmi Krishnamurthy
 *
 *  This file is part of DROP, an open-source library targeting analytics/risk, transaction cost analytics,
 *      asset liability management analytics, capital, exposure, and margin analytics, valuation adjustment
 *      analytics, and portfolio construction analytics within and across fixed income, credit, commodity,
 *      equity, FX, and structured products. It also includes auxiliary libraries for algorithm support,
 *      numerical analysis, numerical optimization, spline builder, model validation, statistical learning,
 *      and computational support.
 *  
 *      https://lakshmidrip.github.io/DROP/
 *  
 *  DROP is composed of three modules:
 *  
 *  - DROP Product Core - https://lakshmidrip.github.io/DROP-Product-Core/
 *  - DROP Portfolio Core - https://lakshmidrip.github.io/DROP-Portfolio-Core/
 *  - DROP Computational Core - https://lakshmidrip.github.io/DROP-Computational-Core/
 *
 *  DROP Product Core implements libraries for the following:
 *  - Fixed Income Analytics
 *  - Loan Analytics
 *  - Transaction Cost Analytics
 *
 *  DROP Portfolio Core implements libraries for the following:
 *  - Asset Allocation Analytics
 *  - Asset Liability Management Analytics
 *  - Capital Estimation Analytics
 *  - Exposure Analytics
 *  - Margin Analytics
 *  - XVA Analytics
 *
 *  DROP Computational Core implements libraries for the following:
 *  - Algorithm Support
 *  - Computation Support
 *  - Function Analysis
 *  - Model Validation
 *  - Numerical Analysis
 *  - Numerical Optimizer
 *  - Spline Builder
 *  - Statistical Learning
 *
 *  Documentation for DROP is Spread Over:
 *
 *  - Main                     => https://lakshmidrip.github.io/DROP/
 *  - Wiki                     => https://github.com/lakshmiDRIP/DROP/wiki
 *  - GitHub                   => https://github.com/lakshmiDRIP/DROP
 *  - Repo Layout Taxonomy     => https://github.com/lakshmiDRIP/DROP/blob/master/Taxonomy.md
 *  - Javadoc                  => https://lakshmidrip.github.io/DROP/Javadoc/index.html
 *  - Technical Specifications => https://github.com/lakshmiDRIP/DROP/tree/master/Docs/Internal
 *  - Release Versions         => https://lakshmidrip.github.io/DROP/version.html
 *  - Community Credits        => https://lakshmidrip.github.io/DROP/credits.html
 *  - Issues Catalog           => https://github.com/lakshmiDRIP/DROP/issues
 *  - JUnit                    => https://lakshmidrip.github.io/DROP/junit/index.html
 *  - Jacoco                   => https://lakshmidrip.github.io/DROP/jacoco/index.html
 *
 *  Licensed under the Apache License, Version 2.0 (the "License");
 *      you may not use this file except in compliance with the License.
 *  
 *  You may obtain a copy of the License at
 *      http://www.apache.org/licenses/LICENSE-2.0
 *  
 *  Unless required by applicable law or agreed to in writing, software
 *      distributed under the License is distributed on an "AS IS" BASIS,
 *      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 *  
 *  See the License for the specific language governing permissions and
 *      limitations under the License.
 */

/**
 * <i>GeneralizedLearner</i> implements the Learner Class that holds the Space of Normed R<sup>x</sup> To
 * Normed R<sup>1</sup> Learning Functions along with their Custom Empirical Loss. Class-Specific Asymptotic
 * Sample, Covering Number based Upper Probability Bounds and other Parameters are also maintained.
 *  
 * <br><br>
 * The References are:
 *  
 * <br><br>
 * <ul>
 *  <li>
 *      Alon, N., S. Ben-David, N. Cesa-Bianchi, and D. Haussler (1997): Scale-sensitive Dimensions, Uniform
 *          Convergence, and Learnability <i>Journal of the Association for Computing Machinery</i> <b>44
 *          (4)</b> 615-631
 *  </li>
 *  <li>
 *      Anthony, M., and P. L. Bartlett (1999): <i>Artificial Neural Network Learning - Theoretical
 *          Foundations</i> <b>Cambridge University Press</b> Cambridge, UK
 *  </li>
 *  <li>
 *      Kearns, M. J., R. E. Schapire, and L. M. Sellie (1994): Towards Efficient Agnostic Learning
 *          <i>Machine Learning</i> <b>17 (2)</b> 115-141
 *  </li>
 *  <li>
 *      Lee, W. S., P. L. Bartlett, and R. C. Williamson (1998): The Importance of Convexity in Learning with
 *          Squared Loss <i>IEEE Transactions on Information Theory</i> <b>44</b> 1974-1980
 *  </li>
 *  <li>
 *      Vapnik, V. N. (1998): <i>Statistical Learning Theory</i> <b>Wiley</b> New York
 *  </li>
 * </ul>
 *
 *  <br><br>
 *  <ul>
 *      <li><b>Module </b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/ComputationalCore.md">Computational Core Module</a></li>
 *      <li><b>Library</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/StatisticalLearningLibrary.md">Statistical Learning</a></li>
 *      <li><b>Project</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/src/main/java/org/drip/learning">Agnostic Learning Bounds under Empirical Loss Minimization Schemes</a></li>
 *      <li><b>Package</b> = <a href = "https://github.com/lakshmiDRIP/DROP/tree/master/src/main/java/org/drip/learning/rxtor1">Statistical Learning Empirical Loss Penalizer</a></li>
 *  </ul>
 *
 * @author Lakshmi Krishnamurthy
 */

public abstract class GeneralizedLearner implements org.drip.learning.rxtor1.EmpiricalLearningMetricEstimator
{
    private org.drip.learning.bound.CoveringNumberLossBound _funcClassCNLB = null;
    private org.drip.spaces.functionclass.NormedRxToNormedR1Finite _funcClassRxToR1 = null;
    private org.drip.learning.regularization.RegularizationFunction _regularizerFunc = null;

    /**
     * GeneralizedLearner Constructor
     *
     * @param funcClassRxToR1 R^x To R^1 Function Class
     * @param funcClassCNLB The Function Class Covering Number based Deviation Upper Probability Bound
     *  Generator
     * @param regularizerFunc The Regularizer Function
     *
     * @throws java.lang.Exception Thrown if the Inputs are Invalid
     */

    public GeneralizedLearner (
        final org.drip.spaces.functionclass.NormedRxToNormedR1Finite funcClassRxToR1,
        final org.drip.learning.bound.CoveringNumberLossBound funcClassCNLB,
        final org.drip.learning.regularization.RegularizationFunction regularizerFunc)
        throws java.lang.Exception
    {
        if (null == (_funcClassRxToR1 = funcClassRxToR1) || null == (_funcClassCNLB = funcClassCNLB) || null
            == (_regularizerFunc = regularizerFunc))
            throw new java.lang.Exception ("GeneralizedLearner ctr: Invalid Inputs");
    }

    @Override public org.drip.spaces.functionclass.NormedRxToNormedR1Finite functionClass()
    {
        return _funcClassRxToR1;
    }

    @Override public org.drip.learning.regularization.RegularizationFunction regularizerFunction()
    {
        return _regularizerFunc;
    }

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumEmpiricalLoss (
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_EMPIRICAL_LOSS,
                    this, gvviY, null, null).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    @Override public double structuralLoss (
        final org.drip.function.definition.R1ToR1 funcLearnerR1ToR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvvi)
        throws java.lang.Exception
    {
        if (null == gvvi || !(gvvi instanceof org.drip.spaces.instance.ValidatedR1) ||
            !(_funcClassRxToR1 instanceof org.drip.spaces.functionclass.NormedR1ToNormedR1Finite))
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcRegularizerR1ToR1 = _regularizerFunc.r1Tor1();

        if (null == funcRegularizerR1ToR1)
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        org.drip.spaces.functionclass.NormedR1ToNormedR1Finite finiteClassR1ToR1 =
            (org.drip.spaces.functionclass.NormedR1ToNormedR1Finite) _funcClassRxToR1;

        org.drip.spaces.metric.GeneralizedMetricVectorSpace gmvsInput =
            finiteClassR1ToR1.inputMetricVectorSpace();

        if (!(gmvsInput instanceof org.drip.spaces.metric.R1Normed))
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        org.drip.spaces.metric.GeneralizedMetricVectorSpace gmvsOutput =
            finiteClassR1ToR1.outputMetricVectorSpace();

        if (!(gmvsOutput instanceof org.drip.spaces.metric.R1Continuous))
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        org.drip.learning.regularization.RegularizerR1ToR1 regularizerR1ToR1 =
            org.drip.learning.regularization.RegularizerBuilder.ToR1Continuous (funcRegularizerR1ToR1,
                (org.drip.spaces.metric.R1Normed) gmvsInput, (org.drip.spaces.metric.R1Continuous)
                    gmvsOutput, _regularizerFunc.lambda());

        if (null == regularizerR1ToR1)
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        return regularizerR1ToR1.structuralLoss (funcLearnerR1ToR1, ((org.drip.spaces.instance.ValidatedR1)
            gvvi).instance());
    }

    @Override public double structuralLoss (
        final org.drip.function.definition.RdToR1 funcLearnerRdToR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvvi)
        throws java.lang.Exception
    {
        if (null == gvvi || !(gvvi instanceof org.drip.spaces.instance.ValidatedRd) ||
            !(_funcClassRxToR1 instanceof org.drip.spaces.functionclass.NormedRdToNormedR1Finite))
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        org.drip.function.definition.RdToR1 funcRegularizerRdToR1 = _regularizerFunc.rdTor1();

        if (null == funcRegularizerRdToR1)
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        org.drip.spaces.functionclass.NormedRdToNormedR1Finite finiteClassRdToR1 =
            (org.drip.spaces.functionclass.NormedRdToNormedR1Finite) _funcClassRxToR1;

        org.drip.spaces.metric.GeneralizedMetricVectorSpace gmvsInput =
            finiteClassRdToR1.inputMetricVectorSpace();

        if (!(gmvsInput instanceof org.drip.spaces.metric.RdNormed))
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        org.drip.spaces.metric.GeneralizedMetricVectorSpace gmvsOutput =
            finiteClassRdToR1.outputMetricVectorSpace();

        if (!(gmvsOutput instanceof org.drip.spaces.metric.R1Continuous))
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        org.drip.learning.regularization.RegularizerRdToR1 regularizerRdToR1 =
            org.drip.learning.regularization.RegularizerBuilder.ToRdContinuous (funcRegularizerRdToR1,
                (org.drip.spaces.metric.RdNormed) gmvsInput, (org.drip.spaces.metric.R1Continuous)
                    gmvsOutput, _regularizerFunc.lambda());

        if (null == regularizerRdToR1)
            throw new java.lang.Exception ("GeneralizedLearner::structuralLoss => Invalid Inputs");

        return regularizerRdToR1.structuralLoss (funcLearnerRdToR1, ((org.drip.spaces.instance.ValidatedRd)
            gvvi).instance());
    }

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumStructuralLoss (
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_STRUCTURAL_LOSS,
                    this, null, null, null).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    @Override public double regularizedLoss (
        final org.drip.function.definition.R1ToR1 funcLearnerR1ToR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
        throws java.lang.Exception
    {
        return empiricalLoss (funcLearnerR1ToR1, gvviX, gvviY) + structuralLoss (funcLearnerR1ToR1, gvviX);
    }

    @Override public double regularizedLoss (
        final org.drip.function.definition.RdToR1 funcLearnerRdToR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
        throws java.lang.Exception
    {
        return empiricalLoss (funcLearnerRdToR1, gvviX, gvviY) + structuralLoss (funcLearnerRdToR1, gvviX);
    }
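The two regularizedLoss overloads reduce to the same decomposition: regularized loss = empirical loss + structural (regularizer) loss. A minimal self-contained sketch of that decomposition, using a hypothetical one-parameter linear learner with mean squared empirical loss and an L2 structural term; the names and the concrete loss forms are illustrative assumptions, not the DROP API:

```java
// Sketch of the regularized-loss decomposition:
// regularized loss = empirical loss + structural (regularizer) loss.
// The linear learner f(x) = w * x, the squared empirical loss, and the
// L2 structural term lambda * w^2 are illustrative, not DROP's implementation.
public class RegularizedLossSketch {
    static double empiricalLoss (final double w, final double[] x, final double[] y) {
        double loss = 0.;
        for (int i = 0; i < x.length; ++i) {
            double error = w * x[i] - y[i];
            loss += error * error;
        }
        return loss / x.length;   // mean squared empirical loss
    }

    static double structuralLoss (final double w, final double lambda) {
        return lambda * w * w;    // L2 regularization penalty
    }

    static double regularizedLoss (final double w, final double lambda, final double[] x,
        final double[] y) {
        return empiricalLoss (w, x, y) + structuralLoss (w, lambda);
    }

    public static void main (final String[] args) {
        double[] x = {1., 2., 3.};
        double[] y = {1., 2., 3.};
        // w = 1 fits exactly, so the empirical term vanishes and only the penalty remains
        System.out.println (regularizedLoss (1., 0.1, x, y));   // prints 0.1
    }
}
```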

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumRegularizedLoss (
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_REGULARIZED_LOSS,
                this, gvviY, null, null).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumEmpiricalRisk (
        final org.drip.measure.continuous.R1R1 distR1R1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_EMPIRICAL_RISK,
                    this, gvviY, distR1R1, null).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumEmpiricalRisk (
        final org.drip.measure.continuous.RdR1 distRdR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_EMPIRICAL_RISK,
                    this, gvviY, null, distRdR1).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    @Override public double structuralRisk (
        final org.drip.measure.continuous.R1R1 distR1R1,
        final org.drip.function.definition.R1ToR1 funcLearnerR1ToR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
        throws java.lang.Exception
    {
        if (null == distR1R1 || null == gvviX || null == gvviY || !(gvviX instanceof
            org.drip.spaces.instance.ValidatedR1) || !(gvviY instanceof
                org.drip.spaces.instance.ValidatedR1) || !(_funcClassRxToR1 instanceof
                    org.drip.spaces.functionclass.NormedR1ToNormedR1Finite))
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcRegularizerR1ToR1 = _regularizerFunc.r1Tor1();

        if (null == funcRegularizerR1ToR1)
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        org.drip.spaces.functionclass.NormedR1ToNormedR1Finite finiteClassR1ToR1 =
            (org.drip.spaces.functionclass.NormedR1ToNormedR1Finite) _funcClassRxToR1;

        org.drip.spaces.metric.GeneralizedMetricVectorSpace gmvsInput =
            finiteClassR1ToR1.inputMetricVectorSpace();

        if (!(gmvsInput instanceof org.drip.spaces.metric.R1Normed))
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        org.drip.spaces.metric.GeneralizedMetricVectorSpace gmvsOutput =
            finiteClassR1ToR1.outputMetricVectorSpace();

        if (!(gmvsOutput instanceof org.drip.spaces.metric.R1Continuous))
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        org.drip.learning.regularization.RegularizerR1ToR1 regularizerR1ToR1 =
            org.drip.learning.regularization.RegularizerBuilder.ToR1Continuous (funcRegularizerR1ToR1,
                (org.drip.spaces.metric.R1Normed) gmvsInput, (org.drip.spaces.metric.R1Continuous)
                    gmvsOutput, _regularizerFunc.lambda());

        if (null == regularizerR1ToR1)
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        return regularizerR1ToR1.structuralRisk (distR1R1, funcLearnerR1ToR1,
            ((org.drip.spaces.instance.ValidatedR1) gvviX).instance(),
                ((org.drip.spaces.instance.ValidatedR1) gvviY).instance());
    }

    @Override public double structuralRisk (
        final org.drip.measure.continuous.RdR1 distRdR1,
        final org.drip.function.definition.RdToR1 funcLearnerRdToR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
        throws java.lang.Exception
    {
        if (null == distRdR1 || null == gvviX || null == gvviY || !(gvviX instanceof
            org.drip.spaces.instance.ValidatedRd) || !(gvviY instanceof
                org.drip.spaces.instance.ValidatedR1) || !(_funcClassRxToR1 instanceof
                    org.drip.spaces.functionclass.NormedRdToNormedR1Finite))
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        org.drip.function.definition.RdToR1 funcRegularizerRdToR1 = _regularizerFunc.rdTor1();

        if (null == funcRegularizerRdToR1)
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        org.drip.spaces.functionclass.NormedRdToNormedR1Finite finiteClassRdToR1 =
            (org.drip.spaces.functionclass.NormedRdToNormedR1Finite) _funcClassRxToR1;

        org.drip.spaces.metric.GeneralizedMetricVectorSpace gmvsInput =
            finiteClassRdToR1.inputMetricVectorSpace();

        if (!(gmvsInput instanceof org.drip.spaces.metric.RdNormed))
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        org.drip.spaces.metric.GeneralizedMetricVectorSpace gmvsOutput =
            finiteClassRdToR1.outputMetricVectorSpace();

        if (!(gmvsOutput instanceof org.drip.spaces.metric.R1Continuous))
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        org.drip.learning.regularization.RegularizerRdToR1 regularizerRdToR1 =
            org.drip.learning.regularization.RegularizerBuilder.ToRdContinuous (funcRegularizerRdToR1,
                (org.drip.spaces.metric.RdNormed) gmvsInput, (org.drip.spaces.metric.R1Continuous)
                    gmvsOutput, _regularizerFunc.lambda());

        if (null == regularizerRdToR1)
            throw new java.lang.Exception ("GeneralizedLearner::structuralRisk => Invalid Inputs");

        return regularizerRdToR1.structuralRisk (distRdR1, funcLearnerRdToR1,
            ((org.drip.spaces.instance.ValidatedRd) gvviX).instance(),
                ((org.drip.spaces.instance.ValidatedR1) gvviY).instance());
    }

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumStructuralRisk (
        final org.drip.measure.continuous.R1R1 distR1R1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_STRUCTURAL_RISK,
                    this, gvviY, distR1R1, null).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumStructuralRisk (
        final org.drip.measure.continuous.RdR1 distRdR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_STRUCTURAL_RISK,
                    this, gvviY, null, distRdR1).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    @Override public double regularizedRisk (
        final org.drip.measure.continuous.R1R1 distR1R1,
        final org.drip.function.definition.R1ToR1 funcLearnerR1ToR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
        throws java.lang.Exception
    {
        return empiricalRisk (distR1R1, funcLearnerR1ToR1, gvviX, gvviY) + structuralRisk (distR1R1,
            funcLearnerR1ToR1, gvviX, gvviY);
    }

    @Override public double regularizedRisk (
        final org.drip.measure.continuous.RdR1 distRdR1,
        final org.drip.function.definition.RdToR1 funcLearnerRdToR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
        throws java.lang.Exception
    {
        return empiricalRisk (distRdR1, funcLearnerRdToR1, gvviX, gvviY) + structuralRisk (distRdR1,
            funcLearnerRdToR1, gvviX, gvviY);
    }

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumRegularizedRisk (
        final org.drip.measure.continuous.R1R1 distR1R1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_REGULARIZED_RISK,
                this, gvviY, distR1R1, null).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    @Override public org.drip.learning.rxtor1.EmpiricalPenaltySupremum supremumRegularizedRisk (
        final org.drip.measure.continuous.RdR1 distRdR1,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviX,
        final org.drip.spaces.instance.GeneralizedValidatedVector gvviY)
    {
        try {
            return new org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator
                (org.drip.learning.rxtor1.EmpiricalPenaltySupremumEstimator.SUPREMUM_PENALTY_REGULARIZED_RISK,
                this, gvviY, null, distRdR1).supremum (gvviX);
        } catch (java.lang.Exception e) {
            e.printStackTrace();
        }

        return null;
    }

    /**
     * Retrieve the Covering Number based Deviation Upper Probability Bound Generator
     *
     * @return The Covering Number based Deviation Upper Probability Bound Generator
     */

    public org.drip.learning.bound.CoveringNumberLossBound coveringLossBoundEvaluator()
    {
        return _funcClassCNLB;
    }

    /**
     * Compute the Upper Bound of the Probability of the Absolute Deviation of the Empirical Mean from the
     *  Population Mean using the Function Class Supremum Covering Number for General-Purpose Learning
     *
     * @param iSampleSize The Sample Size
     * @param dblEpsilon The Deviation of the Empirical Mean from the Population Mean
     * @param bSupremum TRUE To Use the Supremum Metric in place of the Built-in Metric
     *
     * @return The Upper Bound of the Probability of the Absolute Deviation of the Empirical Mean from the
     *  Population Mean using the Function Class Supremum Covering Number for General-Purpose Learning
     *
     * @throws java.lang.Exception Thrown if the Upper Probability Bound cannot be computed
     */

    public double genericCoveringProbabilityBound (
        final int iSampleSize,
        final double dblEpsilon,
        final boolean bSupremum)
        throws java.lang.Exception
    {
        return _funcClassCNLB.deviationProbabilityUpperBound (iSampleSize, dblEpsilon) * (bSupremum ?
            _funcClassRxToR1.populationSupremumCoveringNumber (dblEpsilon) :
                _funcClassRxToR1.populationCoveringNumber (dblEpsilon));
    }
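The method above forms the bound as a product: the covering number of the function class times a per-function deviation probability bound, i.e. a union bound over an epsilon-cover. A self-contained sketch of that product, assuming a Hoeffding-type tail 2 * exp(-2 * n * eps^2) for losses in [0, 1] and a hypothetical covering number ceil(1 / eps); both are stand-ins for the CoveringNumberLossBound and NormedRxToNormedR1Finite inputs, not their actual implementations:

```java
// Sketch of a covering-number based deviation probability bound:
// P[sup |empirical mean - population mean| > eps] <= N(eps) * perFunctionBound(n, eps).
// The Hoeffding-type tail (losses in [0, 1]) and the cover size ceil(1 / eps)
// are illustrative assumptions, not DROP's.
public class CoveringBoundSketch {
    static double perFunctionDeviationBound (final int sampleSize, final double epsilon) {
        return 2. * Math.exp (-2. * sampleSize * epsilon * epsilon);   // Hoeffding-type tail
    }

    static double coveringNumber (final double epsilon) {
        return Math.ceil (1. / epsilon);   // hypothetical epsilon-cover size of the class
    }

    static double coveringProbabilityBound (final int sampleSize, final double epsilon) {
        // Union bound over the epsilon-cover of the loss class
        return coveringNumber (epsilon) * perFunctionDeviationBound (sampleSize, epsilon);
    }

    public static void main (final String[] args) {
        // The bound tightens as the sample grows
        System.out.println (coveringProbabilityBound (100, 0.1));
        System.out.println (coveringProbabilityBound (1000, 0.1));
    }
}
```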

    /**
     * Compute the Minimum Possible Sample Size needed to generate the required Upper Probability Bound for
     *  the Specified Empirical Deviation using the Covering Number Convergence Bounds.
     *  
     * @param dblEpsilon The Deviation of the Empirical Mean from the Population Mean
     * @param dblDeviationUpperProbabilityBound The Upper Bound of the Probability for the given Deviation
     * @param bSupremum TRUE To Use the Supremum Metric in place of the Built-in Metric
     *
     * @return The Minimum Possible Sample Size
     *
     * @throws java.lang.Exception Thrown if the Minimum Sample Size cannot be computed
     */

    public double genericCoveringSampleSize (
        final double dblEpsilon,
        final double dblDeviationUpperProbabilityBound,
        final boolean bSupremum)
        throws java.lang.Exception
    {
        if (!org.drip.numerical.common.NumberUtil.IsValid (dblEpsilon) ||
            !org.drip.numerical.common.NumberUtil.IsValid (dblDeviationUpperProbabilityBound))
            throw new java.lang.Exception
                ("GeneralizedLearner::genericCoveringSampleSize => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcDeviationUpperProbabilityBound = new
            org.drip.function.definition.R1ToR1 (null) {
            @Override public double evaluate (
                final double dblSampleSize)
                throws java.lang.Exception
            {
                return genericCoveringProbabilityBound ((int) dblSampleSize, dblEpsilon, bSupremum);
            }
        };

        org.drip.function.r1tor1solver.FixedPointFinderOutput fpfo = new
            org.drip.function.r1tor1solver.FixedPointFinderZheng (dblDeviationUpperProbabilityBound,
                funcDeviationUpperProbabilityBound, false).findRoot();

        if (null == fpfo || !fpfo.containsRoot())
            throw new java.lang.Exception
                ("GeneralizedLearner::genericCoveringSampleSize => Cannot Estimate Minimal Sample Size");

        return fpfo.getRoot();
    }
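genericCoveringSampleSize inverts the deviation bound in the sample size via a root finder (FixedPointFinderZheng above). Since the bound is monotone decreasing in the sample size, a plain bisection illustrates the same inversion; the Hoeffding-type bound form here is an assumption carried over for illustration, not DROP's CoveringNumberLossBound:

```java
// Sketch of inverting a covering-number deviation bound for the minimal sample size:
// find the smallest n with bound(n, eps) <= targetProbability.
// The bound form N(eps) * 2 * exp(-2 * n * eps^2) is an illustrative assumption.
public class CoveringSampleSizeSketch {
    static double bound (final double sampleSize, final double epsilon) {
        return Math.ceil (1. / epsilon) * 2. * Math.exp (-2. * sampleSize * epsilon * epsilon);
    }

    // Smallest sample size (within tolerance) meeting the target upper probability bound.
    // The bound is monotone decreasing in the sample size, so bisection applies.
    static double minimumSampleSize (final double epsilon, final double targetProbability) {
        double left = 1.;
        double right = 1.;

        while (bound (right, epsilon) > targetProbability) right *= 2.;   // bracket the crossing

        while (right - left > 0.5) {
            double mid = 0.5 * (left + right);

            if (bound (mid, epsilon) > targetProbability)
                left = mid;
            else
                right = mid;
        }

        return right;   // satisfies bound (right, epsilon) <= targetProbability
    }

    public static void main (final String[] args) {
        // Minimal sample size for a 0.1 deviation at a 5% upper probability bound
        System.out.println (minimumSampleSize (0.1, 0.05));
    }
}
```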

    /**
     * Compute the Sample/Data Dependent Upper Bound of the Probability of the Absolute Deviation between
     *  the Empirical and the Population Means using the Function Class Sample Covering Number for
     *  General-Purpose Learning
     *
     * @param gvvi The Validated Instance Vector Sequence
     * @param iSampleSize The Sample Size
     * @param dblEpsilon The Deviation of the Empirical Mean from the Population Mean
     * @param bSupremum TRUE To Use the Supremum Metric in place of the Built-in Metric
     *
     * @return The Sample/Data Dependent Upper Bound of the Probability of the Absolute Deviation between
     *  the Empirical and the Population Means using the Function Class Sample Covering Number for
     *  General-Purpose Learning
     *
     * @throws java.lang.Exception Thrown if the Upper Probability Bound cannot be computed
     */

    public double genericCoveringProbabilityBound (
        final org.drip.spaces.instance.GeneralizedValidatedVector gvvi,
        final int iSampleSize,
        final double dblEpsilon,
        final boolean bSupremum)
        throws java.lang.Exception
    {
        if (null == gvvi)
            throw new java.lang.Exception
                ("GeneralizedLearner::genericCoveringProbabilityBound => Invalid Inputs");

        return _funcClassCNLB.deviationProbabilityUpperBound (iSampleSize, dblEpsilon) *
            lossSampleCoveringNumber (gvvi, dblEpsilon, bSupremum);
    }

    /**
     * Compute the Minimum Possible Sample Size needed to generate the required Upper Probability Bound for
     *  the Specified Empirical Deviation using the Covering Number Convergence Bounds.
     *
     * @param gvvi The Validated Instance Vector Sequence
     * @param dblEpsilon The Deviation of the Empirical Mean from the Population Mean
     * @param dblDeviationUpperProbabilityBound The Upper Bound of the Probability for the given Deviation
     * @param bSupremum TRUE To Use the Supremum Metric in place of the Built-in Metric
     *
     * @return The Minimum Possible Sample Size
     *
     * @throws java.lang.Exception Thrown if the Minimum Sample Size cannot be computed
     */

    public double genericCoveringSampleSize (
        final org.drip.spaces.instance.GeneralizedValidatedVector gvvi,
        final double dblEpsilon,
        final double dblDeviationUpperProbabilityBound,
        final boolean bSupremum)
        throws java.lang.Exception
    {
        if (null == gvvi || !org.drip.numerical.common.NumberUtil.IsValid (dblEpsilon) ||
            !org.drip.numerical.common.NumberUtil.IsValid (dblDeviationUpperProbabilityBound))
            throw new java.lang.Exception
                ("GeneralizedLearner::genericCoveringSampleSize => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcDeviationUpperProbabilityBound = new
            org.drip.function.definition.R1ToR1 (null) {
            @Override public double evaluate (
                final double dblSampleSize)
                throws java.lang.Exception
            {
                return genericCoveringProbabilityBound (gvvi, (int) dblSampleSize, dblEpsilon, bSupremum);
            }
        };

        org.drip.function.r1tor1solver.FixedPointFinderOutput fpfo = new
            org.drip.function.r1tor1solver.FixedPointFinderZheng (dblDeviationUpperProbabilityBound,
                funcDeviationUpperProbabilityBound, false).findRoot();

        if (null == fpfo || !fpfo.containsRoot())
            throw new java.lang.Exception
                ("GeneralizedLearner::genericCoveringSampleSize => Cannot Estimate Minimal Sample Size");

        return fpfo.getRoot();
    }
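
    /*
     * Usage sketch (illustrative only; `learner` denotes an assumed concrete GeneralizedLearner
     * implementation, and `gvvi` a previously constructed validated instance vector): the call
     * inverts genericCoveringProbabilityBound, i.e., it solves for the smallest sample size at
     * which the deviation probability bound drops to the requested level.
     *
     *     double dblMinimalSampleSize = learner.genericCoveringSampleSize (
     *         gvvi,
     *         0.05,   // dblEpsilon: tolerated empirical-vs-population mean deviation
     *         0.01,   // dblDeviationUpperProbabilityBound: target tail probability
     *         true    // bSupremum: use the supremum metric
     *     );
     */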

    /**
     * Compute the Upper Bound of the Probability of the Absolute Deviation between the Empirical and the
     *  Population Means using the Function Class Supremum Covering Number for Regression Learning
     *
     * @param iSampleSize The Sample Size
     * @param dblEpsilon The Deviation of the Empirical Mean from the Population Mean
     * @param bSupremum TRUE To Use the Supremum Metric in place of the Built-in Metric
     *
     * @return The Upper Bound of the Probability of the Absolute Deviation between the Empirical and the
     *  Population Means using the Function Class Supremum Covering Number for Regression Learning
     *
     * @throws java.lang.Exception Thrown if the Upper Probability Bound cannot be computed
     */

    public double regressorCoveringProbabilityBound (
        final int iSampleSize,
        final double dblEpsilon,
        final boolean bSupremum)
        throws java.lang.Exception
    {
        if (!org.drip.numerical.common.NumberUtil.IsValid (dblEpsilon) || 0. >= dblEpsilon ||
            iSampleSize < (2. / (dblEpsilon * dblEpsilon)))
            throw new java.lang.Exception
                ("GeneralizedLearner::regressorCoveringProbabilityBound => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcSampleCoefficient = new
            org.drip.function.definition.R1ToR1 (null) {
            @Override public double evaluate (
                final double dblSampleSize)
                throws java.lang.Exception
            {
                return 12. * dblSampleSize;
            }
        };

        return (new org.drip.learning.bound.CoveringNumberLossBound (funcSampleCoefficient, 2.,
            36.)).deviationProbabilityUpperBound (iSampleSize, dblEpsilon) * (bSupremum ?
                _funcClassRxToR1.populationSupremumCoveringNumber (dblEpsilon / 6.) :
                    _funcClassRxToR1.populationCoveringNumber (dblEpsilon / 6.));
    }

    /**
     * Compute the Minimum Possible Sample Size needed to generate the required Upper Probability Bound for
     *  the Specified Empirical Deviation using the Covering Number Convergence Bounds for Regression
     *  Learning.
     *
     * @param dblEpsilon The Deviation of the Empirical Mean from the Population Mean
     * @param dblDeviationUpperProbabilityBound The Upper Bound of the Probability for the given Deviation
     * @param bSupremum TRUE To Use the Supremum Metric in place of the Built-in Metric
     *
     * @return The Minimum Possible Sample Size
     *
     * @throws java.lang.Exception Thrown if the Minimum Sample Size cannot be computed
     */

    public double regressorCoveringSampleSize (
        final double dblEpsilon,
        final double dblDeviationUpperProbabilityBound,
        final boolean bSupremum)
        throws java.lang.Exception
    {
        if (!org.drip.numerical.common.NumberUtil.IsValid (dblEpsilon) ||
            !org.drip.numerical.common.NumberUtil.IsValid (dblDeviationUpperProbabilityBound))
            throw new java.lang.Exception
                ("GeneralizedLearner::regressorCoveringSampleSize => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcDeviationUpperProbabilityBound = new
            org.drip.function.definition.R1ToR1 (null) {
            @Override public double evaluate (
                final double dblSampleSize)
                throws java.lang.Exception
            {
                return regressorCoveringProbabilityBound ((int) dblSampleSize, dblEpsilon, bSupremum);
            }
        };

        org.drip.function.r1tor1solver.FixedPointFinderOutput fpfo = new
            org.drip.function.r1tor1solver.FixedPointFinderZheng (dblDeviationUpperProbabilityBound,
                funcDeviationUpperProbabilityBound, false).findRoot();

        if (null == fpfo || !fpfo.containsRoot())
            throw new java.lang.Exception
                ("GeneralizedLearner::regressorCoveringSampleSize => Cannot Estimate Minimal Sample Size");

        return fpfo.getRoot();
    }
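
    /*
     * Usage sketch (illustrative only; `learner` denotes an assumed concrete GeneralizedLearner
     * implementation): the regression-learning variant inverts regressorCoveringProbabilityBound,
     * so the returned sample size implicitly honors that bound's iSampleSize >= 2 / (epsilon^2)
     * validity restriction.
     *
     *     double dblRegressorSampleSize = learner.regressorCoveringSampleSize (
     *         0.05,   // dblEpsilon: tolerated empirical-vs-population mean deviation
     *         0.01,   // dblDeviationUpperProbabilityBound: target tail probability
     *         false   // bSupremum: use the built-in metric
     *     );
     */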

    /**
     * Compute the Sample/Data Dependent Upper Bound of the Probability of the Absolute Deviation between
     *  the Empirical and the Population Means using the Function Class Sample Covering Number for
     *  Regression Learning
     *
     * @param gvvi The Validated Instance Vector Sequence
     * @param iSampleSize The Sample Size
     * @param dblEpsilon The Deviation of the Empirical Mean from the Population Mean
     * @param bSupremum TRUE To Use the Supremum Metric in place of the Built-in Metric
     *
     * @return The Sample/Data Dependent Upper Bound of the Probability of the Absolute Deviation between
     *  the Empirical and the Population Means using the Function Class Sample Covering Number for
     *  Regression Learning
     *
     * @throws java.lang.Exception Thrown if the Upper Probability Bound cannot be computed
     */

    public double regressorCoveringProbabilityBound (
        final org.drip.spaces.instance.GeneralizedValidatedVector gvvi,
        final int iSampleSize,
        final double dblEpsilon,
        final boolean bSupremum)
        throws java.lang.Exception
    {
        if (null == gvvi || !org.drip.numerical.common.NumberUtil.IsValid (dblEpsilon) ||
            0. >= dblEpsilon || iSampleSize < (2. / (dblEpsilon * dblEpsilon)))
            throw new java.lang.Exception
                ("GeneralizedLearner::regressorCoveringProbabilityBound => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcSampleCoefficient = new
            org.drip.function.definition.R1ToR1 (null) {
            @Override public double evaluate (
                final double dblSampleSize)
                throws java.lang.Exception
            {
                return 12. * dblSampleSize;
            }
        };

        return (new org.drip.learning.bound.CoveringNumberLossBound (funcSampleCoefficient, 2.,
            36.)).deviationProbabilityUpperBound (iSampleSize, dblEpsilon) * lossSampleCoveringNumber (gvvi,
                dblEpsilon / 6., bSupremum);
    }

    /**
     * Compute the Minimum Possible Sample Size needed to generate the required Upper Probability Bound for
     *  the Specified Empirical Deviation using the Covering Number Convergence Bounds for Regression
     *  Learning.
     *
     * @param gvvi The Validated Instance Vector Sequence
     * @param dblEpsilon The Deviation of the Empirical Mean from the Population Mean
     * @param dblDeviationUpperProbabilityBound The Upper Bound of the Probability for the given Deviation
     * @param bSupremum TRUE To Use the Supremum Metric in place of the Built-in Metric
     *
     * @return The Minimum Possible Sample Size
     *
     * @throws java.lang.Exception Thrown if the Minimum Sample Size cannot be computed
     */

    public double regressorCoveringSampleSize (
        final org.drip.spaces.instance.GeneralizedValidatedVector gvvi,
        final double dblEpsilon,
        final double dblDeviationUpperProbabilityBound,
        final boolean bSupremum)
        throws java.lang.Exception
    {
        if (null == gvvi || !org.drip.numerical.common.NumberUtil.IsValid (dblEpsilon) ||
            !org.drip.numerical.common.NumberUtil.IsValid (dblDeviationUpperProbabilityBound))
            throw new java.lang.Exception
                ("GeneralizedLearner::regressorCoveringSampleSize => Invalid Inputs");

        org.drip.function.definition.R1ToR1 funcDeviationUpperProbabilityBound = new
            org.drip.function.definition.R1ToR1 (null) {
            @Override public double evaluate (
                final double dblSampleSize)
                throws java.lang.Exception
            {
                return regressorCoveringProbabilityBound (gvvi, (int) dblSampleSize, dblEpsilon, bSupremum);
            }
        };

        org.drip.function.r1tor1solver.FixedPointFinderOutput fpfo = new
            org.drip.function.r1tor1solver.FixedPointFinderZheng (dblDeviationUpperProbabilityBound,
                funcDeviationUpperProbabilityBound, false).findRoot();

        if (null == fpfo || !fpfo.containsRoot())
            throw new java.lang.Exception
                ("GeneralizedLearner::regressorCoveringSampleSize => Cannot Estimate Minimal Sample Size");

        return fpfo.getRoot();
    }
}