Package org.drip.learning.rxtor1
Class ApproximateLipschitzLossLearner
java.lang.Object
org.drip.learning.rxtor1.GeneralizedLearner
org.drip.learning.rxtor1.LipschitzLossLearner
org.drip.learning.rxtor1.ApproximateLipschitzLossLearner
- All Implemented Interfaces:
EmpiricalLearningMetricEstimator
public class ApproximateLipschitzLossLearner extends LipschitzLossLearner
ApproximateLipschitzLossLearner implements the Learner Class that holds the Space of Normed
R^d To Normed R^1 Learning Functions for the Family of Loss Functions that are
"approximately" Lipschitz, i.e.,
|loss (ep) - loss (ep')| <= max (C * |ep - ep'|, C')
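The approximate-Lipschitz condition above can be checked numerically. The following is a minimal, self-contained sketch (not part of DROP); the clipped squared loss, the slope C = 2, and the floor C' = 0 are assumptions chosen for illustration.

```java
// Hypothetical sketch (not DROP code): numerically checks the "approximately
// Lipschitz" condition |loss(ep) - loss(ep')| <= max(C * |ep - ep'|, C')
// for an assumed sample loss function, slope C, and floor C'.
public class ApproxLipschitzCheck {
    // Assumed stand-in loss: squared error clipped at 1.0
    static double loss(double ep) {
        return Math.min(ep * ep, 1.0);
    }

    // Returns true if the pair (ep, epPrime) satisfies the approximate
    // Lipschitz bound with the given slope C and floor C'
    static boolean satisfiesBound(double ep, double epPrime, double slope, double floor) {
        double lhs = Math.abs(loss(ep) - loss(epPrime));
        double rhs = Math.max(slope * Math.abs(ep - epPrime), floor);
        return lhs <= rhs;
    }

    public static void main(String[] args) {
        // On [-1, 1], |ep^2 - ep'^2| = |ep - ep'| * |ep + ep'| <= 2 * |ep - ep'|,
        // so slope C = 2 with floor C' = 0 satisfies the condition everywhere
        boolean ok = true;
        for (double e = -1.0; e <= 1.0; e += 0.01)
            for (double f = -1.0; f <= 1.0; f += 0.01)
                ok &= satisfiesBound(e, f, 2.0, 0.0);
        System.out.println(ok ? "bound holds" : "bound violated");
    }
}
```

A strictly positive floor C' matters for losses with jumps: the bound then tolerates a fixed deviation even where no finite slope C applies.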
The References are:
- Alon, N., S. Ben-David, N. Cesa-Bianchi, and D. Haussler (1997): Scale-sensitive Dimensions, Uniform Convergence, and Learnability, Journal of the Association for Computing Machinery 44 (4) 615-631
- Anthony, M., and P. L. Bartlett (1999): Artificial Neural Network Learning: Theoretical Foundations, Cambridge University Press, Cambridge, UK
- Kearns, M. J., R. E. Schapire, and L. M. Sellie (1994): Towards Efficient Agnostic Learning, Machine Learning 17 (2) 115-141
- Lee, W. S., P. L. Bartlett, and R. C. Williamson (1998): The Importance of Convexity in Learning with Squared Loss, IEEE Transactions on Information Theory 44 1974-1980
- Vapnik, V. N. (1998): Statistical Learning Theory, Wiley, New York
- Module = Computational Core Module
- Library = Statistical Learning
- Project = Agnostic Learning Bounds under Empirical Loss Minimization Schemes
- Package = Statistical Learning Empirical Loss Penalizer
- Author:
- Lakshmi Krishnamurthy
Constructor Summary
ApproximateLipschitzLossLearner(NormedRxToNormedR1Finite funcClassRxToR1, CoveringNumberLossBound cdpb, RegularizationFunction regularizerFunc, double dblLipschitzSlope, double dblLipschitzFloor)
ApproximateLipschitzLossLearner Constructor
Method Summary
double lipschitzFloor()
Retrieve the Lipschitz Floor
double lossSampleCoveringNumber(GeneralizedValidatedVector gvvi, double dblEpsilon, boolean bSupremum)
Retrieve the Loss Class Sample Covering Number - L-Infinity or L-p Based
Methods inherited from class org.drip.learning.rxtor1.LipschitzLossLearner
empiricalLoss, empiricalLoss, empiricalRisk, empiricalRisk, lipschitzSlope
Methods inherited from class org.drip.learning.rxtor1.GeneralizedLearner
coveringLossBoundEvaluator, functionClass, genericCoveringProbabilityBound, genericCoveringProbabilityBound, genericCoveringSampleSize, genericCoveringSampleSize, regressorCoveringProbabilityBound, regressorCoveringProbabilityBound, regressorCoveringSampleSize, regressorCoveringSampleSize, regularizedLoss, regularizedLoss, regularizedRisk, regularizedRisk, regularizerFunction, structuralLoss, structuralLoss, structuralRisk, structuralRisk, supremumEmpiricalLoss, supremumEmpiricalRisk, supremumEmpiricalRisk, supremumRegularizedLoss, supremumRegularizedRisk, supremumRegularizedRisk, supremumStructuralLoss, supremumStructuralRisk, supremumStructuralRisk
Methods inherited from class java.lang.Object
equals, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Constructor Details
ApproximateLipschitzLossLearner
public ApproximateLipschitzLossLearner(NormedRxToNormedR1Finite funcClassRxToR1, CoveringNumberLossBound cdpb, RegularizationFunction regularizerFunc, double dblLipschitzSlope, double dblLipschitzFloor) throws java.lang.Exception
ApproximateLipschitzLossLearner Constructor
Parameters:
funcClassRxToR1 - The R^x To R^1 Function Class
cdpb - The Covering Number based Deviation Upper Probability Bound Generator
regularizerFunc - The Regularizer Function
dblLipschitzSlope - The Lipschitz Slope Bound
dblLipschitzFloor - The Lipschitz Floor Bound
Throws:
java.lang.Exception - Thrown if the Inputs are Invalid
Method Details
lipschitzFloor
public double lipschitzFloor()
Retrieve the Lipschitz Floor
Returns:
The Lipschitz Floor
lossSampleCoveringNumber
public double lossSampleCoveringNumber(GeneralizedValidatedVector gvvi, double dblEpsilon, boolean bSupremum) throws java.lang.Exception
Description copied from interface: EmpiricalLearningMetricEstimator
Retrieve the Loss Class Sample Covering Number - L-Infinity or L-p Based
Specified by:
lossSampleCoveringNumber in interface EmpiricalLearningMetricEstimator
Overrides:
lossSampleCoveringNumber in class LipschitzLossLearner
Parameters:
gvvi - The Validated Instance Vector Sequence
dblEpsilon - The Deviation of the Empirical Mean from the Population Mean
bSupremum - TRUE To Use the Supremum Metric in place of the Built-in Metric
Returns:
The Loss Class Sample Covering Number - L-Infinity or L-p Based
Throws:
java.lang.Exception - Thrown if the Inputs are Invalid
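The covering-number idea behind lossSampleCoveringNumber can be illustrated on a toy case. The sketch below is not DROP's formula; it only shows what a covering number measures, using the standard fact that an interval [a, b] under the absolute-value metric needs ceil((b - a) / (2 * epsilon)) epsilon-balls.

```java
// Self-contained illustration (not DROP's implementation): the covering number
// of a bounded set is the minimum count of epsilon-balls needed to cover it.
// For [a, b] under |.|, ceil((b - a) / (2 * epsilon)) radius-epsilon intervals
// suffice and are necessary, so log-covering numbers grow like log(1 / epsilon).
public class IntervalCoveringNumber {
    // Minimum number of closed radius-epsilon intervals covering [a, b]
    static int coveringNumber(double a, double b, double epsilon) {
        if (b <= a || epsilon <= 0.0)
            throw new IllegalArgumentException("need a < b and epsilon > 0");
        return (int) Math.ceil((b - a) / (2.0 * epsilon));
    }

    public static void main(String[] args) {
        // Halving epsilon roughly doubles the covering number
        System.out.println(coveringNumber(0.0, 1.0, 0.1));  // 5
        System.out.println(coveringNumber(0.0, 1.0, 0.05)); // 10
    }
}
```

Shrinking the deviation dblEpsilon in lossSampleCoveringNumber plays the same role as shrinking epsilon here: finer resolution requires a larger cover, which in turn drives the deviation probability bounds of the covering-number generator.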