| Constructor and Description |
|---|
| `PredictEvalTask()`: Create a new eval task. |
| Modifier and Type | Method and Description |
|---|---|
| `void` | `addMetric(PredictMetric<?> metric)`: Add a prediction metric. |
| `ConditionEvaluator` | `createConditionEvaluator(AlgorithmInstance algorithm, DataSet dataSet, RecommenderEngine engine)`: Set up a measurement of a single recommender. |
| `void` | `finish()`: Finalize this eval task. |
| `static PredictEvalTask` | `fromJSON(com.fasterxml.jackson.databind.JsonNode json, java.net.URI base)`: Create a predict eval task from a JSON/YAML file. |
| `java.util.List<Metric<?>>` | `getAllMetrics()`: Get the list of all metrics. |
| `java.util.List<java.lang.String>` | `getGlobalColumns()`: Get columns that will go in the aggregate output file. |
| `java.nio.file.Path` | `getOutputFile()`: Get the output file for writing predictions. |
| `java.util.List<PredictMetric<?>>` | `getPredictMetrics()`: Get the list of prediction metrics. |
| `java.util.Set<java.lang.Class<?>>` | `getRequiredRoots()`: Get the root types required by this evaluation. |
| `java.util.List<java.lang.String>` | `getUserColumns()`: Get columns that will go in the per-user output file. |
| `void` | `setOutputFile(java.nio.file.Path file)`: Set the output file for predictions. |
| `void` | `start(ExperimentOutputLayout outputLayout)`: Do initial setup for this eval task. |
`public static PredictEvalTask fromJSON(com.fasterxml.jackson.databind.JsonNode json, java.net.URI base) throws java.io.IOException`

Create a predict eval task from a JSON/YAML file.

Parameters:
`json` - The task specification.
`base` - The base URI from which `json` came, used for resolving relative paths.

Throws:
`java.io.IOException`
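
As an illustration, here is a minimal sketch of loading a task from a YAML specification. It assumes Jackson's YAML data format module (`jackson-dataformat-yaml`) is on the classpath; the file name `eval-task.yml` is hypothetical, and the `org.lenskit.eval.traintest.predict` package is assumed from the LensKit 3.x layout:

```java
import java.io.File;
import java.io.IOException;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.dataformat.yaml.YAMLFactory;

import org.lenskit.eval.traintest.predict.PredictEvalTask;

public class LoadPredictTask {
    public static void main(String[] args) throws IOException {
        File specFile = new File("eval-task.yml");            // hypothetical spec file
        // Parse the YAML (or JSON) specification into a Jackson tree.
        ObjectMapper mapper = new ObjectMapper(new YAMLFactory());
        JsonNode spec = mapper.readTree(specFile);
        // The file's own URI serves as the base for resolving relative paths in the spec.
        PredictEvalTask task = PredictEvalTask.fromJSON(spec, specFile.toURI());
        System.out.println("configured output file: " + task.getOutputFile());
    }
}
```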
`public java.nio.file.Path getOutputFile()`

Get the output file for writing predictions.

Returns:
`null` if no file is configured.

`public void setOutputFile(java.nio.file.Path file)`

Set the output file for predictions.

Parameters:
`file` - The output file for writing predictions. Will get a CSV file.
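
A brief sketch of configuring the prediction output (the path is illustrative, and the package name is assumed as above):

```java
import java.nio.file.Path;
import java.nio.file.Paths;

import org.lenskit.eval.traintest.predict.PredictEvalTask;

public class ConfigurePredictionOutput {
    public static void main(String[] args) {
        PredictEvalTask task = new PredictEvalTask();
        // Until a file is set, getOutputFile() returns null and no predictions are written.
        task.setOutputFile(Paths.get("build/eval/predictions.csv"));  // illustrative path
        Path out = task.getOutputFile();
        System.out.println("predictions will be written to " + out);
    }
}
```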
`public java.util.List<PredictMetric<?>> getPredictMetrics()`

Get the list of prediction metrics.
`public java.util.List<Metric<?>> getAllMetrics()`

Get the list of all metrics.

`public void addMetric(PredictMetric<?> metric)`

Add a prediction metric.

Parameters:
`metric` - The metric to add.
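
For example, a sketch of registering a metric; `RMSEPredictMetric` is assumed here to be one of the available `PredictMetric` implementations, so substitute whatever metric classes your build provides:

```java
import org.lenskit.eval.traintest.predict.PredictEvalTask;
import org.lenskit.eval.traintest.predict.RMSEPredictMetric;  // assumed metric implementation

public class RegisterMetrics {
    public static void main(String[] args) {
        PredictEvalTask task = new PredictEvalTask();
        // Each added metric contributes measurements to the task's output.
        task.addMetric(new RMSEPredictMetric());
        System.out.println(task.getPredictMetrics().size() + " prediction metric(s)");
        System.out.println(task.getAllMetrics().size() + " metric(s) in total");
    }
}
```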
`public java.util.Set<java.lang.Class<?>> getRequiredRoots()`

Description copied from interface: `EvalTask`

Get the root types required by this evaluation.

Specified by:
`getRequiredRoots` in interface `EvalTask`
`public java.util.List<java.lang.String> getGlobalColumns()`

Description copied from interface: `EvalTask`

Get columns that will go in the aggregate output file.

Specified by:
`getGlobalColumns` in interface `EvalTask`

`public java.util.List<java.lang.String> getUserColumns()`

Description copied from interface: `EvalTask`

Get columns that will go in the per-user output file.

Specified by:
`getUserColumns` in interface `EvalTask`
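
As a small illustration, the column headers contributed by the configured metrics can be inspected directly (same assumed classes as above):

```java
import org.lenskit.eval.traintest.predict.PredictEvalTask;
import org.lenskit.eval.traintest.predict.RMSEPredictMetric;  // assumed metric implementation

public class InspectColumns {
    public static void main(String[] args) {
        PredictEvalTask task = new PredictEvalTask();
        task.addMetric(new RMSEPredictMetric());
        // Headers the task will contribute to the aggregate and per-user output tables.
        System.out.println("aggregate columns: " + task.getGlobalColumns());
        System.out.println("per-user columns:  " + task.getUserColumns());
    }
}
```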
`public void start(ExperimentOutputLayout outputLayout)`

Description copied from interface: `EvalTask`

Do initial setup for this eval task. This should create any per-task output files, etc.

`public void finish()`

Description copied from interface: `EvalTask`

Finalize this eval task. This should finish writing and close any per-task output files, etc.
`public ConditionEvaluator createConditionEvaluator(AlgorithmInstance algorithm, DataSet dataSet, RecommenderEngine engine)`

Description copied from interface: `EvalTask`

Set up a measurement of a single recommender.

Specified by:
`createConditionEvaluator` in interface `EvalTask`

Parameters:
`algorithm` - The algorithm being evaluated.
`dataSet` - The data set being evaluated.
`engine` - The recommender engine that will be measured.
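
Putting the lifecycle together, here is a sketch of the call order a driver follows; building `AlgorithmInstance`, `DataSet`, `RecommenderEngine`, and `ExperimentOutputLayout` is out of scope here, so they arrive as parameters, and the package names are assumed from the LensKit 3.x layout:

```java
import java.nio.file.Paths;

import org.lenskit.api.RecommenderEngine;
import org.lenskit.eval.traintest.AlgorithmInstance;
import org.lenskit.eval.traintest.ConditionEvaluator;
import org.lenskit.eval.traintest.DataSet;
import org.lenskit.eval.traintest.ExperimentOutputLayout;
import org.lenskit.eval.traintest.predict.PredictEvalTask;

public class PredictTaskLifecycle {
    // The experiment driver supplies the already-built pieces; this only shows call order.
    static void evaluateCondition(AlgorithmInstance algorithm, DataSet dataSet,
                                  RecommenderEngine engine, ExperimentOutputLayout layout) {
        PredictEvalTask task = new PredictEvalTask();
        task.setOutputFile(Paths.get("predictions.csv"));    // illustrative path

        task.start(layout);                                  // open per-task outputs
        ConditionEvaluator eval =
                task.createConditionEvaluator(algorithm, dataSet, engine);
        // ... the driver feeds each test user to 'eval' here ...
        task.finish();                                       // flush and close outputs
    }
}
```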