aGrUM  0.20.3
a C++ library for (probabilistic) graphical models
gum::learning::BNLearner< GUM_SCALAR > Class Template Reference

A pack of learning algorithms that can easily be used. More...

#include <BNLearner.h>


Public Attributes

Signaler3< Size, double, double > onProgress
 Progression, error and time. More...
 
Signaler1< std::string > onStop
 Criteria messageApproximationScheme. More...
 

Public Member Functions

BayesNet< GUM_SCALAR > learnBN ()
 learn a Bayes Net from a file (must have read the db before) More...
 
BayesNet< GUM_SCALAR > learnParameters (const DAG &dag, bool takeIntoAccountScore=true)
 learns a BN (its parameters) when its structure is known More...
 
BayesNet< GUM_SCALAR > learnParameters (bool take_into_account_score=true)
 
void _setAprioriWeight_ (double weight)
 sets the apriori weight More...
 
void setMandatoryArcs (const ArcSet &set)
 assign a set of mandatory arcs More...
 
Constructors / Destructors
 BNLearner (const std::string &filename, const std::vector< std::string > &missing_symbols={"?"})
 default constructor More...
 
 BNLearner (const DatabaseTable<> &db)
 default constructor More...
 
 BNLearner (const std::string &filename, const gum::BayesNet< GUM_SCALAR > &src, const std::vector< std::string > &missing_symbols={"?"})
 Read the database file for the score / parameter estimation and var names. More...
 
 BNLearner (const BNLearner &)
 copy constructor More...
 
 BNLearner (BNLearner &&)
 move constructor More...
 
virtual ~BNLearner ()
 destructor More...
 
Operators
BNLearner & operator= (const BNLearner &)
 copy operator More...
 
BNLearner & operator= (BNLearner &&)
 move operator More...
 
Accessors / Modifiers
DAG learnDAG ()
 learn a structure from a file (must have read the db before) More...
 
MixedGraph learnMixedStructure ()
 learn a partial structure from a file (must have read the db before and must have selected miic or 3off2) More...
 
void setInitialDAG (const DAG &)
 sets an initial DAG structure More...
 
const std::vector< std::string > & names () const
 returns the names of the variables in the database More...
 
const std::vector< std::size_t > & domainSizes () const
 returns the domain sizes of the variables in the database More...
 
Size domainSize (NodeId var) const
 returns the domain size of a variable in the database More...
 
Size domainSize (const std::string &var) const
 returns the domain size of a variable in the database More...
 
NodeId idFromName (const std::string &var_name) const
 returns the node id corresponding to a variable name More...
 
const DatabaseTable<> & database () const
 returns the database used by the BNLearner More...
 
void setDatabaseWeight (const double new_weight)
 assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight More...
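As a minimal sketch of this contract (hypothetical helper, not aGrUM code): giving the whole database the weight `new_weight` amounts to giving each of the n rows the weight new_weight / n.

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical illustration of setDatabaseWeight's contract: spread
// new_weight uniformly over the rows so that the row weights sum to
// new_weight.
std::vector<double> spreadWeight(std::size_t n_rows, double new_weight) {
  return std::vector<double>(n_rows, new_weight / double(n_rows));
}
```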
 
void setRecordWeight (const std::size_t i, const double weight)
 sets the weight of the ith record of the database More...
 
double recordWeight (const std::size_t i) const
 returns the weight of the ith record More...
 
double databaseWeight () const
 returns the weight of the whole database More...
 
const std::string & nameFromId (NodeId id) const
 returns the variable name corresponding to a given node id More...
 
template<template< typename > class XALLOC>
void useDatabaseRanges (const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &new_ranges)
 use a new set of database rows' ranges to perform learning More...
 
void clearDatabaseRanges ()
 reset the ranges to the one range corresponding to the whole database More...
 
const std::vector< std::pair< std::size_t, std::size_t > > & databaseRanges () const
 returns the current database rows' ranges used for learning More...
 
std::pair< std::size_t, std::size_t > useCrossValidationFold (const std::size_t learning_fold, const std::size_t k_fold)
 sets the ranges of rows to be used for cross-validation learning More...
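The fold arithmetic behind this call can be sketched outside the library (hypothetical helper, one plausible convention; the learner itself decides whether the returned range serves as learning or test data): with n rows split into k folds of size n / k, fold i covers the half-open row range [i * (n/k), (i+1) * (n/k)).

```cpp
#include <cassert>
#include <cstddef>
#include <utility>

// Hypothetical sketch of k-fold range arithmetic: with n_rows rows and
// k_fold folds of fold_size = n_rows / k_fold rows each, fold i covers
// the half-open row range [i * fold_size, (i + 1) * fold_size).
std::pair<std::size_t, std::size_t> foldRange(std::size_t n_rows,
                                              std::size_t learning_fold,
                                              std::size_t k_fold) {
  const std::size_t fold_size = n_rows / k_fold;  // truncated division
  return {learning_fold * fold_size, (learning_fold + 1) * fold_size};
}
```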
 
std::pair< double, double > chi2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair for the chi2 test in the database. More...
 
std::pair< double, double > chi2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair for the chi2 test in the database, using variable names. More...
 
std::pair< double, double > G2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair for the G2 test in the database. More...
 
std::pair< double, double > G2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair for the G2 test in the database. More...
 
double logLikelihood (const std::vector< NodeId > &vars, const std::vector< NodeId > &knowing={})
 Return the loglikelihood of vars in the database, conditioned on knowing. More...
 
double logLikelihood (const std::vector< std::string > &vars, const std::vector< std::string > &knowing={})
 Return the loglikelihood of vars in the database, conditioned on knowing. More...
 
std::vector< double > rawPseudoCount (const std::vector< NodeId > &vars)
 Return the pseudo-counts of the NodeId vars in the database, in a raw array. More...
 
std::vector< double > rawPseudoCount (const std::vector< std::string > &vars)
 Return the pseudo-counts of vars in the database, in a raw array. More...
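To make the loglikelihood accessors concrete, here is a self-contained sketch with made-up data (not aGrUM's implementation; aGrUM's logLikelihood uses log base 2): for counts N_x over N observations, the unconditioned log2-likelihood under the empirical distribution is the sum over x of N_x * log2(N_x / N).

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <string>
#include <vector>

// Sketch (not aGrUM's implementation): log2-likelihood of i.i.d.
// observations under their empirical distribution,
// LL = sum over values x of N_x * log2(N_x / N).
double log2Likelihood(const std::vector<std::string>& observations) {
  std::map<std::string, double> counts;
  for (const auto& obs : observations) counts[obs] += 1.0;

  const double n = double(observations.size());
  double ll = 0.0;
  for (const auto& kv : counts) ll += kv.second * std::log2(kv.second / n);
  return ll;
}
```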
 
Size nbCols () const
 
Size nbRows () const
 
void useEM (const double epsilon)
 use the EM algorithm to learn parameters More...
 
bool hasMissingValues () const
 returns true if the learner's database has missing values More...
 
Score selection
void useScoreAIC ()
 indicate that we wish to use an AIC score More...
 
void useScoreBD ()
 indicate that we wish to use a BD score More...
 
void useScoreBDeu ()
 indicate that we wish to use a BDeu score More...
 
void useScoreBIC ()
 indicate that we wish to use a BIC score More...
 
void useScoreK2 ()
 indicate that we wish to use a K2 score More...
 
void useScoreLog2Likelihood ()
 indicate that we wish to use a Log2Likelihood score More...
 
A priori selection / parameterization
void useNoApriori ()
 use no apriori More...
 
void useAprioriBDeu (double weight=1)
 use the BDeu apriori More...
 
void useAprioriSmoothing (double weight=1)
 use the apriori smoothing More...
 
void useAprioriDirichlet (const std::string &filename, double weight=1)
 use the Dirichlet apriori More...
 
std::string checkScoreAprioriCompatibility ()
 checks whether the current score and apriori are compatible More...
 
Learning algorithm selection
void useGreedyHillClimbing ()
 indicate that we wish to use a greedy hill climbing algorithm More...
 
void useLocalSearchWithTabuList (Size tabu_size=100, Size nb_decrease=2)
 indicate that we wish to use a local search with tabu list More...
 
void useK2 (const Sequence< NodeId > &order)
 indicate that we wish to use K2 More...
 
void useK2 (const std::vector< NodeId > &order)
 indicate that we wish to use K2 More...
 
void use3off2 ()
 indicate that we wish to use 3off2 More...
 
void useMIIC ()
 indicate that we wish to use MIIC More...
 
3off2/MIIC parameterization and specific results
void useNMLCorrection ()
 indicate that we wish to use the NML correction for 3off2 and MIIC More...
 
void useMDLCorrection ()
 indicate that we wish to use the MDL correction for 3off2 and MIIC More...
 
void useNoCorrection ()
 indicate that we wish to use the NoCorr correction for 3off2 and MIIC More...
 
const std::vector< Arc > & latentVariables () const
 get the list of arcs hiding latent variables More...
 
Accessors / Modifiers for adding constraints on learning
void setMaxIndegree (Size max_indegree)
 sets the max indegree More...
 
void setSliceOrder (const NodeProperty< NodeId > &slice_order)
 sets a partial order on the nodes More...
 
void setSliceOrder (const std::vector< std::vector< std::string > > &slices)
 sets a partial order on the nodes More...
 
void setForbiddenArcs (const ArcSet &set)
 assign a set of forbidden arcs More...
 
assign a new forbidden / mandatory arc
void addForbiddenArc (const Arc &arc)
 
void addForbiddenArc (const NodeId tail, const NodeId head)
 
void addForbiddenArc (const std::string &tail, const std::string &head)
 
void addMandatoryArc (const Arc &arc)
 
void addMandatoryArc (const NodeId tail, const NodeId head)
 
void addMandatoryArc (const std::string &tail, const std::string &head)
 
remove a forbidden / mandatory arc
void eraseForbiddenArc (const Arc &arc)
 
void eraseForbiddenArc (const NodeId tail, const NodeId head)
 
void eraseForbiddenArc (const std::string &tail, const std::string &head)
 
void eraseMandatoryArc (const Arc &arc)
 
void eraseMandatoryArc (const NodeId tail, const NodeId head)
 
void eraseMandatoryArc (const std::string &tail, const std::string &head)
 
void setPossibleEdges (const EdgeSet &set)
 assign a set of possible edges More...
 
void setPossibleSkeleton (const UndiGraph &skeleton)
 assign a possible skeleton, i.e. a set of possible edges More...
 
assign a new possible edge
Warning
Once at least one possible edge is defined, all other edges are not possible anymore
void addPossibleEdge (const Edge &edge)
 
void addPossibleEdge (const NodeId tail, const NodeId head)
 
void addPossibleEdge (const std::string &tail, const std::string &head)
 
remove a possible edge
void erasePossibleEdge (const Edge &edge)
 
void erasePossibleEdge (const NodeId tail, const NodeId head)
 
void erasePossibleEdge (const std::string &tail, const std::string &head)
 
redistribute signals AND implementation of interface
INLINE void setCurrentApproximationScheme (const ApproximationScheme *approximationScheme)
 distribute signals More...
 
INLINE void distributeProgress (const ApproximationScheme *approximationScheme, Size pourcent, double error, double time)
 distribute signals More...
 
INLINE void distributeStop (const ApproximationScheme *approximationScheme, std::string message)
 distribute signals More...
 
void setEpsilon (double eps)
 Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled, it will be enabled. More...
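A minimal sketch of this criterion, assuming a toy fixed-point iteration f(t+1) = cos(f(t)) rather than an actual learning scheme:

```cpp
#include <cassert>
#include <cmath>

// Toy illustration of the epsilon stopping criterion: iterate x -> cos(x)
// and stop as soon as |f(t+1) - f(t)| < eps.
int iterationsUntilEpsilon(double eps) {
  double x = 1.0;
  int iters = 0;
  for (;;) {
    const double next = std::cos(x);              // one approximation step
    ++iters;
    if (std::fabs(next - x) < eps) return iters;  // |f(t+1) - f(t)| test
    x = next;
  }
}
```

A tighter epsilon simply makes the loop run longer before the criterion fires.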
 
double epsilon () const
 Get the value of epsilon. More...
 
void disableEpsilon ()
 Disable stopping criterion on epsilon. More...
 
void enableEpsilon ()
 Enable stopping criterion on epsilon. More...
 
bool isEnabledEpsilon () const
 
void setMinEpsilonRate (double rate)
 Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled, it will be enabled. More...
 
double minEpsilonRate () const
 Get the value of the minimal epsilon rate. More...
 
void disableMinEpsilonRate ()
 Disable stopping criterion on epsilon rate. More...
 
void enableMinEpsilonRate ()
 Enable stopping criterion on epsilon rate. More...
 
bool isEnabledMinEpsilonRate () const
 
void setMaxIter (Size max)
 stopping criterion on the number of iterations. If the criterion was disabled, it will be enabled. More...
 
Size maxIter () const
 
void disableMaxIter ()
 Disable stopping criterion on max iterations. More...
 
void enableMaxIter ()
 Enable stopping criterion on max iterations. More...
 
bool isEnabledMaxIter () const
 
void setMaxTime (double timeout)
 stopping criterion on timeout. If the criterion was disabled, it will be enabled. More...
 
double maxTime () const
 returns the timeout (in seconds) More...
 
double currentTime () const
 get the current running time in second (double) More...
 
void disableMaxTime ()
 Disable stopping criterion on timeout. More...
 
void enableMaxTime ()
 Enable stopping criterion on timeout. More...
 
bool isEnabledMaxTime () const
 
void setPeriodSize (Size p)
 how many samples between two stopping tests More...
 
Size periodSize () const
 how many samples between two stopping tests More...
 
void setVerbosity (bool v)
 verbosity More...
 
bool verbosity () const
 verbosity More...
 
ApproximationSchemeSTATE stateApproximationScheme () const
 returns the state of the approximation scheme More...
 
Size nbrIterations () const
 
const std::vector< double > & history () const
 
Getters and setters
std::string messageApproximationScheme () const
 Returns the approximation scheme message. More...
 

Public Types

enum  ScoreType {
  ScoreType::AIC, ScoreType::BD, ScoreType::BDeu, ScoreType::BIC,
  ScoreType::K2, ScoreType::LOG2LIKELIHOOD
}
 an enumeration for easily selecting the score we wish to use More...
 
enum  ParamEstimatorType { ParamEstimatorType::ML }
 an enumeration to select the type of parameter estimation we shall apply More...
 
enum  AprioriType { AprioriType::NO_APRIORI, AprioriType::SMOOTHING, AprioriType::DIRICHLET_FROM_DATABASE, AprioriType::BDEU }
 an enumeration to select the apriori More...
 
enum  AlgoType { AlgoType::K2, AlgoType::GREEDY_HILL_CLIMBING, AlgoType::LOCAL_SEARCH_WITH_TABU_LIST, AlgoType::MIIC_THREE_OFF_TWO }
 an enumeration for easily selecting the learning algorithm to use More...
 
enum  ApproximationSchemeSTATE : char {
  ApproximationSchemeSTATE::Undefined, ApproximationSchemeSTATE::Continue, ApproximationSchemeSTATE::Epsilon, ApproximationSchemeSTATE::Rate,
  ApproximationSchemeSTATE::Limit, ApproximationSchemeSTATE::TimeLimit, ApproximationSchemeSTATE::Stopped
}
 The different state of an approximation scheme. More...
 

Protected Attributes

ScoreType scoreType_ {ScoreType::BDeu}
 the score selected for learning More...
 
Score<> * score_ {nullptr}
 the score used More...
 
ParamEstimatorType paramEstimatorType_ {ParamEstimatorType::ML}
 the type of the parameter estimator More...
 
double epsilonEM_ {0.0}
 epsilon for EM. If epsilon = 0.0, EM is not used. More...
 
CorrectedMutualInformation<> * mutualInfo_ {nullptr}
 the selected correction for 3off2 and miic More...
 
AprioriType aprioriType_ {AprioriType::NO_APRIORI}
 the a priori selected for the score and parameters More...
 
Apriori<> * apriori_ {nullptr}
 the apriori used More...
 
AprioriNoApriori<> * noApriori_ {nullptr}
 
double aprioriWeight_ {1.0f}
 the weight of the apriori More...
 
StructuralConstraintSliceOrder constraintSliceOrder_
 the constraint for 2TBNs More...
 
StructuralConstraintIndegree constraintIndegree_
 the constraint for indegrees More...
 
StructuralConstraintTabuList constraintTabuList_
 the constraint for tabu lists More...
 
StructuralConstraintForbiddenArcs constraintForbiddenArcs_
 the constraint on forbidden arcs More...
 
StructuralConstraintPossibleEdges constraintPossibleEdges_
 the constraint on possible Edges More...
 
StructuralConstraintMandatoryArcs constraintMandatoryArcs_
 the constraint on mandatory arcs More...
 
AlgoType selectedAlgo_ {AlgoType::GREEDY_HILL_CLIMBING}
 the selected learning algorithm More...
 
K2 algoK2_
 the K2 algorithm More...
 
Miic algoMiic3off2_
 the MIIC or 3off2 algorithm More...
 
CorrectedMutualInformation<>::KModeTypes kmode3Off2_
 the penalty used in 3off2 More...
 
DAG2BNLearner Dag2BN_
 the parametric EM More...
 
GreedyHillClimbing greedyHillClimbing_
 the greedy hill climbing algorithm More...
 
LocalSearchWithTabuList localSearchWithTabuList_
 the local search with tabu list algorithm More...
 
Database scoreDatabase_
 the database to be used by the scores and parameter estimators More...
 
std::vector< std::pair< std::size_t, std::size_t > > ranges_
 the set of rows' ranges within the database in which learning is done More...
 
Database * aprioriDatabase_ {nullptr}
 the database used by the Dirichlet a priori More...
 
std::string aprioriDbname_
 the filename for the Dirichlet a priori, if any More...
 
DAG initialDag_
 an initial DAG given to learners More...
 
const ApproximationScheme * currentAlgorithm_ {nullptr}
 

Protected Member Functions

void createApriori_ ()
 create the apriori used for learning More...
 
void createScore_ ()
 create the score used for learning More...
 
ParamEstimator<> * createParamEstimator_ (DBRowGeneratorParser<> &parser, bool take_into_account_score=true)
 create the parameter estimator used for learning More...
 
DAG learnDag_ ()
 returns the DAG learnt More...
 
MixedGraph prepareMiic3Off2_ ()
 prepares the initial graph for 3off2 or miic More...
 
const std::string & getAprioriType_ () const
 returns the type (as a string) of a given apriori More...
 
void createCorrectedMutualInformation_ ()
 create the Corrected Mutual Information instance for Miic/3off2 More...
 

Static Protected Member Functions

static DatabaseTable readFile_ (const std::string &filename, const std::vector< std::string > &missing_symbols)
 reads a file and returns a databaseVectInRam More...
 
static void checkFileName_ (const std::string &filename)
 checks whether the extension of a CSV filename is correct More...
 

Detailed Description

template<typename GUM_SCALAR>
class gum::learning::BNLearner< GUM_SCALAR >

A pack of learning algorithms that can easily be used.

The pack currently contains K2, GreedyHillClimbing and LocalSearchWithTabuList

Definition at line 59 of file BNLearner.h.

Member Enumeration Documentation

◆ AlgoType

an enumeration for easily selecting the learning algorithm to use

Enumerator
K2 
GREEDY_HILL_CLIMBING 
LOCAL_SEARCH_WITH_TABU_LIST 
MIIC_THREE_OFF_TWO 

Definition at line 135 of file genericBNLearner.h.

{
  K2,
  GREEDY_HILL_CLIMBING,
  LOCAL_SEARCH_WITH_TABU_LIST,
  MIIC_THREE_OFF_TWO
};

◆ ApproximationSchemeSTATE

The different state of an approximation scheme.

Enumerator
Undefined 
Continue 
Epsilon 
Rate 
Limit 
TimeLimit 
Stopped 

Definition at line 64 of file IApproximationSchemeConfiguration.h.

: char
{
  Undefined,
  Continue,
  Epsilon,
  Rate,
  Limit,
  TimeLimit,
  Stopped
};

◆ AprioriType

an enumeration to select the apriori

Enumerator
NO_APRIORI 
SMOOTHING 
DIRICHLET_FROM_DATABASE 
BDEU 

Definition at line 126 of file genericBNLearner.h.

{
  NO_APRIORI,
  SMOOTHING,
  DIRICHLET_FROM_DATABASE,
  BDEU
};

◆ ParamEstimatorType

an enumeration to select the type of parameter estimation we shall apply

Enumerator
ML 

Definition at line 122 of file genericBNLearner.h.

{ ML };

◆ ScoreType

an enumeration for easily selecting the score we wish to use

Enumerator
AIC 
BD 
BDeu 
BIC 
K2 
LOG2LIKELIHOOD 

Definition at line 110 of file genericBNLearner.h.

{
  AIC,
  BD,
  BDeu,
  BIC,
  K2,
  LOG2LIKELIHOOD
};

Constructor & Destructor Documentation

◆ BNLearner() [1/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols = {"?"} 
)

default constructor

read the database file for the score / parameter estimation and var names

◆ BNLearner() [2/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const DatabaseTable<> &  db)

default constructor

uses the given database for the score / parameter estimation and var names

◆ BNLearner() [3/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const std::string &  filename,
const gum::BayesNet< GUM_SCALAR > &  src,
const std::vector< std::string > &  missing_symbols = {"?"} 
)

Read the database file for the score / parameter estimation and var names.

If modalities = { 1 -> {True, False, Big} }, then the node of id 1 in the BN will have 3 modalities, the first one being True, the second one being False, and the third being Big.

Parsing the database determines which modalities are really necessary and keeps them in the order specified by the user (NodeProperty modalities). If parse_database is set to false (the default), the modalities specified by the user are considered to be exactly those of the variables of the BN (as a consequence, if other values are found in the database, an exception will be raised during learning).

Parameters
filename  The file to learn from.
modalities  indicates, for some nodes (not necessarily all the nodes of the BN), which modalities they should have and in which order these modalities should be stored in the nodes.
parse_database  if true, the modalities specified by the user are considered a superset of the modalities of the variables. Wrapper for BNLearner(filename, modalities, parse_database) using a bn to find those modalities and node ids.

◆ BNLearner() [4/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const BNLearner< GUM_SCALAR > &  )

copy constructor

◆ BNLearner() [5/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( BNLearner< GUM_SCALAR > &&  )

move constructor

◆ ~BNLearner()

template<typename GUM_SCALAR >
virtual gum::learning::BNLearner< GUM_SCALAR >::~BNLearner ( )
virtual

destructor

Member Function Documentation

◆ _labelsFromBN_()

template<typename GUM_SCALAR >
NodeProperty< Sequence< std::string > > gum::learning::BNLearner< GUM_SCALAR >::_labelsFromBN_ ( const std::string &  filename,
const BayesNet< GUM_SCALAR > &  src 
)
private

read the first line of a file to find column names

◆ _setAprioriWeight_()

INLINE void gum::learning::genericBNLearner::_setAprioriWeight_ ( double  weight)
inherited

sets the apriori weight

Definition at line 397 of file genericBNLearner_inl.h.

{
  if (weight < 0) { GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive") }

  aprioriWeight_ = weight;
}

◆ addForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const Arc arc)
inherited

Definition at line 310 of file genericBNLearner_inl.h.

{
  constraintForbiddenArcs_.addArc(arc);
}

◆ addForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 320 of file genericBNLearner_inl.h.

{
  addForbiddenArc(Arc(tail, head));
}

◆ addForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 330 of file genericBNLearner_inl.h.

{
  addForbiddenArc(Arc(idFromName(tail), idFromName(head)));
}

◆ addMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const Arc arc)
inherited

Definition at line 347 of file genericBNLearner_inl.h.

{
  constraintMandatoryArcs_.addArc(arc);
}

◆ addMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 369 of file genericBNLearner_inl.h.

{
  addMandatoryArc(Arc(tail, head));
}

◆ addMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 357 of file genericBNLearner_inl.h.

{
  addMandatoryArc(Arc(idFromName(tail), idFromName(head)));
}

◆ addPossibleEdge() [1/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const Edge edge)
inherited

Definition at line 273 of file genericBNLearner_inl.h.

{
  constraintPossibleEdges_.addEdge(edge);
}

◆ addPossibleEdge() [2/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 283 of file genericBNLearner_inl.h.

{
  addPossibleEdge(Edge(tail, head));
}

◆ addPossibleEdge() [3/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 293 of file genericBNLearner_inl.h.

{
  addPossibleEdge(Edge(idFromName(tail), idFromName(head)));
}

◆ checkFileName_()

void gum::learning::genericBNLearner::checkFileName_ ( const std::string &  filename)
staticprotectedinherited

checks whether the extension of a CSV filename is correct

Definition at line 382 of file genericBNLearner.cpp.

{
  // get the extension of the file
  Size filename_size = Size(filename.size());

  if (filename_size < 4) {
    GUM_ERROR(FormatNotFound,
              "genericBNLearner could not determine the "
              "file type of the database");
  }

  std::string extension = filename.substr(filename.size() - 4);
  std::transform(extension.begin(), extension.end(), extension.begin(), ::tolower);

  if (extension != ".csv") {
    GUM_ERROR(OperationNotAllowed,
              "genericBNLearner does not support yet this type of database file");
  }
}

◆ checkScoreAprioriCompatibility()

std::string gum::learning::genericBNLearner::checkScoreAprioriCompatibility ( )
inherited

checks whether the current score and apriori are compatible

Returns
an empty string if the apriori is compatible with the score, else a message explaining the incompatibility.

Definition at line 816 of file genericBNLearner.cpp.

{
  const std::string& apriori = getAprioriType_();

  switch (scoreType_) {
    case ScoreType::AIC:
      return ScoreAIC<>::isAprioriCompatible(apriori, aprioriWeight_);

    case ScoreType::BD:
      return ScoreBD<>::isAprioriCompatible(apriori, aprioriWeight_);

    case ScoreType::BDeu:
      return ScoreBDeu<>::isAprioriCompatible(apriori, aprioriWeight_);

    case ScoreType::BIC:
      return ScoreBIC<>::isAprioriCompatible(apriori, aprioriWeight_);

    case ScoreType::K2:
      return ScoreK2<>::isAprioriCompatible(apriori, aprioriWeight_);

    case ScoreType::LOG2LIKELIHOOD:
      return ScoreLog2Likelihood<>::isAprioriCompatible(apriori, aprioriWeight_);

    default:
      return "genericBNLearner does not support yet this score";
  }
}

◆ chi2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for chi2 test in the database.

Parameters
id1  first variable
id2  second variable
knowing  list of observed variables
Returns
a std::pair<double,double>

Definition at line 885 of file genericBNLearner.cpp.

{
  createApriori_();
  IndepTestChi2<> chi2score(scoreDatabase_.parser(),
                            *apriori_,
                            databaseRanges());

  return chi2score.statistics(id1, id2, knowing);
}

◆ chi2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the chi2 test in the database, using variable names.

Parameters
name1  first variable
name2  second variable
knowing  list of observed variables
Returns
a std::pair<double,double>

Definition at line 896 of file genericBNLearner.cpp.

{
  std::vector< NodeId > knowingIds;
  std::transform(knowing.begin(),
                 knowing.end(),
                 std::back_inserter(knowingIds),
                 [this](const std::string& c) -> NodeId { return this->idFromName(c); });
  return chi2(idFromName(name1), idFromName(name2), knowingIds);
}
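To make the first member of the returned pair concrete, here is a self-contained sketch (not aGrUM's implementation) of the chi2 statistic for a 2x2 contingency table; the p-value member would additionally require the tail function of the chi2 distribution:

```cpp
#include <cassert>
#include <cmath>

// Sketch of the chi2 statistic for a 2x2 contingency table:
// sum over cells of (observed - expected)^2 / expected, where
// expected = row_total * column_total / grand_total.
double chi2Statistic(const double obs[2][2]) {
  const double row[2] = {obs[0][0] + obs[0][1], obs[1][0] + obs[1][1]};
  const double col[2] = {obs[0][0] + obs[1][0], obs[0][1] + obs[1][1]};
  const double total = row[0] + row[1];

  double stat = 0.0;
  for (int i = 0; i < 2; ++i)
    for (int j = 0; j < 2; ++j) {
      const double expected = row[i] * col[j] / total;
      stat += (obs[i][j] - expected) * (obs[i][j] - expected) / expected;
    }
  return stat;
}
```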

◆ clearDatabaseRanges()

INLINE void gum::learning::genericBNLearner::clearDatabaseRanges ( )
inherited

reset the ranges to the one range corresponding to the whole database

Definition at line 491 of file genericBNLearner_inl.h.

{ ranges_.clear(); }

◆ createApriori_()

void gum::learning::genericBNLearner::createApriori_ ( )
protectedinherited

create the apriori used for learning

Definition at line 428 of file genericBNLearner.cpp.

{
  // first, save the old apriori, to be deleted if everything is ok
  Apriori<>* old_apriori = apriori_;

  // create the new apriori
  switch (aprioriType_) {
    case AprioriType::NO_APRIORI:
      apriori_ = new AprioriNoApriori<>(scoreDatabase_.databaseTable(),
                                        scoreDatabase_.nodeId2Columns());
      break;

    case AprioriType::SMOOTHING:
      apriori_ = new AprioriSmoothing<>(scoreDatabase_.databaseTable(),
                                        scoreDatabase_.nodeId2Columns());
      break;

    case AprioriType::DIRICHLET_FROM_DATABASE:
      if (aprioriDatabase_ != nullptr) {
        delete aprioriDatabase_;
        aprioriDatabase_ = nullptr;
      }

      // (re)load the a priori database from aprioriDbname_ (elided here)

      apriori_ = new AprioriDirichletFromDatabase<>(scoreDatabase_.databaseTable(),
                                                    aprioriDatabase_->parser(),
                                                    scoreDatabase_.nodeId2Columns());
      break;

    case AprioriType::BDEU:
      apriori_
         = new AprioriBDeu<>(scoreDatabase_.databaseTable(), scoreDatabase_.nodeId2Columns());
      break;

    default:
      GUM_ERROR(OperationNotAllowed, "The BNLearner does not support yet this apriori")
  }

  // do not forget to assign a weight to the apriori
  apriori_->setWeight(aprioriWeight_);

  // remove the old apriori, if any
  if (old_apriori != nullptr) delete old_apriori;
}
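The method follows a create-then-swap discipline: the old apriori pointer is kept alive until the new one is fully constructed and weighted, and only then deleted. A toy sketch of that pattern (the class names here are illustrative stand-ins, not aGrUM's real `Apriori` hierarchy):

```cpp
#include <string>

// Toy stand-ins for the Apriori hierarchy (names are illustrative, not aGrUM's).
struct Apriori {
  virtual ~Apriori() = default;
  virtual std::string name() const = 0;
  void setWeight(double w) { weight_ = w; }
  double weight() const { return weight_; }
 private:
  double weight_ = 1.0;
};
struct Smoothing : Apriori { std::string name() const override { return "smoothing"; } };
struct NoApriori : Apriori { std::string name() const override { return "none"; } };

// Same discipline as createApriori_: keep the old object until the new one is
// fully built and weighted, then delete the old one (delete nullptr is a no-op).
struct Learner {
  Apriori* apriori_ = nullptr;
  double aprioriWeight_ = 1.0;
  bool useSmoothing_ = false;

  void createApriori() {
    Apriori* old_apriori = apriori_;          // save the old apriori
    if (useSmoothing_) apriori_ = new Smoothing;
    else               apriori_ = new NoApriori;
    apriori_->setWeight(aprioriWeight_);      // do not forget the weight
    delete old_apriori;                       // remove the old one, if any
  }
  ~Learner() { delete apriori_; }
};
```

The same swap pattern appears in `createScore_` below.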

◆ createCorrectedMutualInformation_()

void gum::learning::genericBNLearner::createCorrectedMutualInformation_ ( )
protectedinherited

create the Corrected Mutual Information instance for Miic/3off2

Definition at line 622 of file genericBNLearner.cpp.

{
  if (mutualInfo_ != nullptr) delete mutualInfo_;

  mutualInfo_ = new CorrectedMutualInformation<>(scoreDatabase_.parser(),
                                                 *noApriori_,
                                                 ranges_,
                                                 scoreDatabase_.nodeId2Columns());
  switch (kmode3Off2_) {
    case CorrectedMutualInformation<>::KModeTypes::MDL:
      mutualInfo_->useMDL();
      break;

    case CorrectedMutualInformation<>::KModeTypes::NML:
      mutualInfo_->useNML();
      break;

    case CorrectedMutualInformation<>::KModeTypes::NoCorr:
      mutualInfo_->useNoCorr();
      break;

    default:
      GUM_ERROR(NotImplementedYet,
                "The BNLearner's corrected mutual information class does "
                << "not implement yet this correction : " << int(kmode3Off2_));
  }
}

◆ createParamEstimator_()

ParamEstimator * gum::learning::genericBNLearner::createParamEstimator_ ( DBRowGeneratorParser<> &  parser,
bool  take_into_account_score = true 
)
protectedinherited

create the parameter estimator used for learning

Definition at line 530 of file genericBNLearner.cpp.

{
  ParamEstimator<>* param_estimator = nullptr;

  // create the new estimator
  switch (paramEstimatorType_) {
    case ParamEstimatorType::ML:
      if (take_into_account_score && (score_ != nullptr)) {
        param_estimator = new ParamEstimatorML<>(parser,
                                                 *apriori_,
                                                 score_->internalApriori(),
                                                 ranges_,
                                                 scoreDatabase_.nodeId2Columns());
      } else {
        param_estimator = new ParamEstimatorML<>(parser,
                                                 *apriori_,
                                                 *noApriori_,
                                                 ranges_,
                                                 scoreDatabase_.nodeId2Columns());
      }

      break;

    default:
      GUM_ERROR(OperationNotAllowed,
                "genericBNLearner does not support "
                << "yet this parameter estimator");
  }

  // assign the set of ranges
  param_estimator->setRanges(ranges_);

  return param_estimator;
}

◆ createScore_()

void gum::learning::genericBNLearner::createScore_ ( )
protectedinherited

create the score used for learning

Definition at line 474 of file genericBNLearner.cpp.

{
  // first, save the old score, to be deleted if everything is ok
  Score<>* old_score = score_;

  // create the new scoring function
  switch (scoreType_) {
    case ScoreType::AIC:
      score_ = new ScoreAIC<>(scoreDatabase_.parser(),
                              *apriori_,
                              ranges_,
                              scoreDatabase_.nodeId2Columns());
      break;

    case ScoreType::BD:
      score_ = new ScoreBD<>(scoreDatabase_.parser(),
                             *apriori_,
                             ranges_,
                             scoreDatabase_.nodeId2Columns());
      break;

    case ScoreType::BDeu:
      score_ = new ScoreBDeu<>(scoreDatabase_.parser(),
                               *apriori_,
                               ranges_,
                               scoreDatabase_.nodeId2Columns());
      break;

    case ScoreType::BIC:
      score_ = new ScoreBIC<>(scoreDatabase_.parser(),
                              *apriori_,
                              ranges_,
                              scoreDatabase_.nodeId2Columns());
      break;

    case ScoreType::K2:
      score_ = new ScoreK2<>(scoreDatabase_.parser(),
                             *apriori_,
                             ranges_,
                             scoreDatabase_.nodeId2Columns());
      break;

    case ScoreType::LOG2LIKELIHOOD:
      score_ = new ScoreLog2Likelihood<>(scoreDatabase_.parser(),
                                         *apriori_,
                                         ranges_,
                                         scoreDatabase_.nodeId2Columns());
      break;

    default:
      GUM_ERROR(OperationNotAllowed, "genericBNLearner does not support yet this score")
  }

  // remove the old score, if any
  if (old_score != nullptr) delete old_score;
}

◆ currentTime()

double gum::learning::genericBNLearner::currentTime ( ) const
inlinevirtualinherited

get the current running time in seconds (double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1048 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->currentTime();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}
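`currentTime()`, `epsilon()`, `history()` and the `isEnabled*()` accessors all share one delegation pattern: forward the call to the approximation scheme currently running, or raise an error when no algorithm has been selected yet. A toy sketch of the pattern, with `std::logic_error` standing in for `GUM_ERROR(FatalError, ...)`:

```cpp
#include <stdexcept>

// Toy version of the currently-running scheme.
struct ApproximationScheme {
  double currentTime() const { return time_; }
  double time_ = 0.0;
};

// Delegate-or-throw, as in genericBNLearner::currentTime().
struct Learner {
  const ApproximationScheme* currentAlgorithm_ = nullptr;
  double currentTime() const {
    if (currentAlgorithm_ != nullptr) return currentAlgorithm_->currentTime();
    // stands in for GUM_ERROR(FatalError, "No chosen algorithm for learning")
    throw std::logic_error("No chosen algorithm for learning");
  }
};
```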

◆ database()

INLINE const DatabaseTable & gum::learning::genericBNLearner::database ( ) const
inherited

returns the database used by the BNLearner

Definition at line 494 of file genericBNLearner_inl.h.

{
  return scoreDatabase_.databaseTable();
}

◆ databaseRanges()

INLINE const std::vector< std::pair< std::size_t, std::size_t > > & gum::learning::genericBNLearner::databaseRanges ( ) const
inherited

returns the current database rows' ranges used for learning

Returns
The method returns a vector of pairs [Xi,Yi) of row indices in the database. Learning is performed on this set of rows.
Warning
an empty set of ranges means the whole database.

Definition at line 486 of file genericBNLearner_inl.h.

{
  return ranges_;
}
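The half-open convention matters when working with these ranges: each pair [Xi,Yi) includes row Xi but excludes row Yi, and an empty vector means the whole database. A small sketch of a helper (hypothetical, not part of aGrUM) that counts the rows used under that convention:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// The learner restricts learning to a set of half-open row ranges [Xi, Yi).
// An empty set of ranges means "use the whole database". This hypothetical
// helper counts the rows actually used under that convention.
std::size_t rowsUsed(const std::vector<std::pair<std::size_t, std::size_t>>& ranges,
                     std::size_t dbSize) {
  if (ranges.empty()) return dbSize;                     // empty == whole database
  std::size_t n = 0;
  for (const auto& r : ranges) n += r.second - r.first;  // Yi is excluded
  return n;
}
```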

◆ databaseWeight()

INLINE double gum::learning::genericBNLearner::databaseWeight ( ) const
inherited

returns the weight of the whole database

Definition at line 153 of file genericBNLearner_inl.h.

{ return scoreDatabase_.weight(); }

◆ disableEpsilon()

void gum::learning::genericBNLearner::disableEpsilon ( )
inlinevirtualinherited

Disable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 913 of file genericBNLearner.h.

{
  algoK2_.approximationScheme().disableEpsilon();
  greedyHillClimbing_.disableEpsilon();
  localSearchWithTabuList_.disableEpsilon();
  Dag2BN_.disableEpsilon();
};

◆ disableMaxIter()

void gum::learning::genericBNLearner::disableMaxIter ( )
inlinevirtualinherited

Disable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1003 of file genericBNLearner.h.

{
  algoK2_.approximationScheme().disableMaxIter();
  greedyHillClimbing_.disableMaxIter();
  localSearchWithTabuList_.disableMaxIter();
  Dag2BN_.disableMaxIter();
};

◆ disableMaxTime()

void gum::learning::genericBNLearner::disableMaxTime ( )
inlinevirtualinherited

Disable stopping criterion on timeout.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1056 of file genericBNLearner.h.

{
  algoK2_.approximationScheme().disableMaxTime();
  greedyHillClimbing_.disableMaxTime();
  localSearchWithTabuList_.disableMaxTime();
  Dag2BN_.disableMaxTime();
};

◆ disableMinEpsilonRate()

void gum::learning::genericBNLearner::disableMinEpsilonRate ( )
inlinevirtualinherited

Disable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 959 of file genericBNLearner.h.

{
  algoK2_.approximationScheme().disableMinEpsilonRate();
  greedyHillClimbing_.disableMinEpsilonRate();
  localSearchWithTabuList_.disableMinEpsilonRate();
  Dag2BN_.disableMinEpsilonRate();
};

◆ distributeProgress()

INLINE void gum::learning::genericBNLearner::distributeProgress ( const ApproximationScheme approximationScheme,
Size  pourcent,
double  error,
double  time 
)
inlineinherited

distribute signals

Definition at line 875 of file genericBNLearner.h.

{
  setCurrentApproximationScheme(approximationScheme);

  if (onProgress.hasListener()) GUM_EMIT3(onProgress, pourcent, error, time);
};
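`distributeProgress` and `distributeStop` forward events from whichever scheme is running to the learner's signalers, and only emit when someone is actually listening. A minimal stand-in for the `Signaler`/`GUM_EMIT` machinery (the types below are illustrative, not aGrUM's):

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// Minimal stand-in for Signaler3/GUM_EMIT3: a list of callbacks.
struct ProgressSignal {
  std::vector<std::function<void(std::size_t, double, double)>> listeners;
  bool hasListener() const { return !listeners.empty(); }
  void emit(std::size_t p, double e, double t) {
    for (auto& f : listeners) f(p, e, t);
  }
};

// Same shape as distributeProgress: only emit when there is a listener.
struct Learner {
  ProgressSignal onProgress;
  void distributeProgress(std::size_t percent, double error, double time) {
    if (onProgress.hasListener()) onProgress.emit(percent, error, time);
  }
};
```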

◆ distributeStop()

INLINE void gum::learning::genericBNLearner::distributeStop ( const ApproximationScheme approximationScheme,
std::string  message 
)
inlineinherited

distribute signals

Definition at line 885 of file genericBNLearner.h.

{
  setCurrentApproximationScheme(approximationScheme);

  if (onStop.hasListener()) GUM_EMIT1(onStop, message);
};

◆ domainSize() [1/2]

INLINE Size gum::learning::genericBNLearner::domainSize ( NodeId  var) const
inherited

returns the domain size of the variable var

Definition at line 476 of file genericBNLearner_inl.h.

{
  return scoreDatabase_.domainSizes()[var];
}

◆ domainSize() [2/2]

INLINE Size gum::learning::genericBNLearner::domainSize ( const std::string &  var) const
inherited

returns the domain size of the variable named var

Definition at line 480 of file genericBNLearner_inl.h.

{
  return scoreDatabase_.domainSizes()[idFromName(var)];
}

◆ domainSizes()

INLINE const std::vector< std::size_t > & gum::learning::genericBNLearner::domainSizes ( ) const
inherited

returns the domain sizes of the variables in the database

Definition at line 471 of file genericBNLearner_inl.h.

{
  return scoreDatabase_.domainSizes();
}

◆ enableEpsilon()

void gum::learning::genericBNLearner::enableEpsilon ( )
inlinevirtualinherited

Enable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 921 of file genericBNLearner.h.

{
  algoK2_.approximationScheme().enableEpsilon();
  greedyHillClimbing_.enableEpsilon();
  localSearchWithTabuList_.enableEpsilon();
  Dag2BN_.enableEpsilon();
};

◆ enableMaxIter()

void gum::learning::genericBNLearner::enableMaxIter ( )
inlinevirtualinherited

Enable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1010 of file genericBNLearner.h.

{
  algoK2_.approximationScheme().enableMaxIter();
  greedyHillClimbing_.enableMaxIter();
  localSearchWithTabuList_.enableMaxIter();
  Dag2BN_.enableMaxIter();
};

◆ enableMaxTime()

void gum::learning::genericBNLearner::enableMaxTime ( )
inlinevirtualinherited

Enable stopping criterion on timeout. If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound  if timeout <= 0.0; timeout is time in seconds (double).

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1062 of file genericBNLearner.h.

{
  algoK2_.approximationScheme().enableMaxTime();
  greedyHillClimbing_.enableMaxTime();
  localSearchWithTabuList_.enableMaxTime();
  Dag2BN_.enableMaxTime();
};

◆ enableMinEpsilonRate()

void gum::learning::genericBNLearner::enableMinEpsilonRate ( )
inlinevirtualinherited

Enable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 966 of file genericBNLearner.h.

{
  algoK2_.approximationScheme().enableMinEpsilonRate();
  greedyHillClimbing_.enableMinEpsilonRate();
  localSearchWithTabuList_.enableMinEpsilonRate();
  Dag2BN_.enableMinEpsilonRate();
};

◆ epsilon()

double gum::learning::genericBNLearner::epsilon ( ) const
inlinevirtualinherited

Get the value of epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 905 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->epsilon();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}

◆ eraseForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const Arc arc)
inherited

Definition at line 315 of file genericBNLearner_inl.h.

{
  constraintForbiddenArcs_.eraseArc(arc);
}

◆ eraseForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 325 of file genericBNLearner_inl.h.

{
  eraseForbiddenArc(Arc(tail, head));
}

◆ eraseForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 336 of file genericBNLearner_inl.h.

{
  eraseForbiddenArc(Arc(idFromName(tail), idFromName(head)));
}

◆ eraseMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const Arc arc)
inherited

Definition at line 352 of file genericBNLearner_inl.h.

{
  constraintMandatoryArcs_.eraseArc(arc);
}

◆ eraseMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 374 of file genericBNLearner_inl.h.

{
  eraseMandatoryArc(Arc(tail, head));
}

◆ eraseMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 363 of file genericBNLearner_inl.h.

{
  eraseMandatoryArc(Arc(idFromName(tail), idFromName(head)));
}

◆ erasePossibleEdge() [1/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const Edge edge)
inherited

Definition at line 278 of file genericBNLearner_inl.h.

{
  constraintPossibleEdges_.eraseEdge(edge);
}

◆ erasePossibleEdge() [2/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 288 of file genericBNLearner_inl.h.

{
  erasePossibleEdge(Edge(tail, head));
}

◆ erasePossibleEdge() [3/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 299 of file genericBNLearner_inl.h.

{
  erasePossibleEdge(Edge(idFromName(tail), idFromName(head)));
}

◆ G2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::G2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the G2 test in the database.

Parameters
id1  first variable
id2  second variable
knowing  list of observed variables
Returns
a std::pair<double,double>

Definition at line 907 of file genericBNLearner.cpp.

{
  createApriori_();
  IndepTestG2<> g2score(scoreDatabase_.parser(), *apriori_, databaseRanges());
  return g2score.statistics(id1, id2, knowing);
}
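Both `chi2()` and `G2()` return a `<statistic, p-value>` pair. A common way to consume it, sketched here as a hypothetical helper (the 0.05 threshold is a conventional significance level, not something aGrUM imposes):

```cpp
#include <utility>

// chi2() and G2() both return a <statistic, p-value> pair. One typical use:
// treat the variables as dependent when the p-value falls below a chosen
// significance level alpha (0.05 is just a conventional default).
bool independent(const std::pair<double, double>& testResult, double alpha = 0.05) {
  return testResult.second >= alpha;  // .second is the p-value
}
```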

◆ G2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::G2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the G2 test in the database.

Parameters
name1  first variable
name2  second variable
knowing  list of observed variables
Returns
a std::pair<double,double>

Definition at line 915 of file genericBNLearner.cpp.

{
  std::vector< NodeId > knowingIds;
  std::transform(knowing.begin(),
                 knowing.end(),
                 std::back_inserter(knowingIds),
                 [this](const std::string& c) -> NodeId { return this->idFromName(c); });
  return G2(idFromName(name1), idFromName(name2), knowingIds);
}

◆ getAprioriType_()

INLINE const std::string & gum::learning::genericBNLearner::getAprioriType_ ( ) const
protectedinherited

returns the type (as a string) of a given apriori

Definition at line 444 of file genericBNLearner_inl.h.

{
  switch (aprioriType_) {
    case AprioriType::NO_APRIORI: return AprioriNoAprioriType::type;

    case AprioriType::SMOOTHING: return AprioriSmoothingType::type;

    case AprioriType::DIRICHLET_FROM_DATABASE: return AprioriDirichletType::type;

    case AprioriType::BDEU: return AprioriBDeuType::type;

    default:
      GUM_ERROR(OperationNotAllowed,
                "genericBNLearner getAprioriType does "
                "not support yet this apriori");
  }
}

◆ hasMissingValues()

INLINE bool gum::learning::genericBNLearner::hasMissingValues ( ) const
inherited

returns true if the learner's database has missing values

Definition at line 259 of file genericBNLearner_inl.h.

{
  return scoreDatabase_.databaseTable().hasMissingValues();
}

◆ history()

const std::vector< double >& gum::learning::genericBNLearner::history ( ) const
inlinevirtualinherited
Exceptions
OperationNotAllowed  if scheme not performed or verbosity = false

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1132 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->history();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}

◆ idFromName()

INLINE NodeId gum::learning::genericBNLearner::idFromName ( const std::string &  var_name) const
inherited

returns the node id corresponding to a variable name

Exceptions
MissingVariableInDatabase  if a variable of the BN is not found in the database.

Definition at line 128 of file genericBNLearner_inl.h.

{
  return scoreDatabase_.idFromName(var_name);
}

◆ isEnabledEpsilon()

bool gum::learning::genericBNLearner::isEnabledEpsilon ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on epsilon is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 930 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->isEnabledEpsilon();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}

◆ isEnabledMaxIter()

bool gum::learning::genericBNLearner::isEnabledMaxIter ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on max iterations is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1018 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->isEnabledMaxIter();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}

◆ isEnabledMaxTime()

bool gum::learning::genericBNLearner::isEnabledMaxTime ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on timeout is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1070 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->isEnabledMaxTime();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}

◆ isEnabledMinEpsilonRate()

bool gum::learning::genericBNLearner::isEnabledMinEpsilonRate ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on epsilon rate is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 974 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->isEnabledMinEpsilonRate();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}

◆ latentVariables()

INLINE const std::vector< Arc > gum::learning::genericBNLearner::latentVariables ( ) const
inherited

get the list of arcs hiding latent variables

Exceptions
OperationNotAllowed  when 3off2 or MIIC is not the selected algorithm

Definition at line 227 of file genericBNLearner_inl.h.

{
  return algoMiic3off2_.latentVariables();
}

◆ learnBN()

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnBN ( )

learn a Bayes Net from a file (must have read the db before)

◆ learnDAG()

DAG gum::learning::genericBNLearner::learnDAG ( )
inherited

learn a structure from a file (must have read the db before)

Definition at line 614 of file genericBNLearner.cpp.

{
  // create the score and the apriori
  createApriori_();
  createScore_();

  return learnDag_();
}

◆ learnDag_()

DAG gum::learning::genericBNLearner::learnDag_ ( )
protectedinherited

returns the DAG learnt

Definition at line 649 of file genericBNLearner.cpp.

649  {
650    // check that the database does not contain any missing value
651    if (scoreDatabase_.databaseTable().hasMissingValues()
652        || ((aprioriDatabase_ != nullptr)
653            && (aprioriType_ == AprioriType::DIRICHLET_FROM_DATABASE)
654            && aprioriDatabase_->databaseTable().hasMissingValues())) {
655      GUM_ERROR(MissingValueInDatabase,
656                "For the moment, the BNLearner is unable to cope "
657                "with missing values in databases");
658    }
659    // add the mandatory arcs to the initial dag and remove the forbidden ones
660    // from the initial graph
661    DAG init_graph = initialDag_;
662 
663    const ArcSet& mandatory_arcs = constraintMandatoryArcs_.arcs();
664 
665    for (const auto& arc: mandatory_arcs) {
666      if (!init_graph.exists(arc.tail())) init_graph.addNodeWithId(arc.tail());
667 
668      if (!init_graph.exists(arc.head())) init_graph.addNodeWithId(arc.head());
669 
670      init_graph.addArc(arc.tail(), arc.head());
671    }
672 
673    const ArcSet& forbidden_arcs = constraintForbiddenArcs_.arcs();
674 
675    for (const auto& arc: forbidden_arcs) {
676      init_graph.eraseArc(arc);
677    }
678 
679    switch (selectedAlgo_) {
680      // ========================================================================
681      case AlgoType::MIIC_THREE_OFF_TWO: {
682        BNLearnerListener listener(this, algoMiic3off2_);
683        // create the mixedGraph and the corrected mutual information
684        MixedGraph mgraph = this->prepareMiic3Off2_();
685 
686        return algoMiic3off2_.learnStructure(*mutualInfo_, mgraph);
687      }
688 
689      // ========================================================================
690      case AlgoType::GREEDY_HILL_CLIMBING: {
691        BNLearnerListener listener(this, greedyHillClimbing_);
692        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
693                                       StructuralConstraintForbiddenArcs,
694                                       StructuralConstraintPossibleEdges,
695                                       StructuralConstraintSliceOrder >
696           gen_constraint;
697        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
698           = constraintMandatoryArcs_;
699        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
700           = constraintForbiddenArcs_;
701        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
702           = constraintPossibleEdges_;
703        static_cast< StructuralConstraintSliceOrder& >(gen_constraint) = constraintSliceOrder_;
704 
705        GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(gen_constraint);
706 
707        StructuralConstraintSetStatic< StructuralConstraintIndegree, StructuralConstraintDAG >
708           sel_constraint;
709        static_cast< StructuralConstraintIndegree& >(sel_constraint) = constraintIndegree_;
710 
711        GraphChangesSelector4DiGraph< decltype(sel_constraint), decltype(op_set) > selector(
712           *score_,
713           sel_constraint,
714           op_set);
715 
716        return greedyHillClimbing_.learnStructure(selector, init_graph);
717      }
718 
719      // ========================================================================
720      case AlgoType::LOCAL_SEARCH_WITH_TABU_LIST: {
721        BNLearnerListener listener(this, localSearchWithTabuList_);
722        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
723                                       StructuralConstraintForbiddenArcs,
724                                       StructuralConstraintPossibleEdges,
725                                       StructuralConstraintSliceOrder >
726           gen_constraint;
727        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
728           = constraintMandatoryArcs_;
729        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
730           = constraintForbiddenArcs_;
731        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
732           = constraintPossibleEdges_;
733        static_cast< StructuralConstraintSliceOrder& >(gen_constraint) = constraintSliceOrder_;
734 
735        GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(gen_constraint);
736 
737        StructuralConstraintSetStatic< StructuralConstraintTabuList,
738                                       StructuralConstraintIndegree,
739                                       StructuralConstraintDAG >
740           sel_constraint;
741        static_cast< StructuralConstraintTabuList& >(sel_constraint) = constraintTabuList_;
742        static_cast< StructuralConstraintIndegree& >(sel_constraint) = constraintIndegree_;
743 
744        GraphChangesSelector4DiGraph< decltype(sel_constraint), decltype(op_set) > selector(
745           *score_,
746           sel_constraint,
747           op_set);
748 
749        return localSearchWithTabuList_.learnStructure(selector, init_graph);
750      }
751 
752      // ========================================================================
753      case AlgoType::K2: {
754        BNLearnerListener listener(this, algoK2_.approximationScheme());
755        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
756                                       StructuralConstraintForbiddenArcs,
757                                       StructuralConstraintPossibleEdges >
758           gen_constraint;
759        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
760           = constraintMandatoryArcs_;
761        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
762           = constraintForbiddenArcs_;
763        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
764           = constraintPossibleEdges_;
765 
766        GraphChangesGenerator4K2< decltype(gen_constraint) > op_set(gen_constraint);
767 
768        // if some mandatory arcs are incompatible with the order, use a DAG
769        // constraint instead of a DiGraph constraint to avoid cycles
770        const ArcSet& mandatory_arcs
771           = static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint).arcs();
772        const Sequence< NodeId >& order = algoK2_.order();
773        bool order_compatible = true;
774 
775        for (const auto& arc: mandatory_arcs) {
776          if (order.pos(arc.tail()) >= order.pos(arc.head())) {
777            order_compatible = false;
778            break;
779          }
780        }
781 
782        if (order_compatible) {
783          StructuralConstraintSetStatic< StructuralConstraintIndegree,
784                                         StructuralConstraintDiGraph >
785             sel_constraint;
786          static_cast< StructuralConstraintIndegree& >(sel_constraint) = constraintIndegree_;
787 
788          GraphChangesSelector4DiGraph< decltype(sel_constraint), decltype(op_set) > selector(
789             *score_,
790             sel_constraint,
791             op_set);
792 
793          return algoK2_.learnStructure(selector, init_graph);
794        } else {
795          StructuralConstraintSetStatic< StructuralConstraintIndegree, StructuralConstraintDAG >
796             sel_constraint;
797          static_cast< StructuralConstraintIndegree& >(sel_constraint) = constraintIndegree_;
798 
799          GraphChangesSelector4DiGraph< decltype(sel_constraint), decltype(op_set) > selector(
800             *score_,
801             sel_constraint,
802             op_set);
803 
804          return algoK2_.learnStructure(selector, init_graph);
805        }
806      }
807 
808      // ========================================================================
809      default:
810        GUM_ERROR(OperationNotAllowed,
811                  "the learnDAG method has not been implemented for this "
812                  "learning algorithm");
813    }
814  }
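In the K2 branch above, the cheaper DiGraph selection constraint is used only when every mandatory arc is compatible with the K2 node order, i.e. pos(tail) < pos(head); otherwise a full DAG constraint guards against cycles. A plain-stdlib sketch of that compatibility test (the `NodeId` alias and the pair-based arc representation here are illustrative stand-ins, not the aGrUM types):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

using NodeId = std::size_t;

// Position of `id` in `order` (assumes `id` is present in the order).
static std::size_t posOf(const std::vector<NodeId>& order, NodeId id) {
  std::size_t p = 0;
  while (order[p] != id) ++p;
  return p;
}

// True iff every mandatory arc (tail, head) respects the topological order,
// mirroring the pos(tail) >= pos(head) loop in learnDag_.
bool orderCompatible(const std::vector<std::pair<NodeId, NodeId>>& mandatoryArcs,
                     const std::vector<NodeId>& order) {
  for (const auto& arc : mandatoryArcs) {
    if (posOf(order, arc.first) >= posOf(order, arc.second)) return false;
  }
  return true;
}
```

With an incompatible arc (one pointing "backwards" in the order), the check fails and the DAG constraint takes over.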

◆ learnMixedStructure()

MixedGraph gum::learning::genericBNLearner::learnMixedStructure ( )
inherited

learn a partial structure from a file (must have read the db before and must have selected miic or 3off2)

Definition at line 596 of file genericBNLearner.cpp.

596  {
597    if (selectedAlgo_ != AlgoType::MIIC_THREE_OFF_TWO) {
598      GUM_ERROR(OperationNotAllowed, "Must be using the miic/3off2 algorithm")
599    }
600    // check that the database does not contain any missing value
601    if (scoreDatabase_.databaseTable().hasMissingValues()) {
602      GUM_ERROR(MissingValueInDatabase,
603                "For the moment, the BNLearner is unable to learn "
604                << "structures with missing values in databases");
605    }
606    BNLearnerListener listener(this, algoMiic3off2_);
607 
608    // create the mixed graph
609    MixedGraph mgraph = this->prepareMiic3Off2_();
610 
611    return algoMiic3off2_.learnMixedStructure(*mutualInfo_, mgraph);
612  }

◆ learnParameters() [1/2]

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnParameters ( const DAG dag,
bool  takeIntoAccountScore = true 
)

learns a BN (its parameters) when its structure is known

Parameters
dag  the structure of the Bayesian network
takeIntoAccountScore  the DAG passed in argument may have been learnt by a structure learning algorithm. In that case, if the score used to learn the structure has an implicit apriori (like K2, which has a 1-smoothing apriori), it is important to also take this implicit apriori into account for parameter learning. By default, if a score exists, the parameters are learnt by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is used
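The effect of a K2-style "1-smoothing apriori" can be illustrated with a hypothetical, self-contained sketch (plain stdlib, not the aGrUM estimator): every count in a CPT column is incremented by one before normalisation.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical sketch: a 1-smoothing apriori turns raw counts N_1..N_r of a
// CPT column into probabilities (N_k + 1) / (N + r), where N = sum of counts.
std::vector<double> smoothedDistribution(const std::vector<std::size_t>& counts) {
  double total = 0.0;
  for (std::size_t c : counts) total += static_cast<double>(c) + 1.0;  // prior adds 1 per value
  std::vector<double> p;
  for (std::size_t c : counts) p.push_back((static_cast<double>(c) + 1.0) / total);
  return p;
}
```

Note that zero counts stay strictly positive, which is why ignoring the implicit apriori of the score at parameter-learning time would yield different CPTs.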

◆ learnParameters() [2/2]

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnParameters ( bool  take_into_account_score = true)
Parameters
take_into_account_score  the DAG of the BN passed in argument to the BNLearner may have been learnt by a structure learning algorithm. In that case, if the score used to learn the structure has an implicit apriori (like K2, which has a 1-smoothing apriori), it is important to also take this implicit apriori into account for parameter learning. By default, if a score exists, the parameters are learnt by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is used
Exceptions
MissingVariableInDatabase  if a variable of the BN is not found in the database
UnknownLabelInDatabase  if a label found in the database does not correspond to the variable

◆ logLikelihood() [1/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< NodeId > &  vars,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Returns the log-likelihood of vars in the database, conditioned on knowing, for the BNLearner.

Parameters
vars  a vector of NodeIds
knowing  an optional vector of conditioning NodeIds
Returns
the log-likelihood, as a double

Definition at line 926 of file genericBNLearner.cpp.

927  {
928    createApriori_();
929    ScoreLog2Likelihood<> ll2score(scoreDatabase_.parser(),
930                                   *apriori_,
931                                   databaseRanges());
932 
933  std::vector< NodeId > total(vars);
934  total.insert(total.end(), knowing.begin(), knowing.end());
935  double LLtotal = ll2score.score(IdCondSet<>(total, false, true));
936  if (knowing.size() == (Size)0) {
937  return LLtotal;
938  } else {
939  double LLknw = ll2score.score(IdCondSet<>(knowing, false, true));
940  return LLtotal - LLknw;
941  }
942  }
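The implementation relies on the chain-rule identity LL(vars | knowing) = LL(vars ∪ knowing) − LL(knowing). A self-contained toy sketch of this identity on a database of discrete records, using empirical distributions and plain STL (not aGrUM's ScoreLog2Likelihood):

```cpp
#include <cmath>
#include <map>
#include <vector>

using Record = std::vector<int>;

// Projects each record onto the given column indices.
static std::vector<Record> project(const std::vector<Record>& db,
                                   const std::vector<int>& cols) {
  std::vector<Record> out;
  for (const auto& r : db) {
    Record p;
    for (int c : cols) p.push_back(r[c]);
    out.push_back(p);
  }
  return out;
}

// Base-2 log-likelihood of the data under its own empirical distribution.
double ll(const std::vector<Record>& db, const std::vector<int>& cols) {
  auto proj = project(db, cols);
  std::map<Record, double> counts;
  for (const auto& p : proj) counts[p] += 1.0;
  double n = static_cast<double>(proj.size()), total = 0.0;
  for (const auto& p : proj) total += std::log2(counts[p] / n);
  return total;
}

// Conditional log-likelihood via the identity used by logLikelihood().
double conditionalLL(const std::vector<Record>& db, std::vector<int> vars,
                     const std::vector<int>& knowing) {
  std::vector<int> total = vars;
  total.insert(total.end(), knowing.begin(), knowing.end());
  return ll(db, total) - ll(db, knowing);
}
```

On two independent uniform binary columns, conditioning changes nothing: LL(X | Y) equals LL(X).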

◆ logLikelihood() [2/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< std::string > &  vars,
const std::vector< std::string > &  knowing = {} 
)
inherited

Returns the log-likelihood of vars in the database, conditioned on knowing, for the BNLearner.

Parameters
vars  a vector of variable names
knowing  an optional vector of conditioning variable names
Returns
the log-likelihood, as a double

Definition at line 944 of file genericBNLearner.cpp.

945  {
946  std::vector< NodeId > ids;
947  std::vector< NodeId > knowingIds;
948 
949  auto mapper = [this](const std::string& c) -> NodeId {
950  return this->idFromName(c);
951  };
952 
953  std::transform(vars.begin(), vars.end(), std::back_inserter(ids), mapper);
954  std::transform(knowing.begin(), knowing.end(), std::back_inserter(knowingIds), mapper);
955 
956  return logLikelihood(ids, knowingIds);
957  }
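This overload only translates names to node ids with std::transform before delegating to the NodeId overload. A minimal stand-in for the mapping step (a plain hash map plays the role of idFromName here, which is an assumption of this sketch):

```cpp
#include <algorithm>
#include <cstddef>
#include <iterator>
#include <string>
#include <unordered_map>
#include <vector>

// Maps variable names to node ids with std::transform + std::back_inserter,
// as in the overload above. Throws std::out_of_range for unknown names.
std::vector<std::size_t>
idsFromNames(const std::vector<std::string>& names,
             const std::unordered_map<std::string, std::size_t>& idOf) {
  std::vector<std::size_t> ids;
  std::transform(names.begin(), names.end(), std::back_inserter(ids),
                 [&idOf](const std::string& n) { return idOf.at(n); });
  return ids;
}
```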

◆ maxIter()

Size gum::learning::genericBNLearner::maxIter ( ) const
inlinevirtualinherited
Returns
the criterion on number of iterations

Implements gum::IApproximationSchemeConfiguration.

Definition at line 995 of file genericBNLearner.h.

995  {
996  if (currentAlgorithm_ != nullptr)
997  return currentAlgorithm_->maxIter();
998  else
999  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1000  }

◆ maxTime()

double gum::learning::genericBNLearner::maxTime ( ) const
inlinevirtualinherited

returns the timeout (in seconds)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1040 of file genericBNLearner.h.

1040  {
1041  if (currentAlgorithm_ != nullptr)
1042  return currentAlgorithm_->maxTime();
1043  else
1044  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1045  }

◆ messageApproximationScheme()

INLINE std::string gum::IApproximationSchemeConfiguration::messageApproximationScheme ( ) const
inherited

Returns the approximation scheme message.

Returns
Returns the approximation scheme message.

Definition at line 38 of file IApproximationSchemeConfiguration_inl.h.

38  {
39    std::stringstream s;
40 
41    switch (stateApproximationScheme()) {
42      case ApproximationSchemeSTATE::Continue:
43        s << "in progress";
44        break;
45 
46      case ApproximationSchemeSTATE::Epsilon:
47        s << "stopped with epsilon=" << epsilon();
48        break;
49 
50      case ApproximationSchemeSTATE::Rate:
51        s << "stopped with rate=" << minEpsilonRate();
52        break;
53 
54      case ApproximationSchemeSTATE::Limit:
55        s << "stopped with max iteration=" << maxIter();
56        break;
57 
58      case ApproximationSchemeSTATE::TimeLimit:
59        s << "stopped with timeout=" << maxTime();
60        break;
61 
62      case ApproximationSchemeSTATE::Stopped:
63        s << "stopped on request";
64        break;
65 
66      case ApproximationSchemeSTATE::Undefined:
67        s << "undefined state";
68        break;
69    };
70 
71    return s.str();
72  }
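The same one-message-per-state pattern can be reproduced with plain stdlib types (the enum and function below are illustrative stand-ins, not gum::IApproximationSchemeConfiguration):

```cpp
#include <cstddef>
#include <sstream>
#include <string>

// Stand-in for the stopping states of an approximation scheme.
enum class SchemeState { Continue, Epsilon, Rate, Limit, TimeLimit, Stopped, Undefined };

// Builds a human-readable message for the given state, mirroring the
// switch in messageApproximationScheme().
std::string schemeMessage(SchemeState st, double eps, double rate,
                          std::size_t maxIt, double maxT) {
  std::stringstream s;
  switch (st) {
    case SchemeState::Continue:  s << "in progress"; break;
    case SchemeState::Epsilon:   s << "stopped with epsilon=" << eps; break;
    case SchemeState::Rate:      s << "stopped with rate=" << rate; break;
    case SchemeState::Limit:     s << "stopped with max iteration=" << maxIt; break;
    case SchemeState::TimeLimit: s << "stopped with timeout=" << maxT; break;
    case SchemeState::Stopped:   s << "stopped on request"; break;
    default:                     s << "undefined state"; break;
  }
  return s.str();
}
```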

◆ minEpsilonRate()

double gum::learning::genericBNLearner::minEpsilonRate ( ) const
inlinevirtualinherited

Get the value of the minimal epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 951 of file genericBNLearner.h.

951  {
952  if (currentAlgorithm_ != nullptr)
953      return currentAlgorithm_->minEpsilonRate();
954    else
955  GUM_ERROR(FatalError, "No chosen algorithm for learning")
956  }

◆ nameFromId()

INLINE const std::string & gum::learning::genericBNLearner::nameFromId ( NodeId  id) const
inherited

returns the variable name corresponding to a given node id

Definition at line 133 of file genericBNLearner_inl.h.

133  {
134  return scoreDatabase_.nameFromId(id);
135  }

◆ names()

INLINE const std::vector< std::string > & gum::learning::genericBNLearner::names ( ) const
inherited

returns the names of the variables in the database

Definition at line 466 of file genericBNLearner_inl.h.

466  {
467  return scoreDatabase_.names();
468  }

◆ nbCols()

INLINE Size gum::learning::genericBNLearner::nbCols ( ) const
inherited
Returns
the number of cols in the database

Definition at line 498 of file genericBNLearner_inl.h.

498 { return scoreDatabase_.domainSizes().size(); }

◆ nbrIterations()

Size gum::learning::genericBNLearner::nbrIterations ( ) const
inlinevirtualinherited
Exceptions
OperationNotAllowed  if the scheme has not been performed

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1124 of file genericBNLearner.h.

1124  {
1125  if (currentAlgorithm_ != nullptr)
1126  return currentAlgorithm_->nbrIterations();
1127  else
1128  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1129  }

◆ nbRows()

INLINE Size gum::learning::genericBNLearner::nbRows ( ) const
inherited
Returns
the number of rows in the database

Definition at line 500 of file genericBNLearner_inl.h.

500 { return scoreDatabase_.databaseTable().size(); }

◆ operator=() [1/2]

template<typename GUM_SCALAR >
BNLearner& gum::learning::BNLearner< GUM_SCALAR >::operator= ( const BNLearner< GUM_SCALAR > &  )

copy operator

◆ operator=() [2/2]

template<typename GUM_SCALAR >
BNLearner& gum::learning::BNLearner< GUM_SCALAR >::operator= ( BNLearner< GUM_SCALAR > &&  )

move operator

◆ periodSize()

Size gum::learning::genericBNLearner::periodSize ( ) const
inlinevirtualinherited

returns the number of samples between two checks of the stopping criteria

Exceptions
OutOfLowerBound  if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1088 of file genericBNLearner.h.

1088  {
1089  if (currentAlgorithm_ != nullptr)
1090  return currentAlgorithm_->periodSize();
1091  else
1092  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1093  }

◆ prepareMiic3Off2_()

MixedGraph gum::learning::genericBNLearner::prepareMiic3Off2_ ( )
protectedinherited

prepares the initial graph for 3off2 or miic

Definition at line 566 of file genericBNLearner.cpp.

566  {
567  // Initialize the mixed graph to the fully connected graph
568  MixedGraph mgraph;
569  for (Size i = 0; i < scoreDatabase_.databaseTable().nbVariables(); ++i) {
570  mgraph.addNodeWithId(i);
571  for (Size j = 0; j < i; ++j) {
572  mgraph.addEdge(j, i);
573  }
574  }
575 
576  // translating the constraints for 3off2 or miic
577  HashTable< std::pair< NodeId, NodeId >, char > initial_marks;
578  const ArcSet& mandatory_arcs = constraintMandatoryArcs_.arcs();
579  for (const auto& arc: mandatory_arcs) {
580  initial_marks.insert({arc.tail(), arc.head()}, '>');
581  }
582 
583  const ArcSet& forbidden_arcs = constraintForbiddenArcs_.arcs();
584  for (const auto& arc: forbidden_arcs) {
585  initial_marks.insert({arc.tail(), arc.head()}, '-');
586  }
587  algoMiic3off2_.addConstraints(initial_marks);
588 
589  // create the mutual entropy object
590    // if ( _mutual_info_ == nullptr) { this->useNMLCorrection(); }
591    createCorrectedMutualInformation_();
592 
593  return mgraph;
594  }
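Conceptually, prepareMiic3Off2_ builds a complete undirected graph over the n database variables and a table of initial marks: '>' for mandatory arcs and '-' for forbidden ones. A plain-STL sketch of those two steps (std::set/std::map stand in for gum::MixedGraph and gum::HashTable):

```cpp
#include <cstddef>
#include <map>
#include <set>
#include <utility>

using Node = std::size_t;
using Edge = std::pair<Node, Node>;

// Complete undirected graph over n nodes, stored as (j, i) with j < i,
// the same orientation-free convention as the loop above.
std::set<Edge> completeGraph(std::size_t n) {
  std::set<Edge> edges;
  for (Node i = 0; i < n; ++i)
    for (Node j = 0; j < i; ++j)
      edges.insert({j, i});
  return edges;
}

// Initial marks for the orientation phase: mandatory arcs get '>',
// forbidden arcs get '-'.
std::map<Edge, char> initialMarks(const std::set<Edge>& mandatory,
                                  const std::set<Edge>& forbidden) {
  std::map<Edge, char> marks;
  for (const auto& a : mandatory) marks[a] = '>';
  for (const auto& a : forbidden) marks[a] = '-';
  return marks;
}
```

A complete graph on n nodes has n(n−1)/2 edges, so the search starts from every possible adjacency and only removes edges.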

◆ rawPseudoCount() [1/2]

std::vector< double > gum::learning::genericBNLearner::rawPseudoCount ( const std::vector< NodeId > &  vars)
inherited

Return the pseudo-counts of the NodeIds vars in the database, as a raw array.

Parameters
vars  a vector of NodeIds
Returns
a std::vector<double> containing the contingency table

Definition at line 959 of file genericBNLearner.cpp.

959  {
960    Potential< double > res;
961 
962    createApriori_();
963    PseudoCount<> count(scoreDatabase_.parser(), *apriori_, databaseRanges());
964    return count.get(vars);
965  }
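The returned vector is a flat contingency table whose cells hold database counts plus the prior's weight ("count in the database + prior"). A hypothetical stand-in for two discrete variables (the cell index x + domX * y is an assumption of this sketch, not necessarily aGrUM's cell ordering):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Flat pseudo-count table for two discrete variables with domain sizes
// domX and domY: every cell starts at the prior weight, then observed
// (x, y) records are tallied into cell x + domX * y.
std::vector<double> rawCounts(const std::vector<std::pair<int, int>>& records,
                              int domX, int domY, double prior) {
  std::vector<double> table(static_cast<std::size_t>(domX * domY), prior);
  for (const auto& r : records) table[r.first + domX * r.second] += 1.0;
  return table;
}
```

With a prior weight of 0 this is a plain contingency table; a smoothing apriori simply shifts every cell up by its weight.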

◆ rawPseudoCount() [2/2]

std::vector< double > gum::learning::genericBNLearner::rawPseudoCount ( const std::vector< std::string > &  vars)
inherited

Return the pseudo-counts of vars in the database, as a raw array.

Parameters
vars  a vector of variable names
Returns
a std::vector<double> containing the contingency table

Definition at line 968 of file genericBNLearner.cpp.

968  {
969  std::vector< NodeId > ids;
970 
971  auto mapper = [this](const std::string& c) -> NodeId {
972  return this->idFromName(c);
973  };
974 
975  std::transform(vars.begin(), vars.end(), std::back_inserter(ids), mapper);
976 
977  return rawPseudoCount(ids);
978  }

◆ readFile_()

DatabaseTable gum::learning::genericBNLearner::readFile_ ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols 
)
staticprotectedinherited

reads a CSV file and returns a DatabaseTable

Definition at line 402 of file genericBNLearner.cpp.

403  {
404  // get the extension of the file
405  checkFileName_(filename);
406 
407  DBInitializerFromCSV<> initializer(filename);
408 
409  const auto& var_names = initializer.variableNames();
410  const std::size_t nb_vars = var_names.size();
411 
412  DBTranslatorSet<> translator_set;
413  DBTranslator4LabelizedVariable<> translator(missing_symbols);
414  for (std::size_t i = 0; i < nb_vars; ++i) {
415  translator_set.insertTranslator(translator, i);
416  }
417 
418  DatabaseTable<> database(missing_symbols, translator_set);
419  database.setVariableNames(initializer.variableNames());
420  initializer.fillDatabase(database);
421 
422  database.reorder();
423 
424  return database;
425  }
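A much-simplified stdlib sketch of what reading such a file involves: the header row becomes the variable names and cells equal to a missing symbol such as "?" are flagged (aGrUM's DBInitializerFromCSV / DBTranslator machinery also handles typed translators and reordering, which this toy ignores):

```cpp
#include <sstream>
#include <string>
#include <vector>

// Toy parsed database: header names, string rows, and a missing-value flag.
struct TinyCsv {
  std::vector<std::string> names;
  std::vector<std::vector<std::string>> rows;
  bool hasMissing = false;
};

// Splits comma-separated lines; the first line is the header. Cells equal
// to missingSymbol (e.g. "?") set the hasMissing flag.
TinyCsv parseCsv(const std::string& text, const std::string& missingSymbol) {
  TinyCsv db;
  std::istringstream in(text);
  std::string line;
  bool header = true;
  while (std::getline(in, line)) {
    std::vector<std::string> cells;
    std::istringstream ls(line);
    std::string cell;
    while (std::getline(ls, cell, ',')) {
      if (!header && cell == missingSymbol) db.hasMissing = true;
      cells.push_back(cell);
    }
    if (header) { db.names = cells; header = false; }
    else        { db.rows.push_back(cells); }
  }
  return db;
}
```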

◆ recordWeight()

INLINE double gum::learning::genericBNLearner::recordWeight ( const std::size_t  i) const
inherited

returns the weight of the ith record

Exceptions
OutOfBounds  if i is outside the set of indices of the records

Definition at line 148 of file genericBNLearner_inl.h.

148  {
149  return scoreDatabase_.weight(i);
150  }

◆ setCurrentApproximationScheme()

INLINE void gum::learning::genericBNLearner::setCurrentApproximationScheme ( const ApproximationScheme approximationScheme)
inlineinherited

sets the approximation scheme through which the learner's signals are distributed

Definition at line 871 of file genericBNLearner.h.

871  {
872  currentAlgorithm_ = approximationScheme;
873  }

◆ setDatabaseWeight()

INLINE void gum::learning::genericBNLearner::setDatabaseWeight ( const double  new_weight)
inherited

assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight

Definition at line 138 of file genericBNLearner_inl.h.

138  {
139  scoreDatabase_.setDatabaseWeight(new_weight);
140  }
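Since the weight is spread uniformly, each of the nbRows rows receives new_weight / nbRows; a one-function sketch of that rule with plain stdlib:

```cpp
#include <cstddef>
#include <vector>

// Uniform re-weighting: every row gets new_weight / nbRows, so the
// weights sum to new_weight.
std::vector<double> uniformRowWeights(std::size_t nbRows, double newWeight) {
  return std::vector<double>(nbRows, newWeight / static_cast<double>(nbRows));
}
```

Setting new_weight to the number of rows recovers the default weight of 1 per record.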

◆ setEpsilon()

void gum::learning::genericBNLearner::setEpsilon ( double  eps)
inlinevirtualinherited

Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound  if eps < 0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 897 of file genericBNLearner.h.

897  {
898    algoK2_.approximationScheme().setEpsilon(eps);
899    greedyHillClimbing_.setEpsilon(eps);
900    localSearchWithTabuList_.setEpsilon(eps);
901    Dag2BN_.setEpsilon(eps);
902  };

◆ setForbiddenArcs()

INLINE void gum::learning::genericBNLearner::setForbiddenArcs ( const ArcSet set)
inherited

assign a set of forbidden arcs

Definition at line 305 of file genericBNLearner_inl.h.

305  {
306    constraintForbiddenArcs_.setArcs(set);
307  }

◆ setInitialDAG()

INLINE void gum::learning::genericBNLearner::setInitialDAG ( const DAG dag)
inherited

sets an initial DAG structure

Definition at line 156 of file genericBNLearner_inl.h.

156 { initialDag_ = dag; }

◆ setMandatoryArcs()

INLINE void gum::learning::genericBNLearner::setMandatoryArcs ( const ArcSet set)
inherited

assign a set of mandatory arcs

Definition at line 342 of file genericBNLearner_inl.h.

342  {
343    constraintMandatoryArcs_.setArcs(set);
344  }

◆ setMaxIndegree()

INLINE void gum::learning::genericBNLearner::setMaxIndegree ( Size  max_indegree)
inherited

sets the max indegree

Definition at line 195 of file genericBNLearner_inl.h.

195  {
196  constraintIndegree_.setMaxIndegree(max_indegree);
197  }

◆ setMaxIter()

void gum::learning::genericBNLearner::setMaxIter ( Size  max)
inlinevirtualinherited

stopping criterion on the number of iterations. If the criterion was disabled, it will be enabled.

Parameters
max  the maximum number of iterations
Exceptions
OutOfLowerBound  if max <= 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 987 of file genericBNLearner.h.

987  {
988    algoK2_.approximationScheme().setMaxIter(max);
989    greedyHillClimbing_.setMaxIter(max);
990    localSearchWithTabuList_.setMaxIter(max);
991    Dag2BN_.setMaxIter(max);
992  };

◆ setMaxTime()

void gum::learning::genericBNLearner::setMaxTime ( double  timeout)
inlinevirtualinherited

stopping criterion on timeout. If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound  if timeout <= 0.0 (timeout is a time in seconds, as a double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1032 of file genericBNLearner.h.

1032  {
1036  Dag2BN_.setMaxTime(timeout);
1037  }

◆ setMinEpsilonRate()

void gum::learning::genericBNLearner::setMinEpsilonRate ( double  rate)
inlinevirtualinherited

Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound: if rate < 0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 943 of file genericBNLearner.h.

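The idea of the criterion can be shown on a toy sequence; the sketch below uses the absolute change |f(t+1)-f(t)| as a simplified stand-in for the rate in the formula above, and is illustrative only, not aGrUM code:

```cpp
#include <cmath>
#include <cstddef>
#include <vector>

// Stop iterating as soon as the change between successive values of the
// approximated f(t) shrinks below a minimal rate; return the iteration at
// which the criterion fires (or the sequence length if it never does).
std::size_t iterationsUntilRate(const std::vector<double>& f, double min_rate) {
  for (std::size_t t = 1; t < f.size(); ++t) {
    if (std::fabs(f[t] - f[t - 1]) < min_rate) return t;  // criterion met
  }
  return f.size();  // criterion never met
}
```

A larger `min_rate` makes the scheme stop earlier, at the price of a coarser approximation.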

◆ setPeriodSize()

void gum::learning::genericBNLearner::setPeriodSize ( Size  p)
inlinevirtualinherited

how many samples between two tests of the stopping criteria

Exceptions
OutOfLowerBound: if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1081 of file genericBNLearner.h.


◆ setPossibleEdges()

INLINE void gum::learning::genericBNLearner::setPossibleEdges ( const EdgeSet set)
inherited

assign a set of possible edges

Warning
Once at least one possible edge has been defined, every edge not in the set is no longer possible

Definition at line 264 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

264  {
265  constraintPossibleEdges_.setEdges(set);
266  }
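The semantics described in the warning can be summarised in one predicate: while no possible edge has been declared, every edge is allowed; once the set is non-empty, only its members are. A sketch with hypothetical types (`Edge`, `edgeIsPossible` are names introduced here):

```cpp
#include <set>
#include <utility>

using Edge = std::pair<int, int>;

// An edge is admissible if no possible edge was declared (empty set) or if
// it belongs to the declared set.
bool edgeIsPossible(const std::set<Edge>& possible, const Edge& e) {
  return possible.empty() || possible.count(e) > 0;
}
```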

◆ setPossibleSkeleton()

INLINE void gum::learning::genericBNLearner::setPossibleSkeleton ( const UndiGraph skeleton)
inherited

assign a possible skeleton: only the edges of the given undirected graph remain possible

Warning
Once at least one possible edge has been defined, every edge not in the set is no longer possible

Definition at line 268 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

268  {
269  setPossibleEdges(g.edges());
270  }

◆ setRecordWeight()

INLINE void gum::learning::genericBNLearner::setRecordWeight ( const std::size_t  i,
const double  weight 
)
inherited

sets the weight of the ith record of the database

assigns a new weight to the ith row of the learning database

Exceptions
OutOfBounds: if i is outside the set of indices of the records, or if the weight is negative

Definition at line 143 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

143  {
144  scoreDatabase_.setWeight(i, new_weight);
145  }
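Per-record weights let one row count more or less than another when building the counting tables. The standalone sketch below mirrors the two documented checks (index in range, weight non-negative); `WeightedRecords` is a hypothetical stand-in, not aGrUM's Database class:

```cpp
#include <cstddef>
#include <stdexcept>
#include <vector>

struct WeightedRecords {
  std::vector<double> weights;  // one weight per row, default 1.0

  explicit WeightedRecords(std::size_t nb_rows) : weights(nb_rows, 1.0) {}

  void setWeight(std::size_t i, double weight) {
    if (i >= weights.size()) throw std::out_of_range("no such record");
    if (weight < 0.0) throw std::out_of_range("weight must be non-negative");
    weights[i] = weight;
  }

  // total weight, i.e. the effective number of rows seen by the counters
  double totalWeight() const {
    double s = 0.0;
    for (double w : weights) s += w;
    return s;
  }
};
```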

◆ setSliceOrder() [1/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const NodeProperty< NodeId > &  slice_order)
inherited

sets a partial order on the nodes

Parameters
slice_order: a NodeProperty giving the rank (priority) of each node in the partial order

Definition at line 379 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

379  {
380  constraintSliceOrder_ = StructuralConstraintSliceOrder(slice_order);
381  }

◆ setSliceOrder() [2/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const std::vector< std::vector< std::string > > &  slices)
inherited

sets a partial order on the nodes

Parameters
slices: the list of lists of variable names

Definition at line 384 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

384  {
385  NodeProperty< NodeId > slice_order;
386  NodeId rank = 0;
387  for (const auto& slice: slices) {
388  for (const auto& name: slice) {
389  slice_order.insert(idFromName(name), rank);
390  }
391  rank++;
392  }
393  setSliceOrder(slice_order);
394  }
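The rank assignment performed by this overload can be reproduced standalone: every variable in `slices[r]` receives rank r (the usual reading of such a partial order is that arcs may only go from lower-ranked to equal- or higher-ranked nodes, as in a 2TBN). `ranksFromSlices` is a name introduced here for illustration:

```cpp
#include <map>
#include <string>
#include <vector>

// Mirror of the loop in the listing above: slice index == rank.
std::map<std::string, int>
ranksFromSlices(const std::vector<std::vector<std::string>>& slices) {
  std::map<std::string, int> rank;
  int r = 0;
  for (const auto& slice : slices) {
    for (const auto& name : slice) rank[name] = r;
    ++r;
  }
  return rank;
}
```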

◆ setVerbosity()

void gum::learning::genericBNLearner::setVerbosity ( bool  v)
inlinevirtualinherited

sets the verbosity on (true) or off (false)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1098 of file genericBNLearner.h.

1098  {
1102  Dag2BN_.setVerbosity(v);
1103  };

◆ stateApproximationScheme()

ApproximationSchemeSTATE gum::learning::genericBNLearner::stateApproximationScheme ( ) const
inlinevirtualinherited

returns the approximation scheme state

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1116 of file genericBNLearner.h.

1116  {
1117  if (currentAlgorithm_ != nullptr)
1118  return currentAlgorithm_->stateApproximationScheme();
1119  else
1120  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1121  }

◆ use3off2()

INLINE void gum::learning::genericBNLearner::use3off2 ( )
inherited

indicate that we wish to use 3off2

Definition at line 200 of file genericBNLearner_inl.h.


◆ useAprioriBDeu()

INLINE void gum::learning::genericBNLearner::useAprioriBDeu ( double  weight = 1)
inherited

use the BDeu apriori

The BDeu apriori adds weight to all the cells of the counting tables. In other words, it acts as if weight additional rows with equally probable values had been added to the database.

Definition at line 433 of file genericBNLearner_inl.h.

433  {
434  if (weight < 0) { GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive") }
437  _setAprioriWeight_(weight);
440  }

◆ useAprioriDirichlet()

INLINE void gum::learning::genericBNLearner::useAprioriDirichlet ( const std::string &  filename,
double  weight = 1 
)
inherited

use the Dirichlet apriori

Definition at line 421 of file genericBNLearner_inl.h.

421  {
422  if (weight < 0) { GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive") }
424  aprioriDbname_ = filename;
426  _setAprioriWeight_(weight);
429  }

◆ useAprioriSmoothing()

INLINE void gum::learning::genericBNLearner::useAprioriSmoothing ( double  weight = 1)
inherited

use the apriori smoothing

Parameters
weight: pass a weight as argument if you wish to assign a weight to the smoothing; otherwise the current weight of the genericBNLearner will be used

Definition at line 411 of file genericBNLearner_inl.h.

411  {
412  if (weight < 0) { GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive") }
415  _setAprioriWeight_(weight);
418  }

◆ useCrossValidationFold()

std::pair< std::size_t, std::size_t > gum::learning::genericBNLearner::useCrossValidationFold ( const std::size_t  learning_fold,
const std::size_t  k_fold 
)
inherited

sets the ranges of rows to be used for cross-validation learning

When applied on (x,k), the method indicates to the subsequent learnings that they should be performed on the xth fold in a k-fold cross-validation context. For instance, if a database has 1000 rows, and if we perform a 10-fold cross-validation, then, the first learning fold (learning_fold=0) corresponds to rows interval [100,1000) and the test dataset corresponds to [0,100). The second learning fold (learning_fold=1) is [0,100) U [200,1000) and the corresponding test dataset is [100,200).

Parameters
learning_fold: a number indicating the set of rows used for learning. If N denotes the size of the database and k_fold the number of folds in the cross-validation, then the set of rows used for testing is [learning_fold * N / k_fold, (learning_fold+1) * N / k_fold), and the learning database is its complement in the database.
k_fold: the value of "k" in k-fold cross-validation
Returns
a pair [x,y) of rows' indices that corresponds to the indices of rows in the original database that constitute the test dataset
Exceptions
OutOfBounds: raised if k_fold is equal to 0, if learning_fold is greater than or equal to k_fold, or if k_fold is greater than or equal to the size of the database

Definition at line 846 of file genericBNLearner.cpp.

847  {
848  if (k_fold == 0) { GUM_ERROR(OutOfBounds, "K-fold cross validation with k=0 is forbidden") }
849 
850  if (learning_fold >= k_fold) {
851  GUM_ERROR(OutOfBounds,
852  "In " << k_fold << "-fold cross validation, the learning "
853  << "fold should be strictly lower than " << k_fold
854  << " but, here, it is equal to " << learning_fold);
855  }
856 
857  const std::size_t db_size = scoreDatabase_.databaseTable().nbRows();
858  if (k_fold >= db_size) {
859  GUM_ERROR(OutOfBounds,
860  "In " << k_fold << "-fold cross validation, the database's "
861  << "size should be strictly greater than " << k_fold
862  << " but, here, the database has only " << db_size << " rows");
863  }
864 
865  // create the ranges of rows of the test database
866  const std::size_t foldSize = db_size / k_fold;
867  const std::size_t unfold_deb = learning_fold * foldSize;
868  const std::size_t unfold_end = unfold_deb + foldSize;
869 
870  ranges_.clear();
871  if (learning_fold == std::size_t(0)) {
872  ranges_.push_back(std::pair< std::size_t, std::size_t >(unfold_end, db_size));
873  } else {
874  ranges_.push_back(std::pair< std::size_t, std::size_t >(std::size_t(0), unfold_deb));
875 
876  if (learning_fold != k_fold - 1) {
877  ranges_.push_back(std::pair< std::size_t, std::size_t >(unfold_end, db_size));
878  }
879  }
880 
881  return std::pair< std::size_t, std::size_t >(unfold_deb, unfold_end);
882  }
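The fold arithmetic above can be reproduced standalone. The sketch below mirrors the computation in the listing (integer division for the fold size, complement ranges for learning); `Range` and `testFold` are names introduced here, not aGrUM API:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

using Range = std::pair<std::size_t, std::size_t>;

// For fold `fold` of `k` over a database of db_size rows, return the test
// interval [deb, end) and fill `learning_ranges` with its complement.
Range testFold(std::size_t fold, std::size_t k, std::size_t db_size,
               std::vector<Range>& learning_ranges) {
  const std::size_t fold_size = db_size / k;
  const std::size_t deb = fold * fold_size;
  const std::size_t end = deb + fold_size;

  learning_ranges.clear();
  if (fold == 0) {
    learning_ranges.push_back({end, db_size});
  } else {
    learning_ranges.push_back({0, deb});
    if (fold != k - 1) learning_ranges.push_back({end, db_size});
  }
  return {deb, end};
}
```

For 1000 rows and 10 folds, fold 0 tests on [0,100) and learns on [100,1000); fold 1 tests on [100,200) and learns on [0,100) U [200,1000), exactly as in the example above.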

◆ useDatabaseRanges()

template<template< typename > class XALLOC>
void gum::learning::genericBNLearner::useDatabaseRanges ( const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &  new_ranges)
inherited

use a new set of database rows' ranges to perform learning

Parameters
ranges: a set of pairs {(X1,Y1),...,(Xn,Yn)} of database row indices. Subsequent learnings are then performed only on the union of the rows [Xi,Yi), i in {1,...,n}. This is useful, e.g., when performing cross-validation tasks, in which part of the database should be ignored. An empty set of ranges is equivalent to an interval [X,Y) ranging over the whole database.

Definition at line 97 of file genericBNLearner_tpl.h.

99  {
100  // use a score to detect whether the ranges are ok
101  ScoreLog2Likelihood<> score(scoreDatabase_.parser(), *noApriori_);
102  score.setRanges(new_ranges);
103  ranges_ = score.ranges();
104  }

◆ useEM()

INLINE void gum::learning::genericBNLearner::useEM ( const double  epsilon)
inherited

use the EM algorithm to learn parameters

if epsilon=0, EM is not used

Definition at line 256 of file genericBNLearner_inl.h.

256 { epsilonEM_ = epsilon; }

◆ useGreedyHillClimbing()

INLINE void gum::learning::genericBNLearner::useGreedyHillClimbing ( )
inherited

indicate that we wish to use a greedy hill climbing algorithm

Definition at line 244 of file genericBNLearner_inl.h.


◆ useK2() [1/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const Sequence< NodeId > &  order)
inherited

indicate that we wish to use K2

Definition at line 232 of file genericBNLearner_inl.h.

232  {
234  algoK2_.setOrder(order);
235  }

◆ useK2() [2/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const std::vector< NodeId > &  order)
inherited

indicate that we wish to use K2

Definition at line 238 of file genericBNLearner_inl.h.

238  {
240  algoK2_.setOrder(order);
241  }
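K2 searches under a fixed variable order: a node may only take parents among the nodes that precede it in the order. A minimal standalone check of that constraint (illustrative, `k2ArcAllowed` is a name introduced here):

```cpp
#include <vector>

// An arc parent -> child is compatible with a K2 order iff both nodes
// appear in the order and the parent comes strictly before the child.
bool k2ArcAllowed(const std::vector<int>& order, int parent, int child) {
  int pos_parent = -1, pos_child = -1;
  for (int i = 0; i < static_cast<int>(order.size()); ++i) {
    if (order[i] == parent) pos_parent = i;
    if (order[i] == child) pos_child = i;
  }
  return pos_parent >= 0 && pos_child >= 0 && pos_parent < pos_child;
}
```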

◆ useLocalSearchWithTabuList()

INLINE void gum::learning::genericBNLearner::useLocalSearchWithTabuList ( Size  tabu_size = 100,
Size  nb_decrease = 2 
)
inherited

indicate that we wish to use a local search with tabu list

Parameters
tabu_size: the size of the tabu list
nb_decrease: the maximum number of consecutive score-decreasing changes that we allow to apply

Definition at line 249 of file genericBNLearner_inl.h.

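The mechanism behind the tabu list can be sketched as a bounded FIFO of recently applied moves that may not be revisited while they remain in the list. This is a generic illustration of the technique (moves are plain ints here), not aGrUM's StructuralConstraintTabuList:

```cpp
#include <algorithm>
#include <cstddef>
#include <deque>

struct TabuList {
  std::size_t capacity;
  std::deque<int> moves;

  bool isTabu(int move) const {
    return std::find(moves.begin(), moves.end(), move) != moves.end();
  }
  void record(int move) {
    moves.push_back(move);
    if (moves.size() > capacity) moves.pop_front();  // oldest move expires
  }
};
```

A larger `tabu_size` forbids more recent moves at once, pushing the search further away from already-visited structures.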

◆ useMDLCorrection()

INLINE void gum::learning::genericBNLearner::useMDLCorrection ( )
inherited

indicate that we wish to use the MDL correction for 3off2 and MIIC


Exceptions
OperationNotAllowed: when neither 3off2 nor MIIC is the selected algorithm

Definition at line 217 of file genericBNLearner_inl.h.


◆ useMIIC()

INLINE void gum::learning::genericBNLearner::useMIIC ( )
inherited

indicate that we wish to use MIIC

Definition at line 206 of file genericBNLearner_inl.h.


◆ useNMLCorrection()

INLINE void gum::learning::genericBNLearner::useNMLCorrection ( )
inherited

indicate that we wish to use the NML correction for 3off2 and MIIC


Exceptions
OperationNotAllowed: when neither 3off2 nor MIIC is the selected algorithm

Definition at line 212 of file genericBNLearner_inl.h.


◆ useNoApriori()

INLINE void gum::learning::genericBNLearner::useNoApriori ( )
inherited

use no apriori

Definition at line 405 of file genericBNLearner_inl.h.

405  {
406  aprioriType_ = AprioriType::NO_APRIORI;
407  checkScoreAprioriCompatibility();
408  }

◆ useNoCorrection()

INLINE void gum::learning::genericBNLearner::useNoCorrection ( )
inherited

indicate that we wish to use the NoCorr correction for 3off2 and MIIC


Exceptions
OperationNotAllowed: when neither 3off2 nor MIIC is the selected algorithm

Definition at line 222 of file genericBNLearner_inl.h.


◆ useScoreAIC()

INLINE void gum::learning::genericBNLearner::useScoreAIC ( )
inherited

indicate that we wish to use an AIC score

Definition at line 159 of file genericBNLearner_inl.h.


◆ useScoreBD()

INLINE void gum::learning::genericBNLearner::useScoreBD ( )
inherited

indicate that we wish to use a BD score

Definition at line 165 of file genericBNLearner_inl.h.


◆ useScoreBDeu()

INLINE void gum::learning::genericBNLearner::useScoreBDeu ( )
inherited

indicate that we wish to use a BDeu score

Definition at line 171 of file genericBNLearner_inl.h.


◆ useScoreBIC()

INLINE void gum::learning::genericBNLearner::useScoreBIC ( )
inherited

indicate that we wish to use a BIC score

Definition at line 177 of file genericBNLearner_inl.h.

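BIC penalises the log-likelihood of the data by half the logarithm of the sample size per free parameter. The sketch below assumes the textbook definition and base-2 logs (aGrUM's likelihood score is named ScoreLog2Likelihood, which suggests base 2, but treat that as an assumption):

```cpp
#include <cmath>
#include <cstddef>

// BIC(B | D) = LL_2(D | B) - (log2 N / 2) * dim(B), where dim(B) is the
// number of free parameters of the network and N the database size.
double bicScore(double log2_likelihood, std::size_t nb_params, std::size_t db_size) {
  return log2_likelihood
         - 0.5 * std::log2(static_cast<double>(db_size))
               * static_cast<double>(nb_params);
}
```

For example, with LL = -100, 5 free parameters and 1024 rows, the penalty is 0.5 * 10 * 5 = 25, giving a score of -125.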

◆ useScoreK2()

INLINE void gum::learning::genericBNLearner::useScoreK2 ( )
inherited

indicate that we wish to use a K2 score

Definition at line 183 of file genericBNLearner_inl.h.


◆ useScoreLog2Likelihood()

INLINE void gum::learning::genericBNLearner::useScoreLog2Likelihood ( )
inherited

indicate that we wish to use a Log2Likelihood score

Definition at line 189 of file genericBNLearner_inl.h.


◆ verbosity()

bool gum::learning::genericBNLearner::verbosity ( ) const
inlinevirtualinherited

returns true if verbosity is enabled

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1105 of file genericBNLearner.h.

1105  {
1106  if (currentAlgorithm_ != nullptr)
1107  return currentAlgorithm_->verbosity();
1108  else
1109  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1110  }

Member Data Documentation

◆ algoK2_

K2 gum::learning::genericBNLearner::algoK2_
protectedinherited

the K2 algorithm

Definition at line 795 of file genericBNLearner.h.

◆ algoMiic3off2_

Miic gum::learning::genericBNLearner::algoMiic3off2_
protectedinherited

the MIIC or 3off2 algorithm

Definition at line 798 of file genericBNLearner.h.

◆ apriori_

Apriori* gum::learning::genericBNLearner::apriori_ {nullptr}
protectedinherited

the apriori used

Definition at line 766 of file genericBNLearner.h.

◆ aprioriDatabase_

Database* gum::learning::genericBNLearner::aprioriDatabase_ {nullptr}
protectedinherited

the database used by the Dirichlet a priori

Definition at line 820 of file genericBNLearner.h.

◆ aprioriDbname_

std::string gum::learning::genericBNLearner::aprioriDbname_
protectedinherited

the filename for the Dirichlet a priori, if any

Definition at line 823 of file genericBNLearner.h.

◆ aprioriType_

AprioriType gum::learning::genericBNLearner::aprioriType_ {AprioriType::NO_APRIORI}
protectedinherited

the a priori selected for the score and parameters

Definition at line 763 of file genericBNLearner.h.

◆ aprioriWeight_

double gum::learning::genericBNLearner::aprioriWeight_ {1.0f}
protectedinherited

the weight of the apriori

Definition at line 771 of file genericBNLearner.h.

◆ constraintForbiddenArcs_

StructuralConstraintForbiddenArcs gum::learning::genericBNLearner::constraintForbiddenArcs_
protectedinherited

the constraint on forbidden arcs

Definition at line 783 of file genericBNLearner.h.

◆ constraintIndegree_

StructuralConstraintIndegree gum::learning::genericBNLearner::constraintIndegree_
protectedinherited

the constraint for indegrees

Definition at line 777 of file genericBNLearner.h.

◆ constraintMandatoryArcs_

StructuralConstraintMandatoryArcs gum::learning::genericBNLearner::constraintMandatoryArcs_
protectedinherited

the constraint on mandatory arcs

Definition at line 789 of file genericBNLearner.h.

◆ constraintPossibleEdges_

StructuralConstraintPossibleEdges gum::learning::genericBNLearner::constraintPossibleEdges_
protectedinherited

the constraint on possible Edges

Definition at line 786 of file genericBNLearner.h.

◆ constraintSliceOrder_

StructuralConstraintSliceOrder gum::learning::genericBNLearner::constraintSliceOrder_
protectedinherited

the constraint for 2TBNs

Definition at line 774 of file genericBNLearner.h.

◆ constraintTabuList_

StructuralConstraintTabuList gum::learning::genericBNLearner::constraintTabuList_
protectedinherited

the constraint for tabu lists

Definition at line 780 of file genericBNLearner.h.

◆ currentAlgorithm_

const ApproximationScheme* gum::learning::genericBNLearner::currentAlgorithm_ {nullptr}
protectedinherited

Definition at line 829 of file genericBNLearner.h.

◆ Dag2BN_

DAG2BNLearner gum::learning::genericBNLearner::Dag2BN_
protectedinherited

the parametric EM

Definition at line 805 of file genericBNLearner.h.

◆ epsilonEM_

double gum::learning::genericBNLearner::epsilonEM_ {0.0}
protectedinherited

epsilon for EM; if epsilon = 0.0, no EM

Definition at line 757 of file genericBNLearner.h.

◆ greedyHillClimbing_

GreedyHillClimbing gum::learning::genericBNLearner::greedyHillClimbing_
protectedinherited

the greedy hill climbing algorithm

Definition at line 808 of file genericBNLearner.h.

◆ initialDag_

DAG gum::learning::genericBNLearner::initialDag_
protectedinherited

an initial DAG given to learners

Definition at line 826 of file genericBNLearner.h.

◆ kmode3Off2_

CorrectedMutualInformation<>::KModeTypes gum::learning::genericBNLearner::kmode3Off2_
protectedinherited
Initial value:

the penalty used in 3off2

Definition at line 801 of file genericBNLearner.h.

◆ localSearchWithTabuList_

LocalSearchWithTabuList gum::learning::genericBNLearner::localSearchWithTabuList_
protectedinherited

the local search with tabu list algorithm

Definition at line 811 of file genericBNLearner.h.

◆ mutualInfo_

CorrectedMutualInformation* gum::learning::genericBNLearner::mutualInfo_ {nullptr}
protectedinherited

the selected correction for 3off2 and miic

Definition at line 760 of file genericBNLearner.h.

◆ noApriori_

AprioriNoApriori* gum::learning::genericBNLearner::noApriori_ {nullptr}
protectedinherited

Definition at line 768 of file genericBNLearner.h.

◆ onProgress

Signaler3< Size, double, double > gum::IApproximationSchemeConfiguration::onProgress
inherited

Progression, error and time.

Definition at line 58 of file IApproximationSchemeConfiguration.h.

◆ onStop

Signaler1< std::string > gum::IApproximationSchemeConfiguration::onStop
inherited

Criteria messageApproximationScheme.

Definition at line 61 of file IApproximationSchemeConfiguration.h.

◆ paramEstimatorType_

ParamEstimatorType gum::learning::genericBNLearner::paramEstimatorType_ {ParamEstimatorType::ML}
protectedinherited

the type of the parameter estimator

Definition at line 754 of file genericBNLearner.h.

◆ ranges_

std::vector< std::pair< std::size_t, std::size_t > > gum::learning::genericBNLearner::ranges_
protectedinherited

the set of rows' ranges within the database in which learning is done

Definition at line 817 of file genericBNLearner.h.

◆ score_

Score* gum::learning::genericBNLearner::score_ {nullptr}
protectedinherited

the score used

Definition at line 751 of file genericBNLearner.h.

◆ scoreDatabase_

Database gum::learning::genericBNLearner::scoreDatabase_
protectedinherited

the database to be used by the scores and parameter estimators

Definition at line 814 of file genericBNLearner.h.

◆ scoreType_

ScoreType gum::learning::genericBNLearner::scoreType_ {ScoreType::BDeu}
protectedinherited

the score selected for learning

Definition at line 748 of file genericBNLearner.h.

◆ selectedAlgo_

AlgoType gum::learning::genericBNLearner::selectedAlgo_ {AlgoType::GREEDY_HILL_CLIMBING}
protectedinherited

the selected learning algorithm

Definition at line 792 of file genericBNLearner.h.


The documentation for this class was generated from the following file: