aGrUM  0.16.0
gum::learning::genericBNLearner Class Reference

A pack of learning algorithms that can easily be used. More...

#include <genericBNLearner.h>


Public Attributes

Signaler3< Size, double, double > onProgress
 Progression, error and time. More...
 
Signaler1< std::string > onStop
 Criteria message (see messageApproximationScheme). More...
 

Public Member Functions

void __setAprioriWeight (double weight)
 sets the apriori weight More...
 
void setMandatoryArcs (const ArcSet &set)
 assign a set of mandatory arcs More...
 
Constructors / Destructors
 genericBNLearner (const std::string &filename, const std::vector< std::string > &missing_symbols)
 default constructor More...
 
 genericBNLearner (const DatabaseTable<> &db)
 default constructor More...
 
template<typename GUM_SCALAR >
 genericBNLearner (const std::string &filename, const gum::BayesNet< GUM_SCALAR > &src, const std::vector< std::string > &missing_symbols)
 read the database file for the score / parameter estimation and var names More...
 
 genericBNLearner (const genericBNLearner &)
 copy constructor More...
 
 genericBNLearner (genericBNLearner &&)
 move constructor More...
 
virtual ~genericBNLearner ()
 destructor More...
 
Operators
genericBNLearner & operator= (const genericBNLearner &)
 copy operator More...
 
genericBNLearner & operator= (genericBNLearner &&)
 move operator More...
 
Accessors / Modifiers
DAG learnDAG ()
 learn a structure from a file (must have read the db before) More...
 
MixedGraph learnMixedStructure ()
 learn a partial structure from a file (must have read the db before and must have selected miic or 3off2) More...
 
void setInitialDAG (const DAG &)
 sets an initial DAG structure More...
 
const std::vector< std::string > & names () const
 returns the names of the variables in the database More...
 
const std::vector< std::size_t > & domainSizes () const
 returns the domain sizes of the variables in the database More...
 
NodeId idFromName (const std::string &var_name) const
 returns the node id corresponding to a variable name More...
 
const DatabaseTable<> & database () const
 returns the database used by the BNLearner More...
 
void setDatabaseWeight (const double new_weight)
 assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight More...
 
void setRecordWeight (const std::size_t i, const double weight)
 sets the weight of the ith record of the database More...
 
double recordWeight (const std::size_t i) const
 returns the weight of the ith record More...
 
double databaseWeight () const
 returns the weight of the whole database More...
 
const std::string & nameFromId (NodeId id) const
 returns the variable name corresponding to a given node id More...
 
template<template< typename > class XALLOC>
void useDatabaseRanges (const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &new_ranges)
 use a new set of database rows' ranges to perform learning More...
 
void clearDatabaseRanges ()
 reset the ranges to the one range corresponding to the whole database More...
 
const std::vector< std::pair< std::size_t, std::size_t > > & databaseRanges () const
 returns the current database rows' ranges used for learning More...
 
std::pair< std::size_t, std::size_t > useCrossValidationFold (const std::size_t learning_fold, const std::size_t k_fold)
 sets the ranges of rows to be used for cross-validation learning More...
 
std::pair< double, double > chi2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair of the chi2 test in the database. More...
 
std::pair< double, double > chi2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair of the chi2 test in the database, using variable names. More...
 
std::pair< double, double > G2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair of the G2 test in the database. More...
 
std::pair< double, double > G2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair of the G2 test in the database, using variable names. More...
 
double logLikelihood (const std::vector< NodeId > &vars, const std::vector< NodeId > &knowing={})
 Return the loglikelihood of vars in the base, conditioned by knowing for the BNLearner. More...
 
double logLikelihood (const std::vector< std::string > &vars, const std::vector< std::string > &knowing={})
 Return the loglikelihood of vars in the base, conditioned by knowing for the BNLearner. More...
 
Size nbCols () const
 
Size nbRows () const
 
void useEM (const double epsilon)
 use the EM algorithm to learn parameters More...
 
bool hasMissingValues () const
 returns true if the learner's database has missing values More...
 
Score selection
void useScoreAIC ()
 indicate that we wish to use an AIC score More...
 
void useScoreBD ()
 indicate that we wish to use a BD score More...
 
void useScoreBDeu ()
 indicate that we wish to use a BDeu score More...
 
void useScoreBIC ()
 indicate that we wish to use a BIC score More...
 
void useScoreK2 ()
 indicate that we wish to use a K2 score More...
 
void useScoreLog2Likelihood ()
 indicate that we wish to use a Log2Likelihood score More...
 
A priori selection / parameterization
void useNoApriori ()
 use no apriori More...
 
void useAprioriBDeu (double weight=1)
 use the BDeu apriori More...
 
void useAprioriSmoothing (double weight=1)
 use the apriori smoothing More...
 
void useAprioriDirichlet (const std::string &filename, double weight=1)
 use the Dirichlet apriori More...
 
std::string checkScoreAprioriCompatibility ()
 checks whether the current score and apriori are compatible More...
 
Learning algorithm selection
void useGreedyHillClimbing ()
 indicate that we wish to use a greedy hill climbing algorithm More...
 
void useLocalSearchWithTabuList (Size tabu_size=100, Size nb_decrease=2)
 indicate that we wish to use a local search with tabu list More...
 
void useK2 (const Sequence< NodeId > &order)
 indicate that we wish to use K2 More...
 
void useK2 (const std::vector< NodeId > &order)
 indicate that we wish to use K2 More...
 
void use3off2 ()
 indicate that we wish to use 3off2 More...
 
void useMIIC ()
 indicate that we wish to use MIIC More...
 
3off2/MIIC parameterization and specific results
void useNML ()
 indicate that we wish to use the NML correction for 3off2 More...
 
void useMDL ()
 indicate that we wish to use the MDL correction for 3off2 More...
 
void useNoCorr ()
 indicate that we wish to use the NoCorr correction for 3off2 More...
 
const std::vector< Arc > latentVariables () const
 get the list of arcs hiding latent variables More...
 
Accessors / Modifiers for adding constraints on learning
void setMaxIndegree (Size max_indegree)
 sets the max indegree More...
 
void setSliceOrder (const NodeProperty< NodeId > &slice_order)
 sets a partial order on the nodes More...
 
void setSliceOrder (const std::vector< std::vector< std::string > > &slices)
 sets a partial order on the nodes More...
 
void setForbiddenArcs (const ArcSet &set)
 assign a set of forbidden arcs More...
 
assign a new forbidden arc
void addForbiddenArc (const Arc &arc)
 
void addForbiddenArc (const NodeId tail, const NodeId head)
 
void addForbiddenArc (const std::string &tail, const std::string &head)
 
assign a new mandatory arc
void addMandatoryArc (const Arc &arc)
 
void addMandatoryArc (const NodeId tail, const NodeId head)
 
void addMandatoryArc (const std::string &tail, const std::string &head)
 
remove a forbidden arc
void eraseForbiddenArc (const Arc &arc)
 
void eraseForbiddenArc (const NodeId tail, const NodeId head)
 
void eraseForbiddenArc (const std::string &tail, const std::string &head)
 
remove a mandatory arc
void eraseMandatoryArc (const Arc &arc)
 
void eraseMandatoryArc (const NodeId tail, const NodeId head)
 
void eraseMandatoryArc (const std::string &tail, const std::string &head)
 
void setPossibleEdges (const EdgeSet &set)
 assign the set of possible edges More...
 
void setPossibleSkeleton (const UndiGraph &skeleton)
 assign a possible skeleton (its edges become the possible edges) More...
 
assign a new possible edge
Warning
Once at least one possible edge has been defined, every edge not explicitly added as possible is forbidden
void addPossibleEdge (const Edge &edge)
 
void addPossibleEdge (const NodeId tail, const NodeId head)
 
void addPossibleEdge (const std::string &tail, const std::string &head)
 
remove a possible edge
void erasePossibleEdge (const Edge &edge)
 
void erasePossibleEdge (const NodeId tail, const NodeId head)
 
void erasePossibleEdge (const std::string &tail, const std::string &head)
 
redistribute signals and implementation of the interface
INLINE void setCurrentApproximationScheme (const ApproximationScheme *approximationScheme)
 distribute signals More...
 
INLINE void distributeProgress (const ApproximationScheme *approximationScheme, Size pourcent, double error, double time)
 distribute signals More...
 
INLINE void distributeStop (const ApproximationScheme *approximationScheme, std::string message)
 distribute signals More...
 
void setEpsilon (double eps)
 Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled, it will be enabled. More...
 
double epsilon () const
 Get the value of epsilon. More...
 
void disableEpsilon ()
 Disable stopping criterion on epsilon. More...
 
void enableEpsilon ()
 Enable stopping criterion on epsilon. More...
 
bool isEnabledEpsilon () const
 
void setMinEpsilonRate (double rate)
 Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled, it will be enabled. More...
 
double minEpsilonRate () const
 Get the value of the minimal epsilon rate. More...
 
void disableMinEpsilonRate ()
 Disable stopping criterion on epsilon rate. More...
 
void enableMinEpsilonRate ()
 Enable stopping criterion on epsilon rate. More...
 
bool isEnabledMinEpsilonRate () const
 
void setMaxIter (Size max)
 stopping criterion on the number of iterations. If the criterion was disabled, it will be enabled. More...
 
Size maxIter () const
 
void disableMaxIter ()
 Disable stopping criterion on max iterations. More...
 
void enableMaxIter ()
 Enable stopping criterion on max iterations. More...
 
bool isEnabledMaxIter () const
 
void setMaxTime (double timeout)
 stopping criterion on timeout. If the criterion was disabled, it will be enabled. More...
 
double maxTime () const
 returns the timeout (in seconds) More...
 
double currentTime () const
 get the current running time in seconds (double) More...
 
void disableMaxTime ()
 Disable stopping criterion on timeout. More...
 
void enableMaxTime ()
 Enable stopping criterion on timeout. More...
 
bool isEnabledMaxTime () const
 
void setPeriodSize (Size p)
 number of samples between two tests of the stopping criteria More...
 
Size periodSize () const
 number of samples between two tests of the stopping criteria More...
 
void setVerbosity (bool v)
 verbosity More...
 
bool verbosity () const
 verbosity More...
 
ApproximationSchemeSTATE stateApproximationScheme () const
 Returns the current state of the approximation scheme. More...
 
Size nbrIterations () const
 
const std::vector< double > & history () const
 
Getters and setters
std::string messageApproximationScheme () const
 Returns the approximation scheme message. More...
 

Public Types

enum  ScoreType {
  ScoreType::AIC, ScoreType::BD, ScoreType::BDeu, ScoreType::BIC,
  ScoreType::K2, ScoreType::LOG2LIKELIHOOD
}
 an enumeration enabling to select easily the score we wish to use More...
 
enum  ParamEstimatorType { ParamEstimatorType::ML }
 an enumeration to select the type of parameter estimation we shall apply More...
 
enum  AprioriType { AprioriType::NO_APRIORI, AprioriType::SMOOTHING, AprioriType::DIRICHLET_FROM_DATABASE, AprioriType::BDEU }
 an enumeration to select the apriori More...
 
enum  AlgoType { AlgoType::K2, AlgoType::GREEDY_HILL_CLIMBING, AlgoType::LOCAL_SEARCH_WITH_TABU_LIST, AlgoType::MIIC_THREE_OFF_TWO }
 an enumeration to select easily the learning algorithm to use More...
 
enum  ApproximationSchemeSTATE : char {
  ApproximationSchemeSTATE::Undefined, ApproximationSchemeSTATE::Continue, ApproximationSchemeSTATE::Epsilon, ApproximationSchemeSTATE::Rate,
  ApproximationSchemeSTATE::Limit, ApproximationSchemeSTATE::TimeLimit, ApproximationSchemeSTATE::Stopped
}
 The different state of an approximation scheme. More...
 

Protected Attributes

ScoreType __score_type {ScoreType::BDeu}
 the score selected for learning More...
 
Score<>* __score {nullptr}
 the score used More...
 
ParamEstimatorType __param_estimator_type {ParamEstimatorType::ML}
 the type of the parameter estimator More...
 
double __EMepsilon {0.0}
 epsilon for EM (if epsilon = 0.0, EM is not used) More...
 
CorrectedMutualInformation<>* __mutual_info {nullptr}
 the selected correction for 3off2 and miic More...
 
AprioriType __apriori_type {AprioriType::NO_APRIORI}
 the a priori selected for the score and parameters More...
 
Apriori<>* __apriori {nullptr}
 the apriori used More...
 
AprioriNoApriori<>* __no_apriori {nullptr}
 
double __apriori_weight {1.0f}
 the weight of the apriori More...
 
StructuralConstraintSliceOrder __constraint_SliceOrder
 the constraint for 2TBNs More...
 
StructuralConstraintIndegree __constraint_Indegree
 the constraint for indegrees More...
 
StructuralConstraintTabuList __constraint_TabuList
 the constraint for tabu lists More...
 
StructuralConstraintForbiddenArcs __constraint_ForbiddenArcs
 the constraint on forbidden arcs More...
 
StructuralConstraintPossibleEdges __constraint_PossibleEdges
 the constraint on possible Edges More...
 
StructuralConstraintMandatoryArcs __constraint_MandatoryArcs
 the constraint on mandatory arcs More...
 
AlgoType __selected_algo {AlgoType::GREEDY_HILL_CLIMBING}
 the selected learning algorithm More...
 
K2 __K2
 the K2 algorithm More...
 
Miic __miic_3off2
 the 3off2 algorithm More...
 
CorrectedMutualInformation<>::KModeTypes __3off2_kmode
 the penalty used in 3off2 More...
 
DAG2BNLearner __Dag2BN
 the parametric EM More...
 
GreedyHillClimbing __greedy_hill_climbing
 the greedy hill climbing algorithm More...
 
LocalSearchWithTabuList __local_search_with_tabu_list
 the local search with tabu list algorithm More...
 
Database __score_database
 the database to be used by the scores and parameter estimators More...
 
std::vector< std::pair< std::size_t, std::size_t > > __ranges
 the set of rows' ranges within the database in which learning is done More...
 
Database* __apriori_database {nullptr}
 the database used by the Dirichlet a priori More...
 
std::string __apriori_dbname
 the filename for the Dirichlet a priori, if any More...
 
DAG __initial_dag
 an initial DAG given to learners More...
 
const ApproximationScheme* __current_algorithm {nullptr}
 

Protected Member Functions

void __createApriori ()
 create the apriori used for learning More...
 
void __createScore ()
 create the score used for learning More...
 
ParamEstimator<>* __createParamEstimator (DBRowGeneratorParser<> &parser, bool take_into_account_score=true)
 create the parameter estimator used for learning More...
 
DAG __learnDAG ()
 returns the DAG learnt More...
 
MixedGraph __prepare_miic_3off2 ()
 prepares the initial graph for 3off2 or miic More...
 
const std::string & __getAprioriType () const
 returns the type (as a string) of a given apriori More...
 
void __createCorrectedMutualInformation ()
 create the Corrected Mutual Information instance for Miic/3off2 More...
 

Static Protected Member Functions

static DatabaseTable<> __readFile (const std::string &filename, const std::vector< std::string > &missing_symbols)
 reads a file and returns a databaseVectInRam More...
 
static void __checkFileName (const std::string &filename)
 checks whether the extension of a CSV filename is correct More...
 

Classes

class  Database
 a helper to easily read databases More...
 

Detailed Description

A pack of learning algorithms that can easily be used.

The pack currently contains K2, GreedyHillClimbing, and LocalSearchWithTabuList, as well as 3off2/MIIC.

Definition at line 107 of file genericBNLearner.h.

Member Enumeration Documentation

◆ AlgoType

an enumeration to select easily the learning algorithm to use

Enumerator
K2 
GREEDY_HILL_CLIMBING 
LOCAL_SEARCH_WITH_TABU_LIST 
MIIC_THREE_OFF_TWO 

Definition at line 126 of file genericBNLearner.h.

{
    K2,
    GREEDY_HILL_CLIMBING,
    LOCAL_SEARCH_WITH_TABU_LIST,
    MIIC_THREE_OFF_TWO
};

◆ ApproximationSchemeSTATE

The different state of an approximation scheme.

Enumerator
Undefined 
Continue 
Epsilon 
Rate 
Limit 
TimeLimit 
Stopped 

Definition at line 65 of file IApproximationSchemeConfiguration.h.

: char {
    Undefined,
    Continue,
    Epsilon,
    Rate,
    Limit,
    TimeLimit,
    Stopped
};

◆ AprioriType

an enumeration to select the apriori

Enumerator
NO_APRIORI 
SMOOTHING 
DIRICHLET_FROM_DATABASE 
BDEU 

Definition at line 118 of file genericBNLearner.h.

{
    NO_APRIORI,
    SMOOTHING,
    DIRICHLET_FROM_DATABASE,
    BDEU
};

◆ ParamEstimatorType

an enumeration to select the type of parameter estimation we shall apply

Enumerator
ML 

Definition at line 115 of file genericBNLearner.h.

{ ML };

◆ ScoreType

an enumeration enabling to select easily the score we wish to use

Enumerator
AIC 
BD 
BDeu 
BIC 
K2 
LOG2LIKELIHOOD 

Definition at line 111 of file genericBNLearner.h.

{ AIC, BD, BDeu, BIC, K2, LOG2LIKELIHOOD };

Constructor & Destructor Documentation

◆ genericBNLearner() [1/5]

gum::learning::genericBNLearner::genericBNLearner ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols 
)

default constructor

read the database file for the score / parameter estimation and var names

Definition at line 191 of file genericBNLearner.cpp.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

  : __score_database(filename, missing_symbols) {
  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());

  // for debugging purposes
  GUM_CONSTRUCTOR(genericBNLearner);
}

◆ genericBNLearner() [2/5]

gum::learning::genericBNLearner::genericBNLearner ( const DatabaseTable<> &  db)

default constructor

read the database file for the score / parameter estimation and var names

Definition at line 202 of file genericBNLearner.cpp.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

  : __score_database(db) {
  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());

  // for debugging purposes
  GUM_CONSTRUCTOR(genericBNLearner);
}

◆ genericBNLearner() [3/5]

template<typename GUM_SCALAR >
gum::learning::genericBNLearner::genericBNLearner ( const std::string &  filename,
const gum::BayesNet< GUM_SCALAR > &  src,
const std::vector< std::string > &  missing_symbols 
)

read the database file for the score / parameter estimation and var names

Parameters
filename: The file to learn from.
modalities: indicate for some nodes (not necessarily all the nodes of the BN) which modalities they should have and in which order these modalities should be stored into the nodes. For instance, if modalities = { 1 -> {True, False, Big} }, then the node of id 1 in the BN will have 3 modalities, the first one being True, the second one being False, and the third being Big.
parse_database: if true, the modalities specified by the user will be considered as a superset of the modalities of the variables. A parsing of the database will then determine which ones are really necessary and will keep them in the order specified by the user (NodeProperty modalities). If parse_database is set to false (the default), then the modalities specified by the user will be considered as being exactly those of the variables of the BN (as a consequence, if other values are found in the database, an exception will be raised during learning).

Definition at line 89 of file genericBNLearner_tpl.h.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

  : __score_database(filename, bn, missing_symbols) {
  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());
  GUM_CONSTRUCTOR(genericBNLearner);
}

◆ genericBNLearner() [4/5]

gum::learning::genericBNLearner::genericBNLearner ( const genericBNLearner &  from)

copy constructor

Definition at line 211 of file genericBNLearner.cpp.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

  : __score_type(from.__score_type),
    __param_estimator_type(from.__param_estimator_type),
    __EMepsilon(from.__EMepsilon), __apriori_type(from.__apriori_type),
    __apriori_weight(from.__apriori_weight),
    __constraint_SliceOrder(from.__constraint_SliceOrder),
    __constraint_Indegree(from.__constraint_Indegree),
    __constraint_TabuList(from.__constraint_TabuList),
    __constraint_ForbiddenArcs(from.__constraint_ForbiddenArcs),
    __constraint_MandatoryArcs(from.__constraint_MandatoryArcs),
    __selected_algo(from.__selected_algo), __K2(from.__K2),
    __miic_3off2(from.__miic_3off2), __3off2_kmode(from.__3off2_kmode),
    __greedy_hill_climbing(from.__greedy_hill_climbing),
    __local_search_with_tabu_list(from.__local_search_with_tabu_list),
    __score_database(from.__score_database), __ranges(from.__ranges),
    __apriori_dbname(from.__apriori_dbname),
    __initial_dag(from.__initial_dag) {
  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());

  // for debugging purposes
  GUM_CONS_CPY(genericBNLearner);
}

◆ genericBNLearner() [5/5]

gum::learning::genericBNLearner::genericBNLearner ( genericBNLearner &&  from)

move constructor

Definition at line 234 of file genericBNLearner.cpp.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

  : __score_type(from.__score_type),
    __param_estimator_type(from.__param_estimator_type),
    __EMepsilon(from.__EMepsilon), __apriori_type(from.__apriori_type),
    __apriori_weight(from.__apriori_weight),
    __constraint_SliceOrder(std::move(from.__constraint_SliceOrder)),
    __constraint_Indegree(std::move(from.__constraint_Indegree)),
    __constraint_TabuList(std::move(from.__constraint_TabuList)),
    __constraint_ForbiddenArcs(std::move(from.__constraint_ForbiddenArcs)),
    __constraint_MandatoryArcs(std::move(from.__constraint_MandatoryArcs)),
    __selected_algo(from.__selected_algo), __K2(std::move(from.__K2)),
    __miic_3off2(std::move(from.__miic_3off2)),
    __3off2_kmode(from.__3off2_kmode),
    __greedy_hill_climbing(std::move(from.__greedy_hill_climbing)),
    __local_search_with_tabu_list(
       std::move(from.__local_search_with_tabu_list)),
    __score_database(std::move(from.__score_database)),
    __ranges(std::move(from.__ranges)),
    __apriori_dbname(std::move(from.__apriori_dbname)),
    __initial_dag(std::move(from.__initial_dag)) {
  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());

  // for debugging purposes
  GUM_CONS_MOV(genericBNLearner);
}

◆ ~genericBNLearner()

gum::learning::genericBNLearner::~genericBNLearner ( )
virtual

destructor

Definition at line 260 of file genericBNLearner.cpp.

References __apriori, __apriori_database, __mutual_info, __no_apriori, and __score.

{
  if (__score) delete __score;

  if (__apriori) delete __apriori;

  if (__no_apriori) delete __no_apriori;

  if (__apriori_database) delete __apriori_database;

  if (__mutual_info) delete __mutual_info;

  GUM_DESTRUCTOR(genericBNLearner);
}

Member Function Documentation

◆ __checkFileName()

void gum::learning::genericBNLearner::__checkFileName ( const std::string &  filename)
staticprotected

checks whether the extension of a CSV filename is correct

Definition at line 411 of file genericBNLearner.cpp.

References GUM_ERROR.

Referenced by __readFile(), and gum::learning::genericBNLearner::Database::Database().

{
  // get the extension of the file
  Size filename_size = Size(filename.size());

  if (filename_size < 4) {
    GUM_ERROR(FormatNotFound,
              "genericBNLearner could not determine the "
              "file type of the database");
  }

  std::string extension = filename.substr(filename.size() - 4);
  std::transform(
     extension.begin(), extension.end(), extension.begin(), ::tolower);

  if (extension != ".csv") {
    GUM_ERROR(
       OperationNotAllowed,
       "genericBNLearner does not support yet this type of database file");
  }
}

◆ __createApriori()

void gum::learning::genericBNLearner::__createApriori ( )
protected

create the apriori used for learning

Definition at line 460 of file genericBNLearner.cpp.

References __apriori, __apriori_database, __apriori_dbname, __apriori_type, __apriori_weight, __score_database, BDEU, gum::learning::genericBNLearner::Database::databaseTable(), DIRICHLET_FROM_DATABASE, GUM_ERROR, gum::learning::genericBNLearner::Database::missingSymbols(), NO_APRIORI, gum::learning::genericBNLearner::Database::nodeId2Columns(), gum::learning::genericBNLearner::Database::parser(), gum::learning::Apriori< ALLOC >::setWeight(), and SMOOTHING.

Referenced by chi2(), G2(), learnDAG(), and logLikelihood().

{
  // first, save the old apriori, to be deleted if everything is ok
  Apriori<>* old_apriori = __apriori;

  // create the new apriori
  switch (__apriori_type) {
    case AprioriType::NO_APRIORI:
      __apriori = new AprioriNoApriori<>(__score_database.databaseTable(),
                                         __score_database.nodeId2Columns());
      break;

    case AprioriType::SMOOTHING:
      __apriori = new AprioriSmoothing<>(__score_database.databaseTable(),
                                         __score_database.nodeId2Columns());
      break;

    case AprioriType::DIRICHLET_FROM_DATABASE:
      if (__apriori_database != nullptr) {
        delete __apriori_database;
        __apriori_database = nullptr;
      }

      __apriori_database = new Database(__apriori_dbname,
                                        __score_database,
                                        __score_database.missingSymbols());

      __apriori = new AprioriDirichletFromDatabase<>(
         __score_database.databaseTable(),
         __apriori_database->parser(),
         __apriori_database->nodeId2Columns());
      break;

    case AprioriType::BDEU:
      __apriori = new AprioriBDeu<>(__score_database.databaseTable(),
                                    __score_database.nodeId2Columns());
      break;

    default:
      GUM_ERROR(OperationNotAllowed,
                "The BNLearner does not support yet this apriori");
  }

  // do not forget to assign a weight to the apriori
  __apriori->setWeight(__apriori_weight);

  // remove the old apriori, if any
  if (old_apriori != nullptr) delete old_apriori;
}

◆ __createCorrectedMutualInformation()

void gum::learning::genericBNLearner::__createCorrectedMutualInformation ( )
protected

create the Corrected Mutual Information instance for Miic/3off2

Definition at line 661 of file genericBNLearner.cpp.

References __3off2_kmode, __mutual_info, __no_apriori, __ranges, __score_database, GUM_ERROR, gum::learning::genericBNLearner::Database::nodeId2Columns(), gum::learning::genericBNLearner::Database::parser(), gum::learning::CorrectedMutualInformation< ALLOC >::useMDL(), gum::learning::CorrectedMutualInformation< ALLOC >::useNML(), and gum::learning::CorrectedMutualInformation< ALLOC >::useNoCorr().

Referenced by __prepare_miic_3off2().

  {
    if (__mutual_info != nullptr) delete __mutual_info;

    __mutual_info =
       new CorrectedMutualInformation<>(__score_database.parser(),
                                        *__no_apriori,
                                        __ranges,
                                        __score_database.nodeId2Columns());
    switch (__3off2_kmode) {
      case CorrectedMutualInformation<>::KModeTypes::MDL:
        __mutual_info->useMDL();
        break;

      case CorrectedMutualInformation<>::KModeTypes::NML:
        __mutual_info->useNML();
        break;

      case CorrectedMutualInformation<>::KModeTypes::NoCorr:
        __mutual_info->useNoCorr();
        break;

      default:
        GUM_ERROR(NotImplementedYet,
                  "The BNLearner's corrected mutual information class does "
                  << "not support yet penalty mode " << int(__3off2_kmode));
    }
  }

◆ __createParamEstimator()

ParamEstimator * gum::learning::genericBNLearner::__createParamEstimator ( DBRowGeneratorParser<> &  parser,
bool  take_into_account_score = true 
)
protected

create the parameter estimator used for learning

Definition at line 567 of file genericBNLearner.cpp.

References __apriori, __no_apriori, __param_estimator_type, __ranges, __score, __score_database, GUM_ERROR, gum::learning::Score< ALLOC >::internalApriori(), ML, gum::learning::genericBNLearner::Database::nodeId2Columns(), and gum::learning::ParamEstimator< ALLOC >::setRanges().

  {
    ParamEstimator<>* param_estimator = nullptr;

    // create the new estimator
    switch (__param_estimator_type) {
      case ParamEstimatorType::ML:
        if (take_into_account_score && (__score != nullptr)) {
          param_estimator =
             new ParamEstimatorML<>(parser,
                                    *__apriori,
                                    __score->internalApriori(),
                                    __ranges,
                                    __score_database.nodeId2Columns());
        } else {
          param_estimator =
             new ParamEstimatorML<>(parser,
                                    *__apriori,
                                    *__no_apriori,
                                    __ranges,
                                    __score_database.nodeId2Columns());
        }

        break;

      default:
        GUM_ERROR(OperationNotAllowed,
                  "genericBNLearner does not support "
                  << "yet this parameter estimator");
    }

    // assign the set of ranges
    param_estimator->setRanges(__ranges);

    return param_estimator;
  }

◆ __createScore()

void gum::learning::genericBNLearner::__createScore ( )
protected

create the score used for learning

Definition at line 509 of file genericBNLearner.cpp.

References __apriori, __ranges, __score, __score_database, __score_type, AIC, BD, BDeu, BIC, GUM_ERROR, K2, LOG2LIKELIHOOD, gum::learning::genericBNLearner::Database::nodeId2Columns(), and gum::learning::genericBNLearner::Database::parser().

Referenced by learnDAG().

  {
    // first, save the old score, to be deleted if everything is ok
    Score<>* old_score = __score;

    // create the new scoring function
    switch (__score_type) {
      case ScoreType::AIC:
        __score = new ScoreAIC<>(__score_database.parser(),
                                 *__apriori,
                                 __ranges,
                                 __score_database.nodeId2Columns());
        break;

      case ScoreType::BD:
        __score = new ScoreBD<>(__score_database.parser(),
                                *__apriori,
                                __ranges,
                                __score_database.nodeId2Columns());
        break;

      case ScoreType::BDeu:
        __score = new ScoreBDeu<>(__score_database.parser(),
                                  *__apriori,
                                  __ranges,
                                  __score_database.nodeId2Columns());
        break;

      case ScoreType::BIC:
        __score = new ScoreBIC<>(__score_database.parser(),
                                 *__apriori,
                                 __ranges,
                                 __score_database.nodeId2Columns());
        break;

      case ScoreType::K2:
        __score = new ScoreK2<>(__score_database.parser(),
                                *__apriori,
                                __ranges,
                                __score_database.nodeId2Columns());
        break;

      case ScoreType::LOG2LIKELIHOOD:
        __score = new ScoreLog2Likelihood<>(__score_database.parser(),
                                            *__apriori,
                                            __ranges,
                                            __score_database.nodeId2Columns());
        break;

      default:
        GUM_ERROR(OperationNotAllowed,
                  "genericBNLearner does not support yet this score");
    }

    // remove the old score, if any
    if (old_score != nullptr) delete old_score;
  }

◆ __getAprioriType()

INLINE const std::string & gum::learning::genericBNLearner::__getAprioriType ( ) const
protected

returns the type (as a string) of a given apriori

Definition at line 507 of file genericBNLearner_inl.h.

References __apriori_type, BDEU, DIRICHLET_FROM_DATABASE, GUM_ERROR, NO_APRIORI, and SMOOTHING.

Referenced by checkScoreAprioriCompatibility().

  {
    switch (__apriori_type) {
      case AprioriType::NO_APRIORI: return AprioriNoAprioriType::type;

      case AprioriType::SMOOTHING: return AprioriSmoothingType::type;

      case AprioriType::DIRICHLET_FROM_DATABASE:
        return AprioriDirichletType::type;

      case AprioriType::BDEU: return AprioriBDeuType::type;

      default:
        GUM_ERROR(OperationNotAllowed,
                  "genericBNLearner getAprioriType does "
                  "not support yet this apriori");
    }
  }

◆ __learnDAG()

DAG gum::learning::genericBNLearner::__learnDAG ( )
protected

returns the DAG learnt

Definition at line 689 of file genericBNLearner.cpp.

References __apriori_database, __apriori_type, __constraint_ForbiddenArcs, __constraint_Indegree, __constraint_MandatoryArcs, __constraint_PossibleEdges, __constraint_SliceOrder, __constraint_TabuList, __greedy_hill_climbing, __initial_dag, __K2, __local_search_with_tabu_list, __miic_3off2, __mutual_info, __prepare_miic_3off2(), __score, __score_database, __selected_algo, gum::DAG::addArc(), gum::NodeGraphPart::addNodeWithId(), gum::learning::K2::approximationScheme(), gum::learning::StructuralConstraintForbiddenArcs::arcs(), gum::learning::StructuralConstraintMandatoryArcs::arcs(), gum::learning::genericBNLearner::Database::databaseTable(), DIRICHLET_FROM_DATABASE, gum::ArcGraphPart::eraseArc(), gum::NodeGraphPart::exists(), GREEDY_HILL_CLIMBING, GUM_ERROR, gum::learning::IDatabaseTable< T_DATA, ALLOC >::hasMissingValues(), K2, gum::learning::K2::learnStructure(), gum::learning::GreedyHillClimbing::learnStructure(), gum::learning::LocalSearchWithTabuList::learnStructure(), gum::learning::Miic::learnStructure(), LOCAL_SEARCH_WITH_TABU_LIST, MIIC_THREE_OFF_TWO, gum::learning::K2::order(), and gum::SequenceImplementation< Key, Alloc, Gen >::pos().

Referenced by learnDAG().

  {
    // check that the database does not contain any missing value
    if (__score_database.databaseTable().hasMissingValues()
        || ((__apriori_database != nullptr)
            && (__apriori_type == AprioriType::DIRICHLET_FROM_DATABASE)
            && __apriori_database->databaseTable().hasMissingValues())) {
      GUM_ERROR(MissingValueInDatabase,
                "For the moment, the BNLearner is unable to cope "
                "with missing values in databases");
    }

    // add the mandatory arcs to the initial dag and remove the forbidden ones
    // from the initial graph
    DAG init_graph = __initial_dag;

    const ArcSet& mandatory_arcs = __constraint_MandatoryArcs.arcs();

    for (const auto& arc : mandatory_arcs) {
      if (!init_graph.exists(arc.tail())) init_graph.addNodeWithId(arc.tail());

      if (!init_graph.exists(arc.head())) init_graph.addNodeWithId(arc.head());

      init_graph.addArc(arc.tail(), arc.head());
    }

    const ArcSet& forbidden_arcs = __constraint_ForbiddenArcs.arcs();

    for (const auto& arc : forbidden_arcs) {
      init_graph.eraseArc(arc);
    }

    switch (__selected_algo) {
      // ======================================================================
      case AlgoType::MIIC_THREE_OFF_TWO: {
        BNLearnerListener listener(this, __miic_3off2);
        // create the mixedGraph and the corrected mutual information
        MixedGraph mgraph = this->__prepare_miic_3off2();

        return __miic_3off2.learnStructure(*__mutual_info, mgraph);
      }

      // ======================================================================
      case AlgoType::GREEDY_HILL_CLIMBING: {
        BNLearnerListener listener(this, __greedy_hill_climbing);
        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                       StructuralConstraintForbiddenArcs,
                                       StructuralConstraintPossibleEdges,
                                       StructuralConstraintSliceOrder >
           gen_constraint;
        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
           __constraint_MandatoryArcs;
        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
           __constraint_ForbiddenArcs;
        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint) =
           __constraint_PossibleEdges;
        static_cast< StructuralConstraintSliceOrder& >(gen_constraint) =
           __constraint_SliceOrder;

        GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(
           gen_constraint);

        StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                       StructuralConstraintDAG >
           sel_constraint;
        static_cast< StructuralConstraintIndegree& >(sel_constraint) =
           __constraint_Indegree;

        GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                      decltype(op_set) >
           selector(*__score, sel_constraint, op_set);

        return __greedy_hill_climbing.learnStructure(selector, init_graph);
      }

      // ======================================================================
      case AlgoType::LOCAL_SEARCH_WITH_TABU_LIST: {
        BNLearnerListener listener(this, __local_search_with_tabu_list);
        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                       StructuralConstraintForbiddenArcs,
                                       StructuralConstraintPossibleEdges,
                                       StructuralConstraintSliceOrder >
           gen_constraint;
        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
           __constraint_MandatoryArcs;
        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
           __constraint_ForbiddenArcs;
        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint) =
           __constraint_PossibleEdges;
        static_cast< StructuralConstraintSliceOrder& >(gen_constraint) =
           __constraint_SliceOrder;

        GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(
           gen_constraint);

        StructuralConstraintSetStatic< StructuralConstraintTabuList,
                                       StructuralConstraintIndegree,
                                       StructuralConstraintDAG >
           sel_constraint;
        static_cast< StructuralConstraintTabuList& >(sel_constraint) =
           __constraint_TabuList;
        static_cast< StructuralConstraintIndegree& >(sel_constraint) =
           __constraint_Indegree;

        GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                      decltype(op_set) >
           selector(*__score, sel_constraint, op_set);

        return __local_search_with_tabu_list.learnStructure(selector,
                                                            init_graph);
      }

      // ======================================================================
      case AlgoType::K2: {
        BNLearnerListener listener(this, __K2.approximationScheme());
        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                       StructuralConstraintForbiddenArcs,
                                       StructuralConstraintPossibleEdges >
           gen_constraint;
        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
           __constraint_MandatoryArcs;
        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
           __constraint_ForbiddenArcs;
        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint) =
           __constraint_PossibleEdges;

        GraphChangesGenerator4K2< decltype(gen_constraint) > op_set(
           gen_constraint);

        // if some mandatory arcs are incompatible with the order, use a DAG
        // constraint instead of a DiGraph constraint to avoid cycles
        const ArcSet& mandatory_arcs =
           static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
              .arcs();
        const Sequence< NodeId >& order = __K2.order();
        bool order_compatible = true;

        for (const auto& arc : mandatory_arcs) {
          if (order.pos(arc.tail()) >= order.pos(arc.head())) {
            order_compatible = false;
            break;
          }
        }

        if (order_compatible) {
          StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                         StructuralConstraintDiGraph >
             sel_constraint;
          static_cast< StructuralConstraintIndegree& >(sel_constraint) =
             __constraint_Indegree;

          GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                        decltype(op_set) >
             selector(*__score, sel_constraint, op_set);

          return __K2.learnStructure(selector, init_graph);
        } else {
          StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                         StructuralConstraintDAG >
             sel_constraint;
          static_cast< StructuralConstraintIndegree& >(sel_constraint) =
             __constraint_Indegree;

          GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                        decltype(op_set) >
             selector(*__score, sel_constraint, op_set);

          return __K2.learnStructure(selector, init_graph);
        }
      }

      // ======================================================================
      default:
        GUM_ERROR(OperationNotAllowed,
                  "the learnDAG method has not been implemented for this "
                  "learning algorithm");
    }
  }

◆ __prepare_miic_3off2()

MixedGraph gum::learning::genericBNLearner::__prepare_miic_3off2 ( )
protected

prepares the initial graph for 3off2 or miic

Definition at line 605 of file genericBNLearner.cpp.

References __constraint_ForbiddenArcs, __constraint_MandatoryArcs, __createCorrectedMutualInformation(), __miic_3off2, __score_database, gum::learning::Miic::addConstraints(), gum::UndiGraph::addEdge(), gum::NodeGraphPart::addNodeWithId(), gum::learning::StructuralConstraintMandatoryArcs::arcs(), gum::learning::StructuralConstraintForbiddenArcs::arcs(), gum::learning::genericBNLearner::Database::databaseTable(), gum::HashTable< Key, Val, Alloc >::insert(), and gum::learning::IDatabaseTable< T_DATA, ALLOC >::nbVariables().

Referenced by __learnDAG(), and learnMixedStructure().

  {
    // initialize the mixed graph to the fully connected graph
    MixedGraph mgraph;
    for (Size i = 0; i < __score_database.databaseTable().nbVariables(); ++i) {
      mgraph.addNodeWithId(i);
      for (Size j = 0; j < i; ++j) {
        mgraph.addEdge(j, i);
      }
    }

    // translate the constraints for 3off2 or miic
    HashTable< std::pair< NodeId, NodeId >, char > initial_marks;
    const ArcSet& mandatory_arcs = __constraint_MandatoryArcs.arcs();
    for (const auto& arc : mandatory_arcs) {
      initial_marks.insert({arc.tail(), arc.head()}, '>');
    }

    const ArcSet& forbidden_arcs = __constraint_ForbiddenArcs.arcs();
    for (const auto& arc : forbidden_arcs) {
      initial_marks.insert({arc.tail(), arc.head()}, '-');
    }
    __miic_3off2.addConstraints(initial_marks);

    // create the mutual entropy object
    // if (__mutual_info == nullptr) { this->useNML(); }
    __createCorrectedMutualInformation();

    return mgraph;
  }

◆ __readFile()

DatabaseTable gum::learning::genericBNLearner::__readFile ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols 
)
staticprotected

reads a file and returns a databaseVectInRam

Definition at line 433 of file genericBNLearner.cpp.

References __checkFileName(), database(), gum::learning::IDBInitializer< ALLOC >::fillDatabase(), gum::learning::DBTranslatorSet< ALLOC >::insertTranslator(), gum::learning::DatabaseTable< ALLOC >::reorder(), gum::learning::DatabaseTable< ALLOC >::setVariableNames(), and gum::learning::IDBInitializer< ALLOC >::variableNames().

  {
    // get the extension of the file
    __checkFileName(filename);

    DBInitializerFromCSV<> initializer(filename);

    const auto& var_names = initializer.variableNames();
    const std::size_t nb_vars = var_names.size();

    DBTranslatorSet<> translator_set;
    DBTranslator4LabelizedVariable<> translator(missing_symbols);
    for (std::size_t i = 0; i < nb_vars; ++i) {
      translator_set.insertTranslator(translator, i);
    }

    DatabaseTable<> database(missing_symbols, translator_set);
    database.setVariableNames(initializer.variableNames());
    initializer.fillDatabase(database);

    database.reorder();

    return database;
  }

◆ __setAprioriWeight()

INLINE void gum::learning::genericBNLearner::__setAprioriWeight ( double  weight)

sets the apriori weight

Definition at line 451 of file genericBNLearner_inl.h.

References __apriori_weight, checkScoreAprioriCompatibility(), GUM_ERROR, and gum::learning::genericBNLearner::Database::weight().

Referenced by useAprioriBDeu(), useAprioriDirichlet(), and useAprioriSmoothing().

  {
    if (weight < 0) {
      GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
    }

    __apriori_weight = weight;
    checkScoreAprioriCompatibility();
  }

◆ addForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const Arc arc)

Definition at line 359 of file genericBNLearner_inl.h.

References __constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::addArc().

Referenced by addForbiddenArc().

  {
    __constraint_ForbiddenArcs.addArc(arc);
  }

◆ addForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const NodeId  tail,
const NodeId  head 
)

Definition at line 369 of file genericBNLearner_inl.h.

References addForbiddenArc().

  {
    addForbiddenArc(Arc(tail, head));
  }

◆ addForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const std::string &  tail,
const std::string &  head 
)

Definition at line 381 of file genericBNLearner_inl.h.

References addForbiddenArc(), and gum::learning::genericBNLearner::Database::idFromName().

  {
    addForbiddenArc(Arc(idFromName(tail), idFromName(head)));
  }

◆ addMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const Arc arc)

Definition at line 398 of file genericBNLearner_inl.h.

References __constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::addArc().

Referenced by addMandatoryArc().

  {
    __constraint_MandatoryArcs.addArc(arc);
  }

◆ addMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const NodeId  tail,
const NodeId  head 
)

Definition at line 420 of file genericBNLearner_inl.h.

References addMandatoryArc().

  {
    addMandatoryArc(Arc(tail, head));
  }

◆ addMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const std::string &  tail,
const std::string &  head 
)

Definition at line 408 of file genericBNLearner_inl.h.

References addMandatoryArc(), and gum::learning::genericBNLearner::Database::idFromName().

  {
    addMandatoryArc(Arc(idFromName(tail), idFromName(head)));
  }

◆ addPossibleEdge() [1/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const Edge edge)

Definition at line 320 of file genericBNLearner_inl.h.

References __constraint_PossibleEdges, and gum::learning::StructuralConstraintPossibleEdges::addEdge().

Referenced by addPossibleEdge().

  {
    __constraint_PossibleEdges.addEdge(edge);
  }

◆ addPossibleEdge() [2/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const NodeId  tail,
const NodeId  head 
)

Definition at line 330 of file genericBNLearner_inl.h.

References addPossibleEdge().

  {
    addPossibleEdge(Edge(tail, head));
  }

◆ addPossibleEdge() [3/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const std::string &  tail,
const std::string &  head 
)

Definition at line 342 of file genericBNLearner_inl.h.

References addPossibleEdge(), and gum::learning::genericBNLearner::Database::idFromName().

  {
    addPossibleEdge(Edge(idFromName(tail), idFromName(head)));
  }

◆ checkScoreAprioriCompatibility()

std::string gum::learning::genericBNLearner::checkScoreAprioriCompatibility ( )

checks whether the current score and apriori are compatible

Returns
an empty string if the apriori is compatible with the score, else a non-empty message explaining the incompatibility.

Definition at line 866 of file genericBNLearner.cpp.

References __apriori_weight, __getAprioriType(), __score_type, AIC, BD, BDeu, BIC, gum::learning::ScoreAIC< ALLOC >::isAprioriCompatible(), gum::learning::ScoreBIC< ALLOC >::isAprioriCompatible(), gum::learning::ScoreLog2Likelihood< ALLOC >::isAprioriCompatible(), gum::learning::ScoreBDeu< ALLOC >::isAprioriCompatible(), gum::learning::ScoreK2< ALLOC >::isAprioriCompatible(), gum::learning::ScoreBD< ALLOC >::isAprioriCompatible(), K2, and LOG2LIKELIHOOD.

Referenced by __setAprioriWeight(), useAprioriBDeu(), useAprioriDirichlet(), useAprioriSmoothing(), useNoApriori(), useScoreAIC(), useScoreBD(), useScoreBDeu(), useScoreBIC(), useScoreK2(), and useScoreLog2Likelihood().

  {
    const std::string& apriori = __getAprioriType();

    switch (__score_type) {
      case ScoreType::AIC:
        return ScoreAIC<>::isAprioriCompatible(apriori, __apriori_weight);

      case ScoreType::BD:
        return ScoreBD<>::isAprioriCompatible(apriori, __apriori_weight);

      case ScoreType::BDeu:
        return ScoreBDeu<>::isAprioriCompatible(apriori, __apriori_weight);

      case ScoreType::BIC:
        return ScoreBIC<>::isAprioriCompatible(apriori, __apriori_weight);

      case ScoreType::K2:
        return ScoreK2<>::isAprioriCompatible(apriori, __apriori_weight);

      case ScoreType::LOG2LIKELIHOOD:
        return ScoreLog2Likelihood<>::isAprioriCompatible(apriori,
                                                          __apriori_weight);

      default: return "genericBNLearner does not support yet this score";
    }
  }

◆ chi2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)

Return the <statistic,pvalue> pair of a chi2 independence test in the database.

Parameters
    id1      first variable
    id2      second variable
    knowing  list of observed variables

Returns
    a std::pair<double, double>

Definition at line 941 of file genericBNLearner.cpp.

References __apriori, __createApriori(), __score_database, databaseRanges(), gum::learning::genericBNLearner::Database::parser(), and gum::learning::IndepTestChi2< ALLOC >::statistics().

Referenced by chi2().

  {
    __createApriori();
    IndepTestChi2<> chi2score(
       __score_database.parser(), *__apriori, databaseRanges());

    return chi2score.statistics(id1, id2, knowing);
  }

◆ chi2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)

Return the <statistic,pvalue> pair of a chi2 independence test in the database, with the variables designated by name.

Parameters
    name1    first variable
    name2    second variable
    knowing  list of observed variables

Returns
    a std::pair<double, double>

Definition at line 951 of file genericBNLearner.cpp.

References chi2(), and idFromName().

{
  std::vector< NodeId > knowingIds;
  std::transform(
     knowing.begin(),
     knowing.end(),
     std::back_inserter(knowingIds),
     [this](const std::string& c) -> NodeId { return this->idFromName(c); });
  return chi2(idFromName(name1), idFromName(name2), knowingIds);
}

◆ clearDatabaseRanges()

INLINE void gum::learning::genericBNLearner::clearDatabaseRanges ( )

reset the ranges to the one range corresponding to the whole database

Definition at line 543 of file genericBNLearner_inl.h.

References __ranges.

{ __ranges.clear(); }

◆ currentTime()

double gum::learning::genericBNLearner::currentTime ( ) const
inlinevirtual

get the current running time in seconds (double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1035 of file genericBNLearner.h.

References __current_algorithm, gum::ApproximationScheme::currentTime(), and GUM_ERROR.

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->currentTime();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ database()

INLINE const DatabaseTable & gum::learning::genericBNLearner::database ( ) const

returns the database used by the BNLearner

Definition at line 546 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

Referenced by __readFile(), and gum::learning::readFile().

{
  return __score_database.databaseTable();
}

◆ databaseRanges()

INLINE const std::vector< std::pair< std::size_t, std::size_t > > & gum::learning::genericBNLearner::databaseRanges ( ) const

returns the current database rows' ranges used for learning

Returns
The method returns a vector of pairs [Xi,Yi) of row indices in the database. Learning is performed on this set of rows.
Warning
an empty set of ranges means the whole database.

Definition at line 538 of file genericBNLearner_inl.h.

References __ranges.

Referenced by chi2(), G2(), and logLikelihood().

{
  return __ranges;
}

◆ databaseWeight()

INLINE double gum::learning::genericBNLearner::databaseWeight ( ) const

returns the weight of the whole database

Definition at line 173 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::weight().

{
  return __score_database.weight();
}

◆ disableEpsilon()

void gum::learning::genericBNLearner::disableEpsilon ( )
inlinevirtual

Disable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 900 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableEpsilon().

{
  __K2.approximationScheme().disableEpsilon();
  __greedy_hill_climbing.approximationScheme().disableEpsilon();
  __local_search_with_tabu_list.approximationScheme().disableEpsilon();
  __Dag2BN.approximationScheme().disableEpsilon();
};

◆ disableMaxIter()

void gum::learning::genericBNLearner::disableMaxIter ( )
inlinevirtual

Disable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 990 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMaxIter().

{
  __K2.approximationScheme().disableMaxIter();
  __greedy_hill_climbing.approximationScheme().disableMaxIter();
  __local_search_with_tabu_list.approximationScheme().disableMaxIter();
  __Dag2BN.approximationScheme().disableMaxIter();
};

◆ disableMaxTime()

void gum::learning::genericBNLearner::disableMaxTime ( )
inlinevirtual

Disable stopping criterion on timeout.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1043 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMaxTime().

{
  __K2.approximationScheme().disableMaxTime();
  __greedy_hill_climbing.approximationScheme().disableMaxTime();
  __local_search_with_tabu_list.approximationScheme().disableMaxTime();
  __Dag2BN.approximationScheme().disableMaxTime();
};

◆ disableMinEpsilonRate()

void gum::learning::genericBNLearner::disableMinEpsilonRate ( )
inlinevirtual

Disable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 946 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMinEpsilonRate().

{
  __K2.approximationScheme().disableMinEpsilonRate();
  __greedy_hill_climbing.approximationScheme().disableMinEpsilonRate();
  __local_search_with_tabu_list.approximationScheme().disableMinEpsilonRate();
  __Dag2BN.approximationScheme().disableMinEpsilonRate();
};

◆ distributeProgress()

INLINE void gum::learning::genericBNLearner::distributeProgress ( const ApproximationScheme approximationScheme,
Size  pourcent,
double  error,
double  time 
)
inline

distribute signals (progression, error and time)

Definition at line 862 of file genericBNLearner.h.

References GUM_EMIT3, gum::IApproximationSchemeConfiguration::onProgress, and setCurrentApproximationScheme().

Referenced by gum::learning::BNLearnerListener::whenProgress().

{
  setCurrentApproximationScheme(approximationScheme);

  if (onProgress.hasListener()) GUM_EMIT3(onProgress, pourcent, error, time);
};

◆ distributeStop()

INLINE void gum::learning::genericBNLearner::distributeStop ( const ApproximationScheme approximationScheme,
std::string  message 
)
inline

distribute signals

Definition at line 872 of file genericBNLearner.h.

References GUM_EMIT1, gum::IApproximationSchemeConfiguration::onStop, and setCurrentApproximationScheme().

Referenced by gum::learning::BNLearnerListener::whenStop().

{
  setCurrentApproximationScheme(approximationScheme);

  if (onStop.hasListener()) GUM_EMIT1(onStop, message);
};

◆ domainSizes()

INLINE const std::vector< std::size_t > & gum::learning::genericBNLearner::domainSizes ( ) const

returns the domain sizes of the variables in the database

Definition at line 532 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::domainSizes().

{
  return __score_database.domainSizes();
}

◆ enableEpsilon()

void gum::learning::genericBNLearner::enableEpsilon ( )
inlinevirtual

Enable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 908 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableEpsilon().

{
  __K2.approximationScheme().enableEpsilon();
  __greedy_hill_climbing.approximationScheme().enableEpsilon();
  __local_search_with_tabu_list.approximationScheme().enableEpsilon();
  __Dag2BN.approximationScheme().enableEpsilon();
};

◆ enableMaxIter()

void gum::learning::genericBNLearner::enableMaxIter ( )
inlinevirtual

Enable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 997 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMaxIter().

{
  __K2.approximationScheme().enableMaxIter();
  __greedy_hill_climbing.approximationScheme().enableMaxIter();
  __local_search_with_tabu_list.approximationScheme().enableMaxIter();
  __Dag2BN.approximationScheme().enableMaxIter();
};

◆ enableMaxTime()

void gum::learning::genericBNLearner::enableMaxTime ( )
inlinevirtual

Enable stopping criterion on timeout. If the criterion was disabled, it will be enabled.

Exceptions
    OutOfLowerBound  if timeout <= 0.0. The timeout is a time in seconds (double).

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1049 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMaxTime().

{
  __K2.approximationScheme().enableMaxTime();
  __greedy_hill_climbing.approximationScheme().enableMaxTime();
  __local_search_with_tabu_list.approximationScheme().enableMaxTime();
  __Dag2BN.approximationScheme().enableMaxTime();
};

◆ enableMinEpsilonRate()

void gum::learning::genericBNLearner::enableMinEpsilonRate ( )
inlinevirtual

Enable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 953 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMinEpsilonRate().

{
  __K2.approximationScheme().enableMinEpsilonRate();
  __greedy_hill_climbing.approximationScheme().enableMinEpsilonRate();
  __local_search_with_tabu_list.approximationScheme().enableMinEpsilonRate();
  __Dag2BN.approximationScheme().enableMinEpsilonRate();
};

◆ epsilon()

double gum::learning::genericBNLearner::epsilon ( ) const
inlinevirtual

Get the value of epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 892 of file genericBNLearner.h.

References __current_algorithm, gum::ApproximationScheme::epsilon(), and GUM_ERROR.

Referenced by useEM().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->epsilon();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ eraseForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const Arc arc)

Definition at line 364 of file genericBNLearner_inl.h.

References __constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::eraseArc().

Referenced by eraseForbiddenArc().

{
  __constraint_ForbiddenArcs.eraseArc(arc);
}

◆ eraseForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const NodeId  tail,
const NodeId  head 
)

Definition at line 375 of file genericBNLearner_inl.h.

References eraseForbiddenArc().

{
  eraseForbiddenArc(Arc(tail, head));
}

◆ eraseForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const std::string &  tail,
const std::string &  head 
)

Definition at line 387 of file genericBNLearner_inl.h.

References eraseForbiddenArc(), and gum::learning::genericBNLearner::Database::idFromName().

{
  eraseForbiddenArc(Arc(idFromName(tail), idFromName(head)));
}

◆ eraseMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const Arc arc)

Definition at line 403 of file genericBNLearner_inl.h.

References __constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::eraseArc().

Referenced by eraseMandatoryArc().

{
  __constraint_MandatoryArcs.eraseArc(arc);
}

◆ eraseMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const NodeId  tail,
const NodeId  head 
)

Definition at line 426 of file genericBNLearner_inl.h.

References eraseMandatoryArc().

{
  eraseMandatoryArc(Arc(tail, head));
}

◆ eraseMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const std::string &  tail,
const std::string &  head 
)

Definition at line 414 of file genericBNLearner_inl.h.

References eraseMandatoryArc(), and gum::learning::genericBNLearner::Database::idFromName().

{
  eraseMandatoryArc(Arc(idFromName(tail), idFromName(head)));
}

◆ erasePossibleEdge() [1/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const Edge edge)

Definition at line 325 of file genericBNLearner_inl.h.

References __constraint_PossibleEdges, and gum::learning::StructuralConstraintPossibleEdges::eraseEdge().

Referenced by erasePossibleEdge().

{
  __constraint_PossibleEdges.eraseEdge(edge);
}

◆ erasePossibleEdge() [2/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const NodeId  tail,
const NodeId  head 
)

Definition at line 336 of file genericBNLearner_inl.h.

References erasePossibleEdge().

{
  erasePossibleEdge(Edge(tail, head));
}

◆ erasePossibleEdge() [3/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const std::string &  tail,
const std::string &  head 
)

Definition at line 348 of file genericBNLearner_inl.h.

References erasePossibleEdge(), and gum::learning::genericBNLearner::Database::idFromName().

{
  erasePossibleEdge(Edge(idFromName(tail), idFromName(head)));
}

◆ G2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::G2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)

Return the <statistic,pvalue> pair for the G2 test in the database.

Parameters
    id1      first variable
    id2      second variable
    knowing  list of observed variables
Returns
    a std::pair<double,double>

Definition at line 963 of file genericBNLearner.cpp.

References __apriori, __createApriori(), __score_database, databaseRanges(), gum::learning::genericBNLearner::Database::parser(), and gum::learning::IndepTestG2< ALLOC >::statistics().

Referenced by G2().

{
  __createApriori();
  IndepTestG2<> g2score(
     __score_database.parser(), *__apriori, databaseRanges());
  return g2score.statistics(id1, id2, knowing);
}

◆ G2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::G2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)

Return the <statistic,pvalue> pair for the G2 test in the database, using variable names.

Parameters
    name1    first variable
    name2    second variable
    knowing  list of observed variables
Returns
    a std::pair<double,double>

Definition at line 972 of file genericBNLearner.cpp.

References G2(), and idFromName().

{
  std::vector< NodeId > knowingIds;
  std::transform(
     knowing.begin(),
     knowing.end(),
     std::back_inserter(knowingIds),
     [this](const std::string& c) -> NodeId { return this->idFromName(c); });
  return G2(idFromName(name1), idFromName(name2), knowingIds);
}

◆ hasMissingValues()

INLINE bool gum::learning::genericBNLearner::hasMissingValues ( ) const

returns true if the learner's database has missing values

Definition at line 306 of file genericBNLearner_inl.h.

References __score_database, gum::learning::genericBNLearner::Database::databaseTable(), and gum::learning::IDatabaseTable< T_DATA, ALLOC >::hasMissingValues().

{
  return __score_database.databaseTable().hasMissingValues();
}

◆ history()

const std::vector< double >& gum::learning::genericBNLearner::history ( ) const
inlinevirtual
Exceptions
    OperationNotAllowed  if the scheme has not been performed or verbosity = false

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1119 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::history().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->history();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ idFromName()

INLINE NodeId gum::learning::genericBNLearner::idFromName ( const std::string &  var_name) const

returns the node id corresponding to a variable name

Exceptions
    MissingVariableInDatabase  if a variable of the BN is not found in the database.

Definition at line 147 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::idFromName().

Referenced by chi2(), G2(), and logLikelihood().

{
  return __score_database.idFromName(var_name);
}

◆ isEnabledEpsilon()

bool gum::learning::genericBNLearner::isEnabledEpsilon ( ) const
inlinevirtual
Returns
true if stopping criterion on epsilon is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 917 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledEpsilon().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledEpsilon();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ isEnabledMaxIter()

bool gum::learning::genericBNLearner::isEnabledMaxIter ( ) const
inlinevirtual
Returns
true if stopping criterion on max iterations is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1005 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMaxIter().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledMaxIter();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ isEnabledMaxTime()

bool gum::learning::genericBNLearner::isEnabledMaxTime ( ) const
inlinevirtual
Returns
true if stopping criterion on timeout is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1057 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMaxTime().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledMaxTime();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ isEnabledMinEpsilonRate()

bool gum::learning::genericBNLearner::isEnabledMinEpsilonRate ( ) const
inlinevirtual
Returns
true if stopping criterion on epsilon rate is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 961 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMinEpsilonRate().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledMinEpsilonRate();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ latentVariables()

INLINE const std::vector< Arc > gum::learning::genericBNLearner::latentVariables ( ) const

get the list of arcs hiding latent variables

Exceptions
    OperationNotAllowed  when 3off2 is not the selected algorithm

Definition at line 266 of file genericBNLearner_inl.h.

References __miic_3off2, __selected_algo, GUM_ERROR, gum::learning::Miic::latentVariables(), and MIIC_THREE_OFF_TWO.

{
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed,
              "You must use the 3off2 algorithm before selecting "
              << "the latentVariables method");
  }
  return __miic_3off2.latentVariables();
}

◆ learnDAG()

DAG gum::learning::genericBNLearner::learnDAG ( )

learn a structure from a file (must have read the db before)

Definition at line 653 of file genericBNLearner.cpp.

References __createApriori(), __createScore(), and __learnDAG().

{
  // create the score and the apriori
  __createApriori();
  __createScore();

  return __learnDAG();
}

◆ learnMixedStructure()

MixedGraph gum::learning::genericBNLearner::learnMixedStructure ( )

learn a partial structure from a file (must have read the db before and must have selected miic or 3off2)

Definition at line 635 of file genericBNLearner.cpp.

References __miic_3off2, __mutual_info, __prepare_miic_3off2(), __score_database, __selected_algo, gum::learning::genericBNLearner::Database::databaseTable(), GUM_ERROR, gum::learning::IDatabaseTable< T_DATA, ALLOC >::hasMissingValues(), gum::learning::Miic::learnMixedStructure(), and MIIC_THREE_OFF_TWO.

{
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed, "Must be using the miic/3off2 algorithm");
  }
  // check that the database does not contain any missing value
  if (__score_database.databaseTable().hasMissingValues()) {
    GUM_ERROR(MissingValueInDatabase,
              "For the moment, the BNLearner is unable to learn "
              << "structures with missing values in databases");
  }
  BNLearnerListener listener(this, __miic_3off2);

  // create the mixed graph
  MixedGraph mgraph = this->__prepare_miic_3off2();

  return __miic_3off2.learnMixedStructure(*__mutual_info, mgraph);
}

◆ logLikelihood() [1/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< NodeId > &  vars,
const std::vector< NodeId > &  knowing = {} 
)

Return the log-likelihood of vars in the database, conditioned on knowing.

Parameters
    vars     a vector of NodeIds
    knowing  an optional vector of conditioning NodeIds
Returns
    a double

Definition at line 984 of file genericBNLearner.cpp.

References __apriori, __createApriori(), __score_database, databaseRanges(), gum::learning::genericBNLearner::Database::parser(), and gum::learning::ScoreLog2Likelihood< ALLOC >::score().

Referenced by logLikelihood().

{
  __createApriori();
  ScoreLog2Likelihood<> ll2score(
     __score_database.parser(), *__apriori, databaseRanges());

  std::vector< NodeId > total(vars);
  total.insert(total.end(), knowing.begin(), knowing.end());
  double LLtotal = ll2score.score(IdSet<>(total, false, true));
  if (knowing.size() == (Size)0) {
    return LLtotal;
  } else {
    double LLknw = ll2score.score(IdSet<>(knowing, false, true));
    return LLtotal - LLknw;
  }
}

◆ logLikelihood() [2/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< std::string > &  vars,
const std::vector< std::string > &  knowing = {} 
)

Return the log-likelihood of vars in the database, conditioned on knowing, using variable names.

Parameters
    vars     a vector of variable names
    knowing  an optional vector of conditioning variable names
Returns
    a double

Definition at line 1002 of file genericBNLearner.cpp.

References idFromName(), and logLikelihood().

{
  std::vector< NodeId > ids;
  std::vector< NodeId > knowingIds;

  auto mapper = [this](const std::string& c) -> NodeId {
    return this->idFromName(c);
  };

  std::transform(vars.begin(), vars.end(), std::back_inserter(ids), mapper);
  std::transform(
     knowing.begin(), knowing.end(), std::back_inserter(knowingIds), mapper);

  return logLikelihood(ids, knowingIds);
}
double logLikelihood(const std::vector< NodeId > &vars, const std::vector< NodeId > &knowing={})
Return the loglikelihood of vars in the base, conditioned by knowing for the BNLearner.
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name
Size NodeId
Type for node ids.
Definition: graphElements.h:98

◆ maxIter()

Size gum::learning::genericBNLearner::maxIter ( ) const
inlinevirtual
Returns
the criterion on number of iterations

Implements gum::IApproximationSchemeConfiguration.

Definition at line 982 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::maxIter().

982  {
983  if (__current_algorithm != nullptr)
984  return __current_algorithm->maxIter();
985  else
986  GUM_ERROR(FatalError, "No chosen algorithm for learning");
987  };
const ApproximationScheme * __current_algorithm
Size maxIter() const
Returns the criterion on number of iterations.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ maxTime()

double gum::learning::genericBNLearner::maxTime ( ) const
inlinevirtual

returns the timeout (in seconds)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1027 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::maxTime().

1027  {
1028  if (__current_algorithm != nullptr)
1029  return __current_algorithm->maxTime();
1030  else
1031  GUM_ERROR(FatalError, "No chosen algorithm for learning");
1032  };
double maxTime() const
Returns the timeout (in seconds).
const ApproximationScheme * __current_algorithm
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ messageApproximationScheme()

INLINE std::string gum::IApproximationSchemeConfiguration::messageApproximationScheme ( ) const
inherited

Returns the approximation scheme message.

Returns
Returns the approximation scheme message.

Definition at line 40 of file IApproximationSchemeConfiguration_inl.h.

References gum::IApproximationSchemeConfiguration::Continue, gum::IApproximationSchemeConfiguration::Epsilon, gum::IApproximationSchemeConfiguration::epsilon(), gum::IApproximationSchemeConfiguration::Limit, gum::IApproximationSchemeConfiguration::maxIter(), gum::IApproximationSchemeConfiguration::maxTime(), gum::IApproximationSchemeConfiguration::minEpsilonRate(), gum::IApproximationSchemeConfiguration::Rate, gum::IApproximationSchemeConfiguration::stateApproximationScheme(), gum::IApproximationSchemeConfiguration::Stopped, gum::IApproximationSchemeConfiguration::TimeLimit, and gum::IApproximationSchemeConfiguration::Undefined.

Referenced by gum::ApproximationScheme::_stopScheme(), gum::ApproximationScheme::continueApproximationScheme(), and gum::credal::InferenceEngine< GUM_SCALAR >::getApproximationSchemeMsg().

40  {
41  std::stringstream s;
42 
43  switch (stateApproximationScheme()) {
44  case ApproximationSchemeSTATE::Continue: s << "in progress"; break;
45 
46  case ApproximationSchemeSTATE::Epsilon:
47  s << "stopped with epsilon=" << epsilon();
48  break;
49 
50  case ApproximationSchemeSTATE::Rate:
51  s << "stopped with rate=" << minEpsilonRate();
52  break;
53 
54  case ApproximationSchemeSTATE::Limit:
55  s << "stopped with max iteration=" << maxIter();
56  break;
57 
58  case ApproximationSchemeSTATE::TimeLimit:
59  s << "stopped with timeout=" << maxTime();
60  break;
61 
62  case ApproximationSchemeSTATE::Stopped: s << "stopped on request"; break;
63 
64  case ApproximationSchemeSTATE::Undefined: s << "undefined state"; break;
65  };
66 
67  return s.str();
68  }
virtual double epsilon() const =0
Returns the value of epsilon.
virtual ApproximationSchemeSTATE stateApproximationScheme() const =0
Returns the approximation scheme state.
virtual double maxTime() const =0
Returns the timeout (in seconds).
virtual Size maxIter() const =0
Returns the criterion on number of iterations.
virtual double minEpsilonRate() const =0
Returns the value of the minimal epsilon rate.

◆ minEpsilonRate()

double gum::learning::genericBNLearner::minEpsilonRate ( ) const
inlinevirtual

Get the value of the minimal epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 938 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::minEpsilonRate().

938  {
939  if (__current_algorithm != nullptr)
940  return __current_algorithm->minEpsilonRate();
941  else
942  GUM_ERROR(FatalError, "No chosen algorithm for learning");
943  };
double minEpsilonRate() const
Returns the value of the minimal epsilon rate.
const ApproximationScheme * __current_algorithm
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ nameFromId()

INLINE const std::string & gum::learning::genericBNLearner::nameFromId ( NodeId  id) const

returns the variable name corresponding to a given node id

Definition at line 152 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::nameFromId().

152  {
153  return __score_database.nameFromId(id);
154  }
Database __score_database
the database to be used by the scores and parameter estimators
const std::string & nameFromId(NodeId id) const
returns the variable name corresponding to a given node id

◆ names()

INLINE const std::vector< std::string > & gum::learning::genericBNLearner::names ( ) const

returns the names of the variables in the database

Definition at line 526 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::names().

526  {
527  return __score_database.names();
528  }
Database __score_database
the database to be used by the scores and parameter estimators
const std::vector< std::string > & names() const
returns the names of the variables in the database

◆ nbCols()

INLINE Size gum::learning::genericBNLearner::nbCols ( ) const
Returns
the number of cols in the database

Definition at line 550 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::domainSizes().

550  {
551  return __score_database.domainSizes().size();
552  }
Database __score_database
the database to be used by the scores and parameter estimators
const std::vector< std::size_t > & domainSizes() const
returns the domain sizes of the variables

◆ nbrIterations()

Size gum::learning::genericBNLearner::nbrIterations ( ) const
inlinevirtual
Exceptions
OperationNotAllowed: if the scheme has not been performed

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1111 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::nbrIterations().

1111  {
1112  if (__current_algorithm != nullptr)
1113  return __current_algorithm->nbrIterations();
1114  else
1115  GUM_ERROR(FatalError, "No chosen algorithm for learning");
1116  };
Size nbrIterations() const
Returns the number of iterations.
const ApproximationScheme * __current_algorithm
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ nbRows()

INLINE Size gum::learning::genericBNLearner::nbRows ( ) const
Returns
the number of rows in the database

Definition at line 554 of file genericBNLearner_inl.h.

References __score_database, gum::learning::genericBNLearner::Database::databaseTable(), and gum::learning::IDatabaseTable< T_DATA, ALLOC >::size().

554  {
555  return __score_database.databaseTable().size();
556  }
Database __score_database
the database to be used by the scores and parameter estimators
std::size_t size() const noexcept
returns the number of records (rows) in the database
const DatabaseTable & databaseTable() const
returns the internal database table

◆ operator=() [1/2]

genericBNLearner & gum::learning::genericBNLearner::operator= ( const genericBNLearner from)

copy operator

Definition at line 274 of file genericBNLearner.cpp.

References __3off2_kmode, __apriori, __apriori_database, __apriori_dbname, __apriori_type, __apriori_weight, __constraint_ForbiddenArcs, __constraint_Indegree, __constraint_MandatoryArcs, __constraint_SliceOrder, __constraint_TabuList, __current_algorithm, __EMepsilon, __greedy_hill_climbing, __initial_dag, __K2, __local_search_with_tabu_list, __miic_3off2, __mutual_info, __param_estimator_type, __ranges, __score, __score_database, __score_type, and __selected_algo.

274  {
275  if (this != &from) {
276  if (__score) {
277  delete __score;
278  __score = nullptr;
279  }
280 
281  if (__apriori) {
282  delete __apriori;
283  __apriori = nullptr;
284  }
285 
286  if (__apriori_database) {
287  delete __apriori_database;
288  __apriori_database = nullptr;
289  }
290 
291  if (__mutual_info) {
292  delete __mutual_info;
293  __mutual_info = nullptr;
294  }
295 
296  __score_type = from.__score_type;
297  __param_estimator_type = from.__param_estimator_type;
298  __EMepsilon = from.__EMepsilon;
299  __apriori_type = from.__apriori_type;
300  __apriori_weight = from.__apriori_weight;
301  __constraint_SliceOrder = from.__constraint_SliceOrder;
302  __constraint_Indegree = from.__constraint_Indegree;
303  __constraint_TabuList = from.__constraint_TabuList;
304  __constraint_ForbiddenArcs = from.__constraint_ForbiddenArcs;
305  __constraint_MandatoryArcs = from.__constraint_MandatoryArcs;
306  __selected_algo = from.__selected_algo;
307  __K2 = from.__K2;
308  __miic_3off2 = from.__miic_3off2;
309  __3off2_kmode = from.__3off2_kmode;
310  __greedy_hill_climbing = from.__greedy_hill_climbing;
311  __local_search_with_tabu_list = from.__local_search_with_tabu_list;
312  __score_database = from.__score_database;
313  __ranges = from.__ranges;
314  __apriori_dbname = from.__apriori_dbname;
315  __initial_dag = from.__initial_dag;
316  __current_algorithm = nullptr;
317  }
318 
319  return *this;
320  }
AlgoType __selected_algo
the selected learning algorithm
Score * __score
the score used
Database __score_database
the database to be used by the scores and parameter estimators
CorrectedMutualInformation ::KModeTypes __3off2_kmode
the penalty used in 3off2
double __EMepsilon
epsilon for EM. If epsilon=0.0, no EM
StructuralConstraintSliceOrder __constraint_SliceOrder
the constraint for 2TBNs
Miic __miic_3off2
the 3off2 algorithm
ParamEstimatorType __param_estimator_type
the type of the parameter estimator
AprioriType __apriori_type
the a priori selected for the score and parameters
CorrectedMutualInformation * __mutual_info
the selected correction for 3off2 and miic
DAG __initial_dag
an initial DAG given to learners
StructuralConstraintMandatoryArcs __constraint_MandatoryArcs
the constraint on mandatory arcs
Database * __apriori_database
the database used by the Dirichlet a priori
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
const ApproximationScheme * __current_algorithm
Apriori * __apriori
the apriori used
StructuralConstraintTabuList __constraint_TabuList
the constraint for tabu lists
std::string __apriori_dbname
the filename for the Dirichlet a priori, if any
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
std::vector< std::pair< std::size_t, std::size_t > > __ranges
the set of rows' ranges within the database in which learning is done
StructuralConstraintIndegree __constraint_Indegree
the constraint for indegrees
ScoreType __score_type
the score selected for learning
double __apriori_weight
the weight of the apriori
StructuralConstraintForbiddenArcs __constraint_ForbiddenArcs
the constraint on forbidden arcs

◆ operator=() [2/2]

genericBNLearner & gum::learning::genericBNLearner::operator= ( genericBNLearner &&  from)

move operator

Definition at line 322 of file genericBNLearner.cpp.

References __3off2_kmode, __apriori, __apriori_database, __apriori_dbname, __apriori_type, __apriori_weight, __constraint_ForbiddenArcs, __constraint_Indegree, __constraint_MandatoryArcs, __constraint_SliceOrder, __constraint_TabuList, __current_algorithm, __EMepsilon, __greedy_hill_climbing, __initial_dag, __K2, __local_search_with_tabu_list, __miic_3off2, __mutual_info, __param_estimator_type, __ranges, __score, __score_database, __score_type, and __selected_algo.

322  {
323  if (this != &from) {
324  if (__score) {
325  delete __score;
326  __score = nullptr;
327  }
328 
329  if (__apriori) {
330  delete __apriori;
331  __apriori = nullptr;
332  }
333 
334  if (__apriori_database) {
335  delete __apriori_database;
336  __apriori_database = nullptr;
337  }
338 
339  if (__mutual_info) {
340  delete __mutual_info;
341  __mutual_info = nullptr;
342  }
343 
344  __score_type = from.__score_type;
345  __param_estimator_type = from.__param_estimator_type;
346  __EMepsilon = from.__EMepsilon;
347  __apriori_type = from.__apriori_type;
348  __apriori_weight = from.__apriori_weight;
349  __constraint_SliceOrder = std::move(from.__constraint_SliceOrder);
350  __constraint_Indegree = std::move(from.__constraint_Indegree);
351  __constraint_TabuList = std::move(from.__constraint_TabuList);
352  __constraint_ForbiddenArcs = std::move(from.__constraint_ForbiddenArcs);
353  __constraint_MandatoryArcs = std::move(from.__constraint_MandatoryArcs);
354  __selected_algo = from.__selected_algo;
355  __K2 = from.__K2;
356  __miic_3off2 = std::move(from.__miic_3off2);
357  __3off2_kmode = from.__3off2_kmode;
358  __greedy_hill_climbing = std::move(from.__greedy_hill_climbing);
359  __local_search_with_tabu_list =
360  std::move(from.__local_search_with_tabu_list);
361  __score_database = std::move(from.__score_database);
362  __ranges = std::move(from.__ranges);
363  __apriori_dbname = std::move(from.__apriori_dbname);
364  __initial_dag = std::move(from.__initial_dag);
365  __current_algorithm = nullptr;
366  }
367 
368  return *this;
369  }
AlgoType __selected_algo
the selected learning algorithm
Score * __score
the score used
Database __score_database
the database to be used by the scores and parameter estimators
CorrectedMutualInformation ::KModeTypes __3off2_kmode
the penalty used in 3off2
double __EMepsilon
epsilon for EM. If epsilon=0.0, no EM
StructuralConstraintSliceOrder __constraint_SliceOrder
the constraint for 2TBNs
Miic __miic_3off2
the 3off2 algorithm
ParamEstimatorType __param_estimator_type
the type of the parameter estimator
AprioriType __apriori_type
the a priori selected for the score and parameters
CorrectedMutualInformation * __mutual_info
the selected correction for 3off2 and miic
DAG __initial_dag
an initial DAG given to learners
StructuralConstraintMandatoryArcs __constraint_MandatoryArcs
the constraint on mandatory arcs
Database * __apriori_database
the database used by the Dirichlet a priori
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
const ApproximationScheme * __current_algorithm
Apriori * __apriori
the apriori used
StructuralConstraintTabuList __constraint_TabuList
the constraint for tabu lists
std::string __apriori_dbname
the filename for the Dirichlet a priori, if any
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
std::vector< std::pair< std::size_t, std::size_t > > __ranges
the set of rows' ranges within the database in which learning is done
StructuralConstraintIndegree __constraint_Indegree
the constraint for indegrees
ScoreType __score_type
the score selected for learning
double __apriori_weight
the weight of the apriori
StructuralConstraintForbiddenArcs __constraint_ForbiddenArcs
the constraint on forbidden arcs

◆ periodSize()

Size gum::learning::genericBNLearner::periodSize ( ) const
inlinevirtual

how many samples between two tests of the stopping criterion

Exceptions
OutOfLowerBound: if p<1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1075 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::periodSize().

1075  {
1076  if (__current_algorithm != nullptr)
1077  return __current_algorithm->periodSize();
1078  else
1079  GUM_ERROR(FatalError, "No chosen algorithm for learning");
1080  };
Size periodSize() const
Returns the period size.
const ApproximationScheme * __current_algorithm
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ recordWeight()

INLINE double gum::learning::genericBNLearner::recordWeight ( const std::size_t  i) const

returns the weight of the ith record

Exceptions
OutOfBounds: if i is outside the set of indices of the records

Definition at line 168 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::weight().

168  {
169  return __score_database.weight(i);
170  }
Database __score_database
the database to be used by the scores and parameter estimators
double weight(const std::size_t i) const
returns the weight of the ith record

◆ setCurrentApproximationScheme()

INLINE void gum::learning::genericBNLearner::setCurrentApproximationScheme ( const ApproximationScheme approximationScheme)
inline

distribute signals

Definition at line 856 of file genericBNLearner.h.

References __current_algorithm.

Referenced by gum::learning::BNLearnerListener::BNLearnerListener(), distributeProgress(), and distributeStop().

857  {
858  __current_algorithm = approximationScheme;
859  }
const ApproximationScheme * __current_algorithm

◆ setDatabaseWeight()

INLINE void gum::learning::genericBNLearner::setDatabaseWeight ( const double  new_weight)

assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight

assign new weight to the rows of the learning database

Definition at line 157 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::setDatabaseWeight().

157  {
158  __score_database.setDatabaseWeight(new_weight);
159  }
Database __score_database
the database to be used by the scores and parameter estimators
void setDatabaseWeight(const double new_weight)
assign a weight to all the rows of the database so that the sum of their weights is equal to new_weig...

◆ setEpsilon()

void gum::learning::genericBNLearner::setEpsilon ( double  eps)
inlinevirtual

Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound: if eps<0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 884 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setEpsilon().

884  {
885  __K2.approximationScheme().setEpsilon(eps);
886  __greedy_hill_climbing.setEpsilon(eps);
887  __local_search_with_tabu_list.setEpsilon(eps);
888  __Dag2BN.setEpsilon(eps);
889  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
DAG2BNLearner __Dag2BN
the parametric EM
void setEpsilon(double eps)
Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|.

◆ setForbiddenArcs()

INLINE void gum::learning::genericBNLearner::setForbiddenArcs ( const ArcSet set)

assign a set of forbidden arcs

Definition at line 354 of file genericBNLearner_inl.h.

References __constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::setArcs().

354  {
355  __constraint_ForbiddenArcs.setArcs(set);
356  }
void setArcs(const ArcSet &set)
assign a set of forbidden arcs
StructuralConstraintForbiddenArcs __constraint_ForbiddenArcs
the constraint on forbidden arcs

◆ setInitialDAG()

INLINE void gum::learning::genericBNLearner::setInitialDAG ( const DAG dag)

sets an initial DAG structure

Definition at line 178 of file genericBNLearner_inl.h.

References __initial_dag.

178  {
179  __initial_dag = dag;
180  }
DAG __initial_dag
an initial DAG given to learners

◆ setMandatoryArcs()

INLINE void gum::learning::genericBNLearner::setMandatoryArcs ( const ArcSet set)

assign a set of mandatory arcs

Definition at line 393 of file genericBNLearner_inl.h.

References __constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::setArcs().

393  {
394  __constraint_MandatoryArcs.setArcs(set);
395  }
void setArcs(const ArcSet &set)
assign a set of mandatory arcs
StructuralConstraintMandatoryArcs __constraint_MandatoryArcs
the constraint on mandatory arcs

◆ setMaxIndegree()

INLINE void gum::learning::genericBNLearner::setMaxIndegree ( Size  max_indegree)

sets the max indegree

Definition at line 219 of file genericBNLearner_inl.h.

References __constraint_Indegree, and gum::learning::StructuralConstraintIndegree::setMaxIndegree().

219  {
220  __constraint_Indegree.setMaxIndegree(max_indegree);
221  }
void setMaxIndegree(Size max_indegree, bool update_all_node=false)
resets the default max indegree and possibly updates the indegree of all nodes
StructuralConstraintIndegree __constraint_Indegree
the constraint for indegrees

◆ setMaxIter()

void gum::learning::genericBNLearner::setMaxIter ( Size  max)
inlinevirtual

stopping criterion on the number of iterations. If the criterion was disabled it will be enabled.

Parameters
max: the maximum number of iterations
Exceptions
OutOfLowerBound: if max<=1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 974 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMaxIter().

974  {
975  __K2.approximationScheme().setMaxIter(max);
976  __greedy_hill_climbing.setMaxIter(max);
977  __local_search_with_tabu_list.setMaxIter(max);
978  __Dag2BN.setMaxIter(max);
979  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
DAG2BNLearner __Dag2BN
the parametric EM
void setMaxIter(Size max)
Stopping criterion on number of iterations.

◆ setMaxTime()

void gum::learning::genericBNLearner::setMaxTime ( double  timeout)
inlinevirtual

stopping criterion on timeout. If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound: if timeout<=0.0; timeout is the time in seconds (double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1019 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMaxTime().

1019  {
1020  __K2.approximationScheme().setMaxTime(timeout);
1021  __greedy_hill_climbing.setMaxTime(timeout);
1022  __local_search_with_tabu_list.setMaxTime(timeout);
1023  __Dag2BN.setMaxTime(timeout);
1024  }
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
void setMaxTime(double timeout)
Stopping criterion on timeout.
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
DAG2BNLearner __Dag2BN
the parametric EM

◆ setMinEpsilonRate()

void gum::learning::genericBNLearner::setMinEpsilonRate ( double  rate)
inlinevirtual

Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound: if rate<0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 930 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMinEpsilonRate().

930  {
931  __K2.approximationScheme().setMinEpsilonRate(rate);
932  __greedy_hill_climbing.setMinEpsilonRate(rate);
933  __local_search_with_tabu_list.setMinEpsilonRate(rate);
934  __Dag2BN.setMinEpsilonRate(rate);
935  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
void setMinEpsilonRate(double rate)
Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|).
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
DAG2BNLearner __Dag2BN
the parametric EM

◆ setPeriodSize()

void gum::learning::genericBNLearner::setPeriodSize ( Size  p)
inlinevirtual

how many samples between two tests of the stopping criterion

Exceptions
OutOfLowerBound: if p<1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1068 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setPeriodSize().

1068  {
1069  __K2.approximationScheme().setPeriodSize(p);
1070  __greedy_hill_climbing.setPeriodSize(p);
1071  __local_search_with_tabu_list.setPeriodSize(p);
1072  __Dag2BN.setPeriodSize(p);
1073  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
void setPeriodSize(Size p)
How many samples between two tests of the stopping criterion.
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
DAG2BNLearner __Dag2BN
the parametric EM

◆ setPossibleEdges()

INLINE void gum::learning::genericBNLearner::setPossibleEdges ( const EdgeSet set)

assign a set of possible edges

Warning
Once at least one possible edge is defined, every edge outside the set is no longer possible

Definition at line 311 of file genericBNLearner_inl.h.

References __constraint_PossibleEdges, and gum::learning::StructuralConstraintPossibleEdges::setEdges().

Referenced by setPossibleSkeleton().

311  {
312  __constraint_PossibleEdges.setEdges(set);
313  }
void setEdges(const EdgeSet &set)
assign a set of possible edges
StructuralConstraintPossibleEdges __constraint_PossibleEdges
the constraint on possible Edges

◆ setPossibleSkeleton()

INLINE void gum::learning::genericBNLearner::setPossibleSkeleton ( const UndiGraph skeleton)

assign a possible skeleton (its edges become the set of possible edges)

Warning
Once at least one possible edge is defined, every edge outside the set is no longer possible

Definition at line 315 of file genericBNLearner_inl.h.

References gum::EdgeGraphPart::edges(), and setPossibleEdges().

315  {
316  setPossibleEdges(g.edges());
317  }
void setPossibleEdges(const EdgeSet &set)
assign a set of forbidden edges

◆ setRecordWeight()

INLINE void gum::learning::genericBNLearner::setRecordWeight ( const std::size_t  i,
const double  weight 
)

sets the weight of the ith record of the database

assign new weight to the ith row of the learning database

Exceptions
OutOfBounds: if i is outside the set of indices of the records or if the weight is negative

Definition at line 162 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::setWeight().

163  {
164  __score_database.setWeight(i, new_weight);
165  }
Database __score_database
the database to be used by the scores and parameter estimators
void setWeight(const std::size_t i, const double weight)
sets the weight of the ith record

◆ setSliceOrder() [1/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const NodeProperty< NodeId > &  slice_order)

sets a partial order on the nodes

Parameters
slice_order: a NodeProperty giving the rank (priority) of each node in the partial order

Definition at line 433 of file genericBNLearner_inl.h.

References __constraint_SliceOrder.

Referenced by setSliceOrder().

433  {
434  __constraint_SliceOrder = StructuralConstraintSliceOrder(slice_order);
435  }
StructuralConstraintSliceOrder __constraint_SliceOrder
the constraint for 2TBNs

◆ setSliceOrder() [2/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const std::vector< std::vector< std::string > > &  slices)

sets a partial order on the nodes

Parameters
slices: the list of lists of variable names

Definition at line 437 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::idFromName(), gum::HashTable< Key, Val, Alloc >::insert(), and setSliceOrder().

438  {
439  NodeProperty< NodeId > slice_order;
440  NodeId rank = 0;
441  for (const auto& slice : slices) {
442  for (const auto& name : slice) {
443  slice_order.insert(idFromName(name), rank);
444  }
445  rank++;
446  }
447  setSliceOrder(slice_order);
448  }
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name
void setSliceOrder(const NodeProperty< NodeId > &slice_order)
sets a partial order on the nodes
Size NodeId
Type for node ids.
Definition: graphElements.h:98

◆ setVerbosity()

void gum::learning::genericBNLearner::setVerbosity ( bool  v)
inlinevirtual

verbosity

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1085 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setVerbosity().

1085  {
1090  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
void setVerbosity(bool v)
Set the verbosity on (true) or off (false).
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
DAG2BNLearner __Dag2BN
the parametric EM

◆ stateApproximationScheme()

ApproximationSchemeSTATE gum::learning::genericBNLearner::stateApproximationScheme ( ) const
inlinevirtual

returns the current state of the approximation scheme

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1103 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::stateApproximationScheme().

1103  {
1104  if (__current_algorithm != nullptr)
1105  return __current_algorithm->stateApproximationScheme();
1106  else
1107  GUM_ERROR(FatalError, "No chosen algorithm for learning");
1108  };
const ApproximationScheme * __current_algorithm
ApproximationSchemeSTATE stateApproximationScheme() const
Returns the approximation scheme state.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ use3off2()

INLINE void gum::learning::genericBNLearner::use3off2 ( )

indicate that we wish to use 3off2

Definition at line 224 of file genericBNLearner_inl.h.

References __miic_3off2, __selected_algo, MIIC_THREE_OFF_TWO, and gum::learning::Miic::set3off2Behaviour().

224  {
225  __selected_algo = AlgoType::MIIC_THREE_OFF_TWO;
226  __miic_3off2.set3off2Behaviour();
227  }
AlgoType __selected_algo
the selected learning algorithm
void set3off2Behaviour()
Sets the orientation phase to follow the one of the 3off2 algorithm.
Definition: Miic.cpp:1065
Miic __miic_3off2
the 3off2 algorithm

◆ useAprioriBDeu()

INLINE void gum::learning::genericBNLearner::useAprioriBDeu ( double  weight = 1)

use the BDeu apriori

The BDeu apriori adds weight to all the cells of the counting tables. In other words, it is as if weight rows with equally probable values had been added to the database.

Definition at line 494 of file genericBNLearner_inl.h.

References __apriori_type, __setAprioriWeight(), BDEU, checkScoreAprioriCompatibility(), and GUM_ERROR.

494  {
495  if (weight < 0) {
496  GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
497  }
498 
499  __apriori_type = AprioriType::BDEU;
500  __setAprioriWeight(weight);
501 
502  checkScoreAprioriCompatibility();
503  }
AprioriType __apriori_type
the a priori selected for the score and parameters
void __setAprioriWeight(double weight)
sets the apriori weight
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ useAprioriDirichlet()

INLINE void gum::learning::genericBNLearner::useAprioriDirichlet ( const std::string &  filename,
double  weight = 1 
)

use the Dirichlet apriori

Definition at line 479 of file genericBNLearner_inl.h.

References __apriori_dbname, __apriori_type, __setAprioriWeight(), checkScoreAprioriCompatibility(), DIRICHLET_FROM_DATABASE, and GUM_ERROR.

480  {
481  if (weight < 0) {
482  GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
483  }
484 
485  __apriori_dbname = filename;
486  __apriori_type = AprioriType::DIRICHLET_FROM_DATABASE;
487  __setAprioriWeight(weight);
488 
489  checkScoreAprioriCompatibility();
490  }
AprioriType __apriori_type
the a priori selected for the score and parameters
std::string __apriori_dbname
the filename for the Dirichlet a priori, if any
void __setAprioriWeight(double weight)
sets the apriori weight
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55
+ Here is the call graph for this function:

◆ useAprioriSmoothing()

INLINE void gum::learning::genericBNLearner::useAprioriSmoothing ( double  weight = 1)

use the apriori smoothing

Parameters
weight: the weight to assign to the smoothing; if none is passed, the current weight of the genericBNLearner is used.

Definition at line 467 of file genericBNLearner_inl.h.

References __apriori_type, __setAprioriWeight(), checkScoreAprioriCompatibility(), GUM_ERROR, and SMOOTHING.

467  {
468  if (weight < 0) {
469  GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
470  }
471 
472  __apriori_type = AprioriType::SMOOTHING;
473  __setAprioriWeight(weight);
474 
475  checkScoreAprioriCompatibility();
476  }
AprioriType __apriori_type
the a priori selected for the score and parameters
void __setAprioriWeight(double weight)
sets the apriori weight
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55
+ Here is the call graph for this function:

◆ useCrossValidationFold()

std::pair< std::size_t, std::size_t > gum::learning::genericBNLearner::useCrossValidationFold ( const std::size_t  learning_fold,
const std::size_t  k_fold 
)

sets the ranges of rows to be used for cross-validation learning

When called on (x,k), the method indicates to the subsequent learnings that they should be performed on the xth fold in a k-fold cross-validation context. For instance, if a database has 1000 rows and we perform a 10-fold cross-validation, then the first learning fold (learning_fold=0) corresponds to the row interval [100,1000) and the test dataset corresponds to [0,100). The second learning fold (learning_fold=1) is [0,100) U [200,1000) and the corresponding test dataset is [100,200).

Parameters
learning_fold: a number indicating the set of rows used for learning. If N denotes the size of the database and k_fold the number of folds in the cross-validation, then the set of rows used for testing is [learning_fold * N / k_fold, (learning_fold+1) * N / k_fold) and the learning database is its complement in the database.
k_fold: the value of "k" in k-fold cross-validation
Returns
a pair [x,y) of indices of the rows in the original database that constitute the test dataset
Exceptions
OutOfBounds: raised if k_fold is equal to 0, if learning_fold is greater than or equal to k_fold, or if k_fold is greater than or equal to the size of the database.

Definition at line 896 of file genericBNLearner.cpp.

References __ranges, __score_database, gum::learning::genericBNLearner::Database::databaseTable(), GUM_ERROR, and gum::learning::IDatabaseTable< T_DATA, ALLOC >::nbRows().

897  {
898  if (k_fold == 0) {
899  GUM_ERROR(OutOfBounds, "K-fold cross validation with k=0 is forbidden");
900  }
901 
902  if (learning_fold >= k_fold) {
903  GUM_ERROR(OutOfBounds,
904  "In " << k_fold << "-fold cross validation, the learning "
905  << "fold should be strictly lower than " << k_fold
906  << " but, here, it is equal to " << learning_fold);
907  }
908 
909  const std::size_t db_size = __score_database.databaseTable().nbRows();
910  if (k_fold >= db_size) {
911  GUM_ERROR(OutOfBounds,
912  "In " << k_fold << "-fold cross validation, the database's "
913  << "size should be strictly greater than " << k_fold
914  << " but, here, the database has only " << db_size
915  << " rows");
916  }
917 
918  // create the ranges of rows of the test database
919  const std::size_t foldSize = db_size / k_fold;
920  const std::size_t unfold_deb = learning_fold * foldSize;
921  const std::size_t unfold_end = unfold_deb + foldSize;
922 
923  __ranges.clear();
924  if (learning_fold == std::size_t(0)) {
925  __ranges.push_back(
926  std::pair< std::size_t, std::size_t >(unfold_end, db_size));
927  } else {
928  __ranges.push_back(
929  std::pair< std::size_t, std::size_t >(std::size_t(0), unfold_deb));
930 
931  if (learning_fold != k_fold - 1) {
932  __ranges.push_back(
933  std::pair< std::size_t, std::size_t >(unfold_end, db_size));
934  }
935  }
936 
937  return std::pair< std::size_t, std::size_t >(unfold_deb, unfold_end);
938  }
Database __score_database
the database to be used by the scores and parameter estimators
std::size_t nbRows() const noexcept
returns the number of records (rows) in the database
const DatabaseTable & databaseTable() const
returns the internal database table
std::vector< std::pair< std::size_t, std::size_t > > __ranges
the set of rows' ranges within the database in which learning is done
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55
+ Here is the call graph for this function:

◆ useDatabaseRanges()

template<template< typename > class XALLOC>
void gum::learning::genericBNLearner::useDatabaseRanges ( const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &  new_ranges)

use a new set of database rows' ranges to perform learning

Parameters
ranges: a set of pairs {(X1,Y1),...,(Xn,Yn)} of database row indices. Subsequent learnings are then performed only on the union of the rows [Xi,Yi), i in {1,...,n}. This is useful, e.g., when performing cross-validation, in which part of the database should be ignored. An empty set of ranges is equivalent to an interval [X,Y) ranging over the whole database.

Definition at line 101 of file genericBNLearner_tpl.h.

References __no_apriori, __ranges, __score_database, gum::learning::genericBNLearner::Database::parser(), and gum::learning::Score< ALLOC >::setRanges().

104  {
105  // use a score to detect whether the ranges are ok
106  ScoreLog2Likelihood<> score(__score_database.parser(), *__no_apriori);
107  score.setRanges(new_ranges);
108  __ranges = score.ranges();
109  }
Database __score_database
the database to be used by the scores and parameter estimators
std::vector< std::pair< std::size_t, std::size_t > > __ranges
the set of rows' ranges within the database in which learning is done
DBRowGeneratorParser & parser()
returns the parser for the database
+ Here is the call graph for this function:

◆ useEM()

INLINE void gum::learning::genericBNLearner::useEM ( const double  epsilon)

use the EM algorithm to learn parameters

if epsilon=0, EM is not used

Definition at line 301 of file genericBNLearner_inl.h.

References __EMepsilon, and epsilon().

301  {
302  __EMepsilon = epsilon;
303  }
double __EMepsilon
epsilon for EM; if epsilon=0.0, EM is not used
double epsilon() const
Get the value of epsilon.
+ Here is the call graph for this function:

◆ useGreedyHillClimbing()

INLINE void gum::learning::genericBNLearner::useGreedyHillClimbing ( )

indicate that we wish to use a greedy hill climbing algorithm

Definition at line 288 of file genericBNLearner_inl.h.

References __selected_algo, and GREEDY_HILL_CLIMBING.

◆ useK2() [1/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const Sequence< NodeId > &  order)

indicate that we wish to use K2

Definition at line 276 of file genericBNLearner_inl.h.

References __K2, __selected_algo, K2, and gum::learning::K2::setOrder().

276  {
277  __selected_algo = AlgoType::K2;
278  __K2.setOrder(order);
279  }
AlgoType __selected_algo
the selected learning algorithm
void setOrder(const Sequence< NodeId > &order)
sets the order on the variables
+ Here is the call graph for this function:

◆ useK2() [2/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const std::vector< NodeId > &  order)

indicate that we wish to use K2

Definition at line 282 of file genericBNLearner_inl.h.

References __K2, __selected_algo, K2, and gum::learning::K2::setOrder().

282  {
283  __selected_algo = AlgoType::K2;
284  __K2.setOrder(order);
285  }
AlgoType __selected_algo
the selected learning algorithm
void setOrder(const Sequence< NodeId > &order)
sets the order on the variables
+ Here is the call graph for this function:

◆ useLocalSearchWithTabuList()

INLINE void gum::learning::genericBNLearner::useLocalSearchWithTabuList ( Size  tabu_size = 100,
Size  nb_decrease = 2 
)

indicate that we wish to use a local search with tabu list

Parameters
tabu_size: the size of the tabu list
nb_decrease: the maximum number of consecutive score-decreasing changes that may be applied

Definition at line 293 of file genericBNLearner_inl.h.

References __constraint_TabuList, __local_search_with_tabu_list, __selected_algo, LOCAL_SEARCH_WITH_TABU_LIST, gum::learning::LocalSearchWithTabuList::setMaxNbDecreasingChanges(), and gum::learning::StructuralConstraintTabuList::setTabuListSize().

294  {
295  __selected_algo = AlgoType::LOCAL_SEARCH_WITH_TABU_LIST;
296  __constraint_TabuList.setTabuListSize(tabu_size);
297  __local_search_with_tabu_list.setMaxNbDecreasingChanges(nb_decrease);
298  }
AlgoType __selected_algo
the selected learning algorithm
void setTabuListSize(Size new_size)
sets the size of the tabu list
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
StructuralConstraintTabuList __constraint_TabuList
the constraint for tabu lists
void setMaxNbDecreasingChanges(Size nb)
set the max number of changes decreasing the score that we allow to apply
+ Here is the call graph for this function:

◆ useMDL()

INLINE void gum::learning::genericBNLearner::useMDL ( )

indicate that we wish to use the MDL correction for 3off2

Exceptions
OperationNotAllowed: raised when 3off2 is not the selected algorithm

Definition at line 246 of file genericBNLearner_inl.h.

References __3off2_kmode, __selected_algo, GUM_ERROR, and MIIC_THREE_OFF_TWO.

246  {
247  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
248  GUM_ERROR(OperationNotAllowed,
249  "You must use the 3off2 algorithm before selecting "
250  << "the MDL score");
251  }
252  __3off2_kmode = CorrectedMutualInformation<>::KModeTypes::MDL;
253  }
AlgoType __selected_algo
the selected learning algorithm
CorrectedMutualInformation<>::KModeTypes __3off2_kmode
the penalty used in 3off2
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ useMIIC()

INLINE void gum::learning::genericBNLearner::useMIIC ( )

indicate that we wish to use MIIC

Definition at line 230 of file genericBNLearner_inl.h.

References __miic_3off2, __selected_algo, MIIC_THREE_OFF_TWO, and gum::learning::Miic::setMiicBehaviour().

230  {
231  __selected_algo = AlgoType::MIIC_THREE_OFF_TWO;
232  __miic_3off2.setMiicBehaviour();
233  }
AlgoType __selected_algo
the selected learning algorithm
Miic __miic_3off2
the 3off2 algorithm
void setMiicBehaviour()
Sets the orientation phase to follow the one of the MIIC algorithm.
Definition: Miic.cpp:1064
+ Here is the call graph for this function:

◆ useNML()

INLINE void gum::learning::genericBNLearner::useNML ( )

indicate that we wish to use the NML correction for 3off2

Exceptions
OperationNotAllowed: raised when 3off2 is not the selected algorithm

Definition at line 236 of file genericBNLearner_inl.h.

References __3off2_kmode, __selected_algo, GUM_ERROR, and MIIC_THREE_OFF_TWO.

236  {
237  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
238  GUM_ERROR(OperationNotAllowed,
239  "You must use the 3off2 algorithm before selecting "
240  << "the NML score");
241  }
242  __3off2_kmode = CorrectedMutualInformation<>::KModeTypes::NML;
243  }
AlgoType __selected_algo
the selected learning algorithm
CorrectedMutualInformation<>::KModeTypes __3off2_kmode
the penalty used in 3off2
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ useNoApriori()

INLINE void gum::learning::genericBNLearner::useNoApriori ( )

use no apriori

Definition at line 461 of file genericBNLearner_inl.h.

References __apriori_type, checkScoreAprioriCompatibility(), and NO_APRIORI.

461  {
462  __apriori_type = AprioriType::NO_APRIORI;
463  checkScoreAprioriCompatibility();
464  }
AprioriType __apriori_type
the a priori selected for the score and parameters
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
+ Here is the call graph for this function:

◆ useNoCorr()

INLINE void gum::learning::genericBNLearner::useNoCorr ( )

indicate that we wish to use the NoCorr correction for 3off2

Exceptions
OperationNotAllowed: raised when 3off2 is not the selected algorithm

Definition at line 256 of file genericBNLearner_inl.h.

References __3off2_kmode, __selected_algo, GUM_ERROR, and MIIC_THREE_OFF_TWO.

256  {
257  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
258  GUM_ERROR(OperationNotAllowed,
259  "You must use the 3off2 algorithm before selecting "
260  << "the NoCorr score");
261  }
262  __3off2_kmode = CorrectedMutualInformation<>::KModeTypes::NoCorr;
263  }
AlgoType __selected_algo
the selected learning algorithm
CorrectedMutualInformation<>::KModeTypes __3off2_kmode
the penalty used in 3off2
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55

◆ useScoreAIC()

INLINE void gum::learning::genericBNLearner::useScoreAIC ( )

indicate that we wish to use an AIC score

Definition at line 183 of file genericBNLearner_inl.h.

References __score_type, AIC, and checkScoreAprioriCompatibility().

183  {
184  __score_type = ScoreType::AIC;
185  checkScoreAprioriCompatibility();
186  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning
+ Here is the call graph for this function:

◆ useScoreBD()

INLINE void gum::learning::genericBNLearner::useScoreBD ( )

indicate that we wish to use a BD score

Definition at line 189 of file genericBNLearner_inl.h.

References __score_type, BD, and checkScoreAprioriCompatibility().

189  {
190  __score_type = ScoreType::BD;
191  checkScoreAprioriCompatibility();
192  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning
+ Here is the call graph for this function:

◆ useScoreBDeu()

INLINE void gum::learning::genericBNLearner::useScoreBDeu ( )

indicate that we wish to use a BDeu score

Definition at line 195 of file genericBNLearner_inl.h.

References __score_type, BDeu, and checkScoreAprioriCompatibility().

195  {
196  __score_type = ScoreType::BDeu;
197  checkScoreAprioriCompatibility();
198  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning
+ Here is the call graph for this function:

◆ useScoreBIC()

INLINE void gum::learning::genericBNLearner::useScoreBIC ( )

indicate that we wish to use a BIC score

Definition at line 201 of file genericBNLearner_inl.h.

References __score_type, BIC, and checkScoreAprioriCompatibility().

201  {
202  __score_type = ScoreType::BIC;
203  checkScoreAprioriCompatibility();
204  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning
+ Here is the call graph for this function:

◆ useScoreK2()

INLINE void gum::learning::genericBNLearner::useScoreK2 ( )

indicate that we wish to use a K2 score

Definition at line 207 of file genericBNLearner_inl.h.

References __score_type, checkScoreAprioriCompatibility(), and K2.

207  {
208  __score_type = ScoreType::K2;
209  checkScoreAprioriCompatibility();
210  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning
+ Here is the call graph for this function:

◆ useScoreLog2Likelihood()

INLINE void gum::learning::genericBNLearner::useScoreLog2Likelihood ( )

indicate that we wish to use a Log2Likelihood score

Definition at line 213 of file genericBNLearner_inl.h.

References __score_type, checkScoreAprioriCompatibility(), and LOG2LIKELIHOOD.

213  {
214  __score_type = ScoreType::LOG2LIKELIHOOD;
215  checkScoreAprioriCompatibility();
216  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning
+ Here is the call graph for this function:

◆ verbosity()

bool gum::learning::genericBNLearner::verbosity ( ) const
inlinevirtual

verbosity

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1092 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::verbosity().

1092  {
1093  if (__current_algorithm != nullptr)
1094  return __current_algorithm->verbosity();
1095  else
1096  GUM_ERROR(FatalError, "No chosen algorithm for learning");
1097  };
const ApproximationScheme * __current_algorithm
bool verbosity() const
Returns true if verbosity is enabled.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:55
+ Here is the call graph for this function:

Member Data Documentation

◆ __3off2_kmode

CorrectedMutualInformation<>::KModeTypes gum::learning::genericBNLearner::__3off2_kmode
protected
the penalty used in 3off2

Definition at line 784 of file genericBNLearner.h.

Referenced by __createCorrectedMutualInformation(), operator=(), useMDL(), useNML(), and useNoCorr().

◆ __apriori

Apriori* gum::learning::genericBNLearner::__apriori {nullptr}
protected

◆ __apriori_database

Database* gum::learning::genericBNLearner::__apriori_database {nullptr}
protected

the database used by the Dirichlet a priori

Definition at line 803 of file genericBNLearner.h.

Referenced by __createApriori(), __learnDAG(), operator=(), and ~genericBNLearner().

◆ __apriori_dbname

std::string gum::learning::genericBNLearner::__apriori_dbname
protected

the filename for the Dirichlet a priori, if any

Definition at line 806 of file genericBNLearner.h.

Referenced by __createApriori(), operator=(), and useAprioriDirichlet().

◆ __apriori_type

AprioriType gum::learning::genericBNLearner::__apriori_type {AprioriType::NO_APRIORI}
protected

the a priori selected for the score and parameters

Definition at line 746 of file genericBNLearner.h.

Referenced by __createApriori(), __getAprioriType(), __learnDAG(), operator=(), useAprioriBDeu(), useAprioriDirichlet(), useAprioriSmoothing(), and useNoApriori().

◆ __apriori_weight

double gum::learning::genericBNLearner::__apriori_weight {1.0f}
protected

the weight of the apriori

Definition at line 754 of file genericBNLearner.h.

Referenced by __createApriori(), __setAprioriWeight(), checkScoreAprioriCompatibility(), and operator=().

◆ __constraint_ForbiddenArcs

StructuralConstraintForbiddenArcs gum::learning::genericBNLearner::__constraint_ForbiddenArcs
protected

the constraint on forbidden arcs

Definition at line 766 of file genericBNLearner.h.

Referenced by __learnDAG(), __prepare_miic_3off2(), addForbiddenArc(), eraseForbiddenArc(), operator=(), and setForbiddenArcs().

◆ __constraint_Indegree

StructuralConstraintIndegree gum::learning::genericBNLearner::__constraint_Indegree
protected

the constraint for indegrees

Definition at line 760 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and setMaxIndegree().

◆ __constraint_MandatoryArcs

StructuralConstraintMandatoryArcs gum::learning::genericBNLearner::__constraint_MandatoryArcs
protected

the constraint on mandatory arcs

Definition at line 772 of file genericBNLearner.h.

Referenced by __learnDAG(), __prepare_miic_3off2(), addMandatoryArc(), eraseMandatoryArc(), operator=(), and setMandatoryArcs().

◆ __constraint_PossibleEdges

StructuralConstraintPossibleEdges gum::learning::genericBNLearner::__constraint_PossibleEdges
protected

the constraint on possible Edges

Definition at line 769 of file genericBNLearner.h.

Referenced by __learnDAG(), addPossibleEdge(), erasePossibleEdge(), and setPossibleEdges().

◆ __constraint_SliceOrder

StructuralConstraintSliceOrder gum::learning::genericBNLearner::__constraint_SliceOrder
protected

the constraint for 2TBNs

Definition at line 757 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and setSliceOrder().

◆ __constraint_TabuList

StructuralConstraintTabuList gum::learning::genericBNLearner::__constraint_TabuList
protected

the constraint for tabu lists

Definition at line 763 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and useLocalSearchWithTabuList().

◆ __current_algorithm

◆ __Dag2BN

DAG2BNLearner gum::learning::genericBNLearner::__Dag2BN
protected

the parametric EM

Definition at line 788 of file genericBNLearner.h.

◆ __EMepsilon

double gum::learning::genericBNLearner::__EMepsilon {0.0}
protected

epsilon for EM; if epsilon=0.0, EM is not used

Definition at line 740 of file genericBNLearner.h.

Referenced by operator=(), and useEM().

◆ __greedy_hill_climbing

GreedyHillClimbing gum::learning::genericBNLearner::__greedy_hill_climbing
protected

the greedy hill climbing algorithm

Definition at line 791 of file genericBNLearner.h.

Referenced by __learnDAG(), and operator=().

◆ __initial_dag

DAG gum::learning::genericBNLearner::__initial_dag
protected

an initial DAG given to learners

Definition at line 809 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and setInitialDAG().

◆ __K2

K2 gum::learning::genericBNLearner::__K2
protected

the K2 algorithm

Definition at line 778 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and useK2().

◆ __local_search_with_tabu_list

LocalSearchWithTabuList gum::learning::genericBNLearner::__local_search_with_tabu_list
protected

the local search with tabu list algorithm

Definition at line 794 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and useLocalSearchWithTabuList().

◆ __miic_3off2

Miic gum::learning::genericBNLearner::__miic_3off2
protected

the 3off2 algorithm

Definition at line 781 of file genericBNLearner.h.

Referenced by __learnDAG(), __prepare_miic_3off2(), latentVariables(), learnMixedStructure(), operator=(), use3off2(), and useMIIC().

◆ __mutual_info

CorrectedMutualInformation* gum::learning::genericBNLearner::__mutual_info {nullptr}
protected

the selected correction for 3off2 and miic

Definition at line 743 of file genericBNLearner.h.

Referenced by __createCorrectedMutualInformation(), __learnDAG(), learnMixedStructure(), operator=(), and ~genericBNLearner().

◆ __no_apriori

AprioriNoApriori* gum::learning::genericBNLearner::__no_apriori {nullptr}
protected

◆ __param_estimator_type

ParamEstimatorType gum::learning::genericBNLearner::__param_estimator_type {ParamEstimatorType::ML}
protected

the type of the parameter estimator

Definition at line 737 of file genericBNLearner.h.

Referenced by __createParamEstimator(), and operator=().

◆ __ranges

std::vector< std::pair< std::size_t, std::size_t > > gum::learning::genericBNLearner::__ranges
protected

the set of rows' ranges within the database in which learning is done

Definition at line 800 of file genericBNLearner.h.

Referenced by __createCorrectedMutualInformation(), __createParamEstimator(), __createScore(), clearDatabaseRanges(), databaseRanges(), operator=(), useCrossValidationFold(), and useDatabaseRanges().

◆ __score

Score* gum::learning::genericBNLearner::__score {nullptr}
protected

the score used

Definition at line 734 of file genericBNLearner.h.

Referenced by __createParamEstimator(), __createScore(), __learnDAG(), operator=(), and ~genericBNLearner().

◆ __score_database

◆ __score_type

ScoreType gum::learning::genericBNLearner::__score_type {ScoreType::BDeu}
protected

◆ __selected_algo

AlgoType gum::learning::genericBNLearner::__selected_algo {AlgoType::GREEDY_HILL_CLIMBING}
protected

◆ onProgress

◆ onStop

Signaler1< std::string > gum::IApproximationSchemeConfiguration::onStop
inherited

Criteria messageApproximationScheme.

Definition at line 62 of file IApproximationSchemeConfiguration.h.

Referenced by gum::ApproximationScheme::_stopScheme(), and distributeStop().


The documentation for this class was generated from the following files: