aGrUM  0.14.1
gum::learning::genericBNLearner Class Reference

A pack of learning algorithms that can easily be used. More...

#include <genericBNLearner.h>

+ Inheritance diagram for gum::learning::genericBNLearner:
+ Collaboration diagram for gum::learning::genericBNLearner:

Public Attributes

Signaler3< Size, double, double > onProgress
 Progression, error and time. More...
 
Signaler1< std::string > onStop
 Criteria messageApproximationScheme. More...
 

Public Member Functions

void setMandatoryArcs (const ArcSet &set)
 assign a set of mandatory arcs More...
 
Constructors / Destructors
 genericBNLearner (const std::string &filename, const std::vector< std::string > &missing_symbols)
 default constructor More...
 
 genericBNLearner (const DatabaseTable<> &db)
 default constructor More...
 
template<typename GUM_SCALAR >
 genericBNLearner (const std::string &filename, const gum::BayesNet< GUM_SCALAR > &src, const std::vector< std::string > &missing_symbols)
 read the database file for the score / parameter estimation and var names More...
 
 genericBNLearner (const genericBNLearner &)
 copy constructor More...
 
 genericBNLearner (genericBNLearner &&)
 move constructor More...
 
virtual ~genericBNLearner ()
 destructor More...
 
Operators
genericBNLearner & operator= (const genericBNLearner &)
 copy operator More...
 
genericBNLearner & operator= (genericBNLearner &&)
 move operator More...
 
Accessors / Modifiers
DAG learnDAG ()
 learn a structure from a file (must have read the db before) More...
 
MixedGraph learnMixedStructure ()
 learn a partial structure from a file (must have read the db before and must have selected miic or 3off2) More...
 
void setInitialDAG (const DAG &)
 sets an initial DAG structure More...
 
const std::vector< std::string > & names () const
 returns the names of the variables in the database More...
 
const std::vector< std::size_t > & domainSizes () const
 returns the domain sizes of the variables in the database More...
 
NodeId idFromName (const std::string &var_name) const
 returns the node id corresponding to a variable name More...
 
const DatabaseTable<> & database () const
 returns the database used by the BNLearner More...
 
void setDatabaseWeight (const double new_weight)
 assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight More...
 
const std::string & nameFromId (NodeId id) const
 returns the variable name corresponding to a given node id More...
 
template<template< typename > class XALLOC>
void useDatabaseRanges (const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &new_ranges)
 use a new set of database rows' ranges to perform learning More...
 
void clearDatabaseRanges ()
 reset the ranges to the one range corresponding to the whole database More...
 
const std::vector< std::pair< std::size_t, std::size_t > > & databaseRanges () const
 returns the current database rows' ranges used for learning More...
 
std::pair< std::size_t, std::size_t > useCrossValidationFold (const std::size_t learning_fold, const std::size_t k_fold)
 sets the ranges of rows to be used for cross-validation learning More...
 
std::pair< double, double > chi2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair for the BNLearner. More...
 
std::pair< double, double > chi2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair for the BNLearner. More...
 
double logLikelihood (const std::vector< NodeId > &vars, const std::vector< NodeId > &knowing={})
 Return the loglikelihood of vars in the base, conditioned by knowing for the BNLearner. More...
 
double logLikelihood (const std::vector< std::string > &vars, const std::vector< std::string > &knowing={})
 Return the loglikelihood of vars in the base, conditioned by knowing for the BNLearner. More...
 
Size nbCols () const
 
Size nbRows () const
 
void useEM (const double epsilon)
 use the EM algorithm to learn parameters More...
 
bool hasMissingValues () const
 returns true if the learner's database has missing values More...
 
Score selection
void useScoreAIC ()
 indicate that we wish to use an AIC score More...
 
void useScoreBD ()
 indicate that we wish to use a BD score More...
 
void useScoreBDeu ()
 indicate that we wish to use a BDeu score More...
 
void useScoreBIC ()
 indicate that we wish to use a BIC score More...
 
void useScoreK2 ()
 indicate that we wish to use a K2 score More...
 
void useScoreLog2Likelihood ()
 indicate that we wish to use a Log2Likelihood score More...
 
A priori selection / parameterization
void setAprioriWeight (double weight)
 sets the apriori weight More...
 
void useNoApriori ()
 use no apriori More...
 
void useAprioriBDeu (double weight=1)
 use the BDeu apriori More...
 
void useAprioriSmoothing (double weight=1)
 use the apriori smoothing More...
 
void useAprioriDirichlet (const std::string &filename, double weight=1)
 use the Dirichlet apriori More...
 
std::string checkScoreAprioriCompatibility ()
 checks whether the current score and apriori are compatible More...
 
Learning algorithm selection
void useGreedyHillClimbing ()
 indicate that we wish to use a greedy hill climbing algorithm More...
 
void useLocalSearchWithTabuList (Size tabu_size=100, Size nb_decrease=2)
 indicate that we wish to use a local search with tabu list More...
 
void useK2 (const Sequence< NodeId > &order)
 indicate that we wish to use K2 More...
 
void useK2 (const std::vector< NodeId > &order)
 indicate that we wish to use K2 More...
 
void use3off2 ()
 indicate that we wish to use 3off2 More...
 
void useMIIC ()
 indicate that we wish to use MIIC More...
 
3off2/MIIC parameterization and specific results
void useNML ()
 indicate that we wish to use the NML correction for 3off2 More...
 
void useMDL ()
 indicate that we wish to use the MDL correction for 3off2 More...
 
void useNoCorr ()
 indicate that we wish to use the NoCorr correction for 3off2 More...
 
const std::vector< Arc > & latentVariables () const
 get the list of arcs hiding latent variables More...
 
Accessors / Modifiers for adding constraints on learning
void setMaxIndegree (Size max_indegree)
 sets the max indegree More...
 
void setSliceOrder (const NodeProperty< NodeId > &slice_order)
 sets a partial order on the nodes More...
 
void setSliceOrder (const std::vector< std::vector< std::string > > &slices)
 sets a partial order on the nodes More...
 
void setForbiddenArcs (const ArcSet &set)
 assign a set of forbidden arcs More...
 
assign a new forbidden or mandatory arc
void addForbiddenArc (const Arc &arc)
 
void addForbiddenArc (const NodeId tail, const NodeId head)
 
void addForbiddenArc (const std::string &tail, const std::string &head)
 
void addMandatoryArc (const Arc &arc)
 
void addMandatoryArc (const NodeId tail, const NodeId head)
 
void addMandatoryArc (const std::string &tail, const std::string &head)
 
remove a forbidden or mandatory arc
void eraseForbiddenArc (const Arc &arc)
 
void eraseForbiddenArc (const NodeId tail, const NodeId head)
 
void eraseForbiddenArc (const std::string &tail, const std::string &head)
 
void eraseMandatoryArc (const Arc &arc)
 
void eraseMandatoryArc (const NodeId tail, const NodeId head)
 
void eraseMandatoryArc (const std::string &tail, const std::string &head)
 
redistribute signals and implementation of the interface
INLINE void setCurrentApproximationScheme (const ApproximationScheme *approximationScheme)
 distribute signals More...
 
INLINE void distributeProgress (const ApproximationScheme *approximationScheme, Size pourcent, double error, double time)
 distribute signals More...
 
INLINE void distributeStop (const ApproximationScheme *approximationScheme, std::string message)
 distribute signals More...
 
void setEpsilon (double eps)
 Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled, it will be enabled. More...
 
double epsilon () const
 Get the value of epsilon. More...
 
void disableEpsilon ()
 Disable stopping criterion on epsilon. More...
 
void enableEpsilon ()
 Enable stopping criterion on epsilon. More...
 
bool isEnabledEpsilon () const
 
void setMinEpsilonRate (double rate)
 Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled, it will be enabled. More...
 
double minEpsilonRate () const
 Get the value of the minimal epsilon rate. More...
 
void disableMinEpsilonRate ()
 Disable stopping criterion on epsilon rate. More...
 
void enableMinEpsilonRate ()
 Enable stopping criterion on epsilon rate. More...
 
bool isEnabledMinEpsilonRate () const
 
void setMaxIter (Size max)
 stopping criterion on the number of iterations. If the criterion was disabled, it will be enabled. More...
 
Size maxIter () const
 
void disableMaxIter ()
 Disable stopping criterion on max iterations. More...
 
void enableMaxIter ()
 Enable stopping criterion on max iterations. More...
 
bool isEnabledMaxIter () const
 
void setMaxTime (double timeout)
 stopping criterion on timeout. If the criterion was disabled, it will be enabled. More...
 
double maxTime () const
 returns the timeout (in seconds) More...
 
double currentTime () const
 get the current running time in second (double) More...
 
void disableMaxTime ()
 Disable stopping criterion on timeout. More...
 
void enableMaxTime ()
 Enable stopping criterion on timeout. More...
 
bool isEnabledMaxTime () const
 
void setPeriodSize (Size p)
 how many samples between two tests of the stopping criteria More...
 
Size periodSize () const
 how many samples between two tests of the stopping criteria More...
 
void setVerbosity (bool v)
 verbosity More...
 
bool verbosity () const
 verbosity More...
 
ApproximationSchemeSTATE stateApproximationScheme () const
 returns the current approximation scheme state More...
 
Size nbrIterations () const
 
const std::vector< double > & history () const
 
Getters and setters
std::string messageApproximationScheme () const
 Returns the approximation scheme message. More...
 
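useCrossValidationFold, listed above, selects one fold as the test range and learns on the complementary row ranges. A library-free sketch of the underlying range arithmetic (the ⌊i·n/K⌋ fold boundaries are an assumption of this sketch, not a statement about aGrUM's exact cut points):

```cpp
#include <cassert>
#include <cstddef>
#include <utility>
#include <vector>

// Split [0, n) into k_fold folds; return the test range for learning_fold
// together with the complementary ranges used for learning.
std::pair<std::pair<std::size_t, std::size_t>,
          std::vector<std::pair<std::size_t, std::size_t>>>
foldRanges(std::size_t n, std::size_t learning_fold, std::size_t k_fold) {
  const std::size_t lo = learning_fold * n / k_fold;        // fold start
  const std::size_t hi = (learning_fold + 1) * n / k_fold;  // fold end (excl.)
  std::vector<std::pair<std::size_t, std::size_t>> learning;
  if (lo > 0) learning.emplace_back(0, lo);  // rows before the test fold
  if (hi < n) learning.emplace_back(hi, n);  // rows after the test fold
  return {{lo, hi}, learning};
}
```

With n = 10 and k_fold = 5, fold 0 tests on rows [0, 2) and learns on [2, 10); a middle fold yields two learning ranges, one on each side of the test fold.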

Public Types

enum  ScoreType {
  ScoreType::AIC, ScoreType::BD, ScoreType::BDeu, ScoreType::BIC,
  ScoreType::K2, ScoreType::LOG2LIKELIHOOD
}
 an enumeration enabling to select easily the score we wish to use More...
 
enum  ParamEstimatorType { ParamEstimatorType::ML }
 an enumeration to select the type of parameter estimation we shall apply More...
 
enum  AprioriType { AprioriType::NO_APRIORI, AprioriType::SMOOTHING, AprioriType::DIRICHLET_FROM_DATABASE, AprioriType::BDEU }
 an enumeration to select the apriori More...
 
enum  AlgoType { AlgoType::K2, AlgoType::GREEDY_HILL_CLIMBING, AlgoType::LOCAL_SEARCH_WITH_TABU_LIST, AlgoType::MIIC_THREE_OFF_TWO }
 an enumeration to select easily the learning algorithm to use More...
 
enum  ApproximationSchemeSTATE : char {
  ApproximationSchemeSTATE::Undefined, ApproximationSchemeSTATE::Continue, ApproximationSchemeSTATE::Epsilon, ApproximationSchemeSTATE::Rate,
  ApproximationSchemeSTATE::Limit, ApproximationSchemeSTATE::TimeLimit, ApproximationSchemeSTATE::Stopped
}
 The different state of an approximation scheme. More...
 

Protected Attributes

ScoreType __score_type {ScoreType::BDeu}
 the score selected for learning More...
 
Score * __score {nullptr}
 the score used More...
 
ParamEstimatorType __param_estimator_type {ParamEstimatorType::ML}
 the type of the parameter estimator More...
 
double __EMepsilon {0.0}
 epsilon for EM; if epsilon = 0.0, EM is not used More...
 
CorrectedMutualInformation * __mutual_info {nullptr}
 the selected correction for 3off2 and miic More...
 
AprioriType __apriori_type {AprioriType::NO_APRIORI}
 the a priori selected for the score and parameters More...
 
Apriori * __apriori {nullptr}
 the apriori used More...
 
AprioriNoApriori * __no_apriori {nullptr}
 
double __apriori_weight {1.0f}
 the weight of the apriori More...
 
StructuralConstraintSliceOrder __constraint_SliceOrder
 the constraint for 2TBNs More...
 
StructuralConstraintIndegree __constraint_Indegree
 the constraint for indegrees More...
 
StructuralConstraintTabuList __constraint_TabuList
 the constraint for tabu lists More...
 
StructuralConstraintForbiddenArcs __constraint_ForbiddenArcs
 the constraint on forbidden arcs More...
 
StructuralConstraintMandatoryArcs __constraint_MandatoryArcs
 the constraint on mandatory arcs More...
 
AlgoType __selected_algo {AlgoType::GREEDY_HILL_CLIMBING}
 the selected learning algorithm More...
 
K2 __K2
 the K2 algorithm More...
 
Miic __miic_3off2
 the 3off2 algorithm More...
 
CorrectedMutualInformation<>::KModeTypes __3off2_kmode
 the penalty used in 3off2 More...
 
DAG2BNLearner __Dag2BN
 the parametric EM More...
 
GreedyHillClimbing __greedy_hill_climbing
 the greedy hill climbing algorithm More...
 
LocalSearchWithTabuList __local_search_with_tabu_list
 the local search with tabu list algorithm More...
 
Database __score_database
 the database to be used by the scores and parameter estimators More...
 
std::vector< std::pair< std::size_t, std::size_t > > __ranges
 the set of rows' ranges within the database in which learning is done More...
 
Database * __apriori_database {nullptr}
 the database used by the Dirichlet a priori More...
 
std::string __apriori_dbname
 the filename for the Dirichlet a priori, if any More...
 
DAG __initial_dag
 an initial DAG given to learners More...
 
const ApproximationScheme * __current_algorithm {nullptr}
 

Protected Member Functions

void __createApriori ()
 create the apriori used for learning More...
 
void __createScore ()
 create the score used for learning More...
 
ParamEstimator * __createParamEstimator (DBRowGeneratorParser<> &parser, bool take_into_account_score=true)
 create the parameter estimator used for learning More...
 
DAG __learnDAG ()
 returns the DAG learnt More...
 
MixedGraph __prepare_miic_3off2 ()
 prepares the initial graph for 3off2 or miic More...
 
const std::string & __getAprioriType () const
 returns the type (as a string) of a given apriori More...
 
void __createCorrectedMutualInformation ()
 create the Corrected Mutual Information instance for Miic/3off2 More...
 

Static Protected Member Functions

static DatabaseTable<> __readFile (const std::string &filename, const std::vector< std::string > &missing_symbols)
 reads a file and returns a databaseVectInRam More...
 
static void __checkFileName (const std::string &filename)
 checks whether the extension of a CSV filename is correct More...
 

Classes

class  Database
 a helper to easily read databases More...
 

Detailed Description

A pack of learning algorithms that can easily be used.

The pack currently contains K2, GreedyHillClimbing and LocalSearchWithTabuList, as well as 3off2/MIIC.

Definition at line 103 of file genericBNLearner.h.
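A minimal end-to-end sketch of how this interface is typically used, assuming aGrUM is installed and using gum::learning::BNLearner< double >, the concrete subclass most users instantiate; the file name and the chosen score/apriori/algorithm are placeholders, not recommendations:

```cpp
#include <iostream>

#include <agrum/BN/learning/BNLearner.h>

int main() {
  // "data.csv" is a placeholder; "?" is declared as a missing-value symbol.
  gum::learning::BNLearner< double > learner("data.csv", {"?"});

  learner.useScoreBIC();             // score selection
  learner.useAprioriSmoothing(1.0);  // apriori selection / parameterization
  learner.useGreedyHillClimbing();   // learning algorithm selection

  gum::DAG dag = learner.learnDAG(); // structure only; the db was read above
  std::cout << dag.toString() << std::endl;
  return 0;
}
```

learnDAG() stops at the structure; on BNLearner the parameters can additionally be estimated, and constraints such as addMandatoryArc()/addForbiddenArc() can be set before learning.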

Member Enumeration Documentation

◆ AlgoType

an enumeration to select easily the learning algorithm to use

Enumerator
K2 
GREEDY_HILL_CLIMBING 
LOCAL_SEARCH_WITH_TABU_LIST 
MIIC_THREE_OFF_TWO 

Definition at line 122 of file genericBNLearner.h.

122  {
123  K2,
124  GREEDY_HILL_CLIMBING,
125  LOCAL_SEARCH_WITH_TABU_LIST,
126  MIIC_THREE_OFF_TWO
127  };

◆ ApproximationSchemeSTATE

The different state of an approximation scheme.

Enumerator
Undefined 
Continue 
Epsilon 
Rate 
Limit 
TimeLimit 
Stopped 

Definition at line 63 of file IApproximationSchemeConfiguration.h.

63  : char {
64  Undefined,
65  Continue,
66  Epsilon,
67  Rate,
68  Limit,
69  TimeLimit,
70  Stopped
71  };

◆ AprioriType

an enumeration to select the apriori

Enumerator
NO_APRIORI 
SMOOTHING 
DIRICHLET_FROM_DATABASE 
BDEU 

Definition at line 114 of file genericBNLearner.h.

114  {
115  NO_APRIORI,
116  SMOOTHING,
117  DIRICHLET_FROM_DATABASE,
118  BDEU
119  };

◆ ParamEstimatorType

an enumeration to select the type of parameter estimation we shall apply

Enumerator
ML 

Definition at line 111 of file genericBNLearner.h.

111 { ML };

◆ ScoreType

an enumeration enabling to select easily the score we wish to use

Enumerator
AIC 
BD 
BDeu 
BIC 
K2 
LOG2LIKELIHOOD 

Definition at line 107 of file genericBNLearner.h.

107 { AIC, BD, BDeu, BIC, K2, LOG2LIKELIHOOD };

Constructor & Destructor Documentation

◆ genericBNLearner() [1/5]

gum::learning::genericBNLearner::genericBNLearner ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols 
)

default constructor

read the database file for the score / parameter estimation and var names

Definition at line 187 of file genericBNLearner.cpp.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

189  :
190  __score_database(filename, missing_symbols) {
191  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());
192 
193  // for debugging purposes
194  GUM_CONSTRUCTOR(genericBNLearner);
195  }
Database __score_database
the database to be used by the scores and parameter estimators
genericBNLearner(const std::string &filename, const std::vector< std::string > &missing_symbols)
default constructor
const DatabaseTable & databaseTable() const
returns the internal database table
+ Here is the call graph for this function:

◆ genericBNLearner() [2/5]

gum::learning::genericBNLearner::genericBNLearner ( const DatabaseTable<> &  db)

default constructor

read the database file for the score / parameter estimation and var names

Definition at line 198 of file genericBNLearner.cpp.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

198  :
199  __score_database(db) {
200  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());
201 
202  // for debugging purposes
203  GUM_CONSTRUCTOR(genericBNLearner);
204  }
+ Here is the call graph for this function:

◆ genericBNLearner() [3/5]

template<typename GUM_SCALAR >
gum::learning::genericBNLearner::genericBNLearner ( const std::string &  filename,
const gum::BayesNet< GUM_SCALAR > &  src,
const std::vector< std::string > &  missing_symbols 
)

read the database file for the score / parameter estimation and var names

Parameters
filename The file to learn from.
modalities indicate for some nodes (not necessarily all the nodes of the BN) which modalities they should have and in which order these modalities should be stored in the nodes. For instance, if modalities = { 1 -> {True, False, Big} }, then the node of id 1 in the BN will have 3 modalities, the first one being True, the second being False, and the third being Big.
parse_database if true, the modalities specified by the user will be considered as a superset of the modalities of the variables. A parsing of the database then determines which ones are really necessary and keeps them in the order specified by the user (NodeProperty modalities). If parse_database is set to false (the default), then the modalities specified by the user are considered to be exactly those of the variables of the BN (as a consequence, if other values are found in the database, an exception will be raised during learning).

Definition at line 88 of file genericBNLearner_tpl.h.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

91  :
92  __score_database(filename, bn, missing_symbols) {
93  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());
94  GUM_CONSTRUCTOR(genericBNLearner);
95  }
+ Here is the call graph for this function:

◆ genericBNLearner() [4/5]

gum::learning::genericBNLearner::genericBNLearner ( const genericBNLearner &  from)

copy constructor

Definition at line 207 of file genericBNLearner.cpp.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

207  :
208  __score_type(from.__score_type),
209  __param_estimator_type(from.__param_estimator_type),
210  __EMepsilon(from.__EMepsilon), __apriori_type(from.__apriori_type),
211  __apriori_weight(from.__apriori_weight),
212  __constraint_SliceOrder(from.__constraint_SliceOrder),
213  __constraint_Indegree(from.__constraint_Indegree),
214  __constraint_TabuList(from.__constraint_TabuList),
215  __constraint_ForbiddenArcs(from.__constraint_ForbiddenArcs),
216  __constraint_MandatoryArcs(from.__constraint_MandatoryArcs),
217  __selected_algo(from.__selected_algo), __K2(from.__K2),
218  __miic_3off2(from.__miic_3off2), __3off2_kmode(from.__3off2_kmode),
219  __greedy_hill_climbing(from.__greedy_hill_climbing),
220  __local_search_with_tabu_list(from.__local_search_with_tabu_list),
221  __score_database(from.__score_database), __ranges(from.__ranges),
222  __apriori_dbname(from.__apriori_dbname),
223  __initial_dag(from.__initial_dag) {
224  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());
225 
226  // for debugging purposes
227  GUM_CONS_CPY(genericBNLearner);
228  }
AlgoType __selected_algo
the selected learning algorithm
Database __score_database
the database to be used by the scores and parameter estimators
CorrectedMutualInformation ::KModeTypes __3off2_kmode
the penalty used in 3off2
double __EMepsilon
epsilon for EM. if espilon=0.0 : no EM
StructuralConstraintSliceOrder __constraint_SliceOrder
the constraint for 2TBNs
Miic __miic_3off2
the 3off2 algorithm
ParamEstimatorType __param_estimator_type
the type of the parameter estimator
AprioriType __apriori_type
the a priori selected for the score and parameters
DAG __initial_dag
an initial DAG given to learners
StructuralConstraintMandatoryArcs __constraint_MandatoryArcs
the constraint on mandatory arcs
genericBNLearner(const std::string &filename, const std::vector< std::string > &missing_symbols)
default constructor
LocalSearchWithTabuList __local_search_with_tabu_list
the local search with tabu list algorithm
StructuralConstraintTabuList __constraint_TabuList
the constraint for tabu lists
std::string __apriori_dbname
the filename for the Dirichlet a priori, if any
GreedyHillClimbing __greedy_hill_climbing
the greedy hill climbing algorithm
const DatabaseTable & databaseTable() const
returns the internal database table
std::vector< std::pair< std::size_t, std::size_t > > __ranges
the set of rows' ranges within the database in which learning is done
StructuralConstraintIndegree __constraint_Indegree
the constraint for indegrees
ScoreType __score_type
the score selected for learning
double __apriori_weight
the weight of the apriori
StructuralConstraintForbiddenArcs __constraint_ForbiddenArcs
the constraint on forbidden arcs
+ Here is the call graph for this function:

◆ genericBNLearner() [5/5]

gum::learning::genericBNLearner::genericBNLearner ( genericBNLearner &&  from)

move constructor

Definition at line 230 of file genericBNLearner.cpp.

References __no_apriori, __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

230  :
231  __score_type(from.__score_type),
232  __param_estimator_type(from.__param_estimator_type),
233  __EMepsilon(from.__EMepsilon), __apriori_type(from.__apriori_type),
234  __apriori_weight(from.__apriori_weight),
235  __constraint_SliceOrder(std::move(from.__constraint_SliceOrder)),
236  __constraint_Indegree(std::move(from.__constraint_Indegree)),
237  __constraint_TabuList(std::move(from.__constraint_TabuList)),
238  __constraint_ForbiddenArcs(std::move(from.__constraint_ForbiddenArcs)),
239  __constraint_MandatoryArcs(std::move(from.__constraint_MandatoryArcs)),
240  __selected_algo(from.__selected_algo), __K2(std::move(from.__K2)),
241  __miic_3off2(std::move(from.__miic_3off2)),
242  __3off2_kmode(from.__3off2_kmode),
243  __greedy_hill_climbing(std::move(from.__greedy_hill_climbing)),
 244  __local_search_with_tabu_list(
 245  std::move(from.__local_search_with_tabu_list)),
246  __score_database(std::move(from.__score_database)),
247  __ranges(std::move(from.__ranges)),
248  __apriori_dbname(std::move(from.__apriori_dbname)),
249  __initial_dag(std::move(from.__initial_dag)) {
250  __no_apriori = new AprioriNoApriori<>(__score_database.databaseTable());
251 
252  // for debugging purposes
253  GUM_CONS_MOV(genericBNLearner);
254  }
+ Here is the call graph for this function:

◆ ~genericBNLearner()

gum::learning::genericBNLearner::~genericBNLearner ( )
virtual

destructor

Definition at line 256 of file genericBNLearner.cpp.

References __apriori, __apriori_database, __mutual_info, __no_apriori, and __score.

256  {
257  if (__score) delete __score;
258 
259  if (__apriori) delete __apriori;
260 
261  if (__no_apriori) delete __no_apriori;
262 
 263  if (__apriori_database) delete __apriori_database;
 264 
265  if (__mutual_info) delete __mutual_info;
266 
267  GUM_DESTRUCTOR(genericBNLearner);
268  }
Score * __score
the score used
CorrectedMutualInformation * __mutual_info
the selected correction for 3off2 and miic
genericBNLearner(const std::string &filename, const std::vector< std::string > &missing_symbols)
default constructor
Database * __apriori_database
the database used by the Dirichlet a priori
Apriori * __apriori
the apriori used

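The destructor above frees five raw owning pointers by hand. Purely as an illustration (this is not aGrUM's design, and Score/Apriori below are stand-in types, not the real classes), the same ownership pattern written with std::unique_ptr needs no user-defined destructor:

```cpp
#include <memory>
#include <string>

// Hypothetical stand-ins for the pointer-held aGrUM types.
struct Score   { std::string name; };
struct Apriori { std::string name; };

struct LearnerSketch {
  std::unique_ptr<Score>   score;    // was: Score* __score {nullptr}
  std::unique_ptr<Apriori> apriori;  // was: Apriori* __apriori {nullptr}
  // No "~LearnerSketch()" needed: each unique_ptr deletes what it owns,
  // and an empty pointer (the {nullptr} case) is destroyed as a no-op.
};
```

The `if (ptr) delete ptr;` chains then disappear entirely, since deleting through an empty smart pointer is already safe.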
Member Function Documentation

◆ __checkFileName()

void gum::learning::genericBNLearner::__checkFileName ( const std::string &  filename)
staticprotected

checks whether the extension of a CSV filename is correct

Definition at line 407 of file genericBNLearner.cpp.

References GUM_ERROR.

Referenced by __readFile(), and gum::learning::genericBNLearner::Database::Database().

407  {
408  // get the extension of the file
409  Size filename_size = Size(filename.size());
410 
411  if (filename_size < 4) {
412  GUM_ERROR(FormatNotFound,
413  "genericBNLearner could not determine the "
414  "file type of the database");
415  }
416 
417  std::string extension = filename.substr(filename.size() - 4);
418  std::transform(
419  extension.begin(), extension.end(), extension.begin(), ::tolower);
420 
421  if (extension != ".csv") {
422  GUM_ERROR(
423  OperationNotAllowed,
424  "genericBNLearner does not support yet this type of database file");
425  }
426  }
std::size_t Size
In aGrUM, hashed values are unsigned long int.
Definition: types.h:45
#define GUM_ERROR(type, msg)
Definition: exceptions.h:52
+ Here is the caller graph for this function:
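The extension test above lowercases the last four characters of the filename and compares them to ".csv". The same check in dependency-free C++ (a sketch; the real method raises FormatNotFound / OperationNotAllowed exceptions rather than returning a bool):

```cpp
#include <algorithm>
#include <cassert>
#include <cctype>
#include <string>

// Returns true iff the filename ends in ".csv" (case-insensitive),
// mirroring the extension test performed by __checkFileName.
bool hasCsvExtension(const std::string& filename) {
  if (filename.size() < 4) return false;  // too short to hold ".csv"
  std::string ext = filename.substr(filename.size() - 4);
  std::transform(ext.begin(), ext.end(), ext.begin(),
                 [](unsigned char c) { return std::tolower(c); });
  return ext == ".csv";
}
```

Note the `unsigned char` cast before std::tolower: calling it on a plain (possibly negative) char is undefined behavior.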

◆ __createApriori()

void gum::learning::genericBNLearner::__createApriori ( )
protected

create the apriori used for learning

Definition at line 456 of file genericBNLearner.cpp.

References __apriori, __apriori_database, __apriori_dbname, __apriori_type, __apriori_weight, __score_database, BDEU, gum::learning::genericBNLearner::Database::databaseTable(), DIRICHLET_FROM_DATABASE, GUM_ERROR, gum::learning::genericBNLearner::Database::missingSymbols(), NO_APRIORI, gum::learning::genericBNLearner::Database::nodeId2Columns(), gum::learning::genericBNLearner::Database::parser(), gum::learning::Apriori< ALLOC >::setWeight(), and SMOOTHING.

Referenced by chi2(), learnDAG(), and logLikelihood().

456  {
457  // first, save the old apriori, to be delete if everything is ok
458  Apriori<>* old_apriori = __apriori;
459 
460  // create the new apriori
461  switch (__apriori_type) {
 462  case AprioriType::NO_APRIORI:
463  __apriori = new AprioriNoApriori<>(__score_database.databaseTable(),
465  break;
466 
 467  case AprioriType::SMOOTHING:
468  __apriori = new AprioriSmoothing<>(__score_database.databaseTable(),
470  break;
471 
 472  case AprioriType::DIRICHLET_FROM_DATABASE:
473  if (__apriori_database != nullptr) {
474  delete __apriori_database;
475  __apriori_database = nullptr;
476  }
477 
478  __apriori_database = new Database(__apriori_dbname,
481 
482  __apriori = new AprioriDirichletFromDatabase<>(
486  break;
487 
488  case AprioriType::BDEU:
489  __apriori = new AprioriBDeu<>(__score_database.databaseTable(),
491  break;
492 
493  default:
494  GUM_ERROR(OperationNotAllowed,
495  "The BNLearner does not support yet this apriori");
496  }
497 
498  // do not forget to assign a weight to the apriori
499  __apriori->setWeight(__apriori_weight);
500 
501  // remove the old apriori, if any
502  if (old_apriori != nullptr) delete old_apriori;
503  }
Database __score_database
the database to be used by the scores and parameter estimators
const std::vector< std::string > & missingSymbols() const
returns the set of missing symbols taken into account
AprioriType __apriori_type
the a priori selected for the score and parameters
Database * __apriori_database
the database used by the Dirichlet a priori
Apriori * __apriori
the apriori used
std::string __apriori_dbname
the filename for the Dirichlet a priori, if any
const DatabaseTable & databaseTable() const
returns the internal database table
const Bijection< NodeId, std::size_t > & nodeId2Columns() const
returns the mapping between node ids and their columns in the database
DBRowGeneratorParser & parser()
returns the parser for the database
double __apriori_weight
the weight of the apriori
#define GUM_ERROR(type, msg)
Definition: exceptions.h:52
+ Here is the call graph for this function:
+ Here is the caller graph for this function:
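__createApriori is a small factory: it switches on the selected AprioriType, constructs the matching prior, then applies the common weight. The pattern, reduced to dependency-free C++ (the types below are illustrative stand-ins, not aGrUM classes):

```cpp
#include <memory>
#include <stdexcept>
#include <string>

enum class AprioriType { NO_APRIORI, SMOOTHING, BDEU };

struct Prior {
  std::string kind;
  double weight{1.0};
  void setWeight(double w) { weight = w; }
};

// Build the prior selected by `type`, then assign the shared weight,
// mirroring the switch + setWeight structure of __createApriori.
std::unique_ptr<Prior> createApriori(AprioriType type, double weight) {
  std::unique_ptr<Prior> prior;
  switch (type) {
    case AprioriType::NO_APRIORI:
      prior = std::make_unique<Prior>(Prior{"none"});
      break;
    case AprioriType::SMOOTHING:
      prior = std::make_unique<Prior>(Prior{"smoothing"});
      break;
    case AprioriType::BDEU:
      prior = std::make_unique<Prior>(Prior{"BDeu"});
      break;
    default:
      throw std::invalid_argument("unsupported apriori");
  }
  prior->setWeight(weight);  // "do not forget to assign a weight" (cf. above)
  return prior;
}
```

Centralizing the setWeight call after the switch, as the original does, guarantees every branch gets the weight applied exactly once.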

◆ __createCorrectedMutualInformation()

void gum::learning::genericBNLearner::__createCorrectedMutualInformation ( )
protected

create the Corrected Mutual Information instance for Miic/3off2

Definition at line 653 of file genericBNLearner.cpp.

References __3off2_kmode, __mutual_info, __no_apriori, __ranges, __score_database, GUM_ERROR, gum::learning::genericBNLearner::Database::nodeId2Columns(), gum::learning::genericBNLearner::Database::parser(), gum::learning::CorrectedMutualInformation< ALLOC >::useMDL(), gum::learning::CorrectedMutualInformation< ALLOC >::useNML(), and gum::learning::CorrectedMutualInformation< ALLOC >::useNoCorr().

Referenced by __learnDAG().

{
  if (__mutual_info != nullptr) delete __mutual_info;

  __mutual_info =
     new CorrectedMutualInformation<>(__score_database.parser(),
                                      *__no_apriori,
                                      __ranges,
                                      __score_database.nodeId2Columns());

  switch (__3off2_kmode) {
    case CorrectedMutualInformation<>::KModeTypes::MDL:
      __mutual_info->useMDL();
      break;

    case CorrectedMutualInformation<>::KModeTypes::NML:
      __mutual_info->useNML();
      break;

    case CorrectedMutualInformation<>::KModeTypes::NoCorr:
      __mutual_info->useNoCorr();
      break;

    default:
      GUM_ERROR(NotImplementedYet,
                "The BNLearner's corrected mutual information class does "
                << "not support yet penalty mode " << int(__3off2_kmode));
  }
}

◆ __createParamEstimator()

ParamEstimator * gum::learning::genericBNLearner::__createParamEstimator ( DBRowGeneratorParser<> &  parser,
bool  take_into_account_score = true 
)
protected

create the parameter estimator used for learning

Definition at line 563 of file genericBNLearner.cpp.

References __apriori, __no_apriori, __param_estimator_type, __ranges, __score, __score_database, GUM_ERROR, gum::learning::Score< ALLOC >::internalApriori(), ML, gum::learning::genericBNLearner::Database::nodeId2Columns(), and gum::learning::ParamEstimator< ALLOC >::setRanges().

{
  ParamEstimator<>* param_estimator = nullptr;

  // create the new estimator
  switch (__param_estimator_type) {
    case ParamEstimatorType::ML:
      if (take_into_account_score && (__score != nullptr)) {
        param_estimator =
           new ParamEstimatorML<>(parser,
                                  *__apriori,
                                  __score->internalApriori(),
                                  __ranges,
                                  __score_database.nodeId2Columns());
      } else {
        param_estimator =
           new ParamEstimatorML<>(parser,
                                  *__apriori,
                                  *__no_apriori,
                                  __ranges,
                                  __score_database.nodeId2Columns());
      }

      break;

    default:
      GUM_ERROR(OperationNotAllowed,
                "genericBNLearner does not support "
                << "yet this parameter estimator");
  }

  // assign the set of ranges
  param_estimator->setRanges(__ranges);

  return param_estimator;
}

◆ __createScore()

void gum::learning::genericBNLearner::__createScore ( )
protected

create the score used for learning

Definition at line 505 of file genericBNLearner.cpp.

References __apriori, __ranges, __score, __score_database, __score_type, AIC, BD, BDeu, BIC, GUM_ERROR, K2, LOG2LIKELIHOOD, gum::learning::genericBNLearner::Database::nodeId2Columns(), and gum::learning::genericBNLearner::Database::parser().

Referenced by learnDAG().

{
  // first, save the old score, to be deleted if everything is ok
  Score<>* old_score = __score;

  // create the new scoring function
  switch (__score_type) {
    case ScoreType::AIC:
      __score = new ScoreAIC<>(__score_database.parser(),
                               *__apriori,
                               __ranges,
                               __score_database.nodeId2Columns());
      break;

    case ScoreType::BD:
      __score = new ScoreBD<>(__score_database.parser(),
                              *__apriori,
                              __ranges,
                              __score_database.nodeId2Columns());
      break;

    case ScoreType::BDeu:
      __score = new ScoreBDeu<>(__score_database.parser(),
                                *__apriori,
                                __ranges,
                                __score_database.nodeId2Columns());
      break;

    case ScoreType::BIC:
      __score = new ScoreBIC<>(__score_database.parser(),
                               *__apriori,
                               __ranges,
                               __score_database.nodeId2Columns());
      break;

    case ScoreType::K2:
      __score = new ScoreK2<>(__score_database.parser(),
                              *__apriori,
                              __ranges,
                              __score_database.nodeId2Columns());
      break;

    case ScoreType::LOG2LIKELIHOOD:
      __score = new ScoreLog2Likelihood<>(__score_database.parser(),
                                          *__apriori,
                                          __ranges,
                                          __score_database.nodeId2Columns());
      break;

    default:
      GUM_ERROR(OperationNotAllowed,
                "genericBNLearner does not support yet this score");
  }

  // remove the old score, if any
  if (old_score != nullptr) delete old_score;
}

◆ __getAprioriType()

INLINE const std::string & gum::learning::genericBNLearner::__getAprioriType ( ) const
protected

returns the type (as a string) of a given apriori

Definition at line 412 of file genericBNLearner_inl.h.

References __apriori_type, BDEU, DIRICHLET_FROM_DATABASE, GUM_ERROR, NO_APRIORI, and SMOOTHING.

Referenced by checkScoreAprioriCompatibility().

{
  switch (__apriori_type) {
    case AprioriType::NO_APRIORI: return AprioriNoAprioriType::type;

    case AprioriType::SMOOTHING: return AprioriSmoothingType::type;

    case AprioriType::DIRICHLET_FROM_DATABASE: return AprioriDirichletType::type;

    case AprioriType::BDEU: return AprioriBDeuType::type;

    default:
      GUM_ERROR(OperationNotAllowed,
                "genericBNLearner getAprioriType does "
                "not support yet this apriori");
  }
}

◆ __learnDAG()

DAG gum::learning::genericBNLearner::__learnDAG ( )
protected

returns the DAG learnt

Definition at line 681 of file genericBNLearner.cpp.

References __apriori_database, __apriori_type, __constraint_ForbiddenArcs, __constraint_Indegree, __constraint_MandatoryArcs, __constraint_SliceOrder, __constraint_TabuList, __createCorrectedMutualInformation(), __greedy_hill_climbing, __initial_dag, __K2, __local_search_with_tabu_list, __miic_3off2, __mutual_info, __prepare_miic_3off2(), __score, __score_database, __selected_algo, gum::DAG::addArc(), gum::NodeGraphPart::addNodeWithId(), gum::learning::K2::approximationScheme(), gum::learning::StructuralConstraintForbiddenArcs::arcs(), gum::learning::StructuralConstraintMandatoryArcs::arcs(), gum::learning::genericBNLearner::Database::databaseTable(), DIRICHLET_FROM_DATABASE, gum::ArcGraphPart::eraseArc(), gum::NodeGraphPart::exists(), GREEDY_HILL_CLIMBING, GUM_ERROR, gum::learning::IDatabaseTable< T_DATA, ALLOC >::hasMissingValues(), K2, gum::learning::K2::learnStructure(), gum::learning::GreedyHillClimbing::learnStructure(), gum::learning::LocalSearchWithTabuList::learnStructure(), gum::learning::Miic::learnStructure(), LOCAL_SEARCH_WITH_TABU_LIST, MIIC_THREE_OFF_TWO, gum::learning::K2::order(), and gum::SequenceImplementation< Key, Alloc, Gen >::pos().

Referenced by learnDAG().

{
  // check that the database does not contain any missing value
  if (__score_database.databaseTable().hasMissingValues()
      || ((__apriori_database != nullptr)
          && (__apriori_type == AprioriType::DIRICHLET_FROM_DATABASE)
          && __apriori_database->databaseTable().hasMissingValues())) {
    GUM_ERROR(MissingValueInDatabase,
              "For the moment, the BNLearner is unable to cope "
              "with missing values in databases");
  }

  // add the mandatory arcs to the initial dag and remove the forbidden ones
  // from the initial graph
  DAG init_graph = __initial_dag;

  const ArcSet& mandatory_arcs = __constraint_MandatoryArcs.arcs();

  for (const auto& arc : mandatory_arcs) {
    if (!init_graph.exists(arc.tail())) init_graph.addNodeWithId(arc.tail());

    if (!init_graph.exists(arc.head())) init_graph.addNodeWithId(arc.head());

    init_graph.addArc(arc.tail(), arc.head());
  }

  const ArcSet& forbidden_arcs = __constraint_ForbiddenArcs.arcs();

  for (const auto& arc : forbidden_arcs) {
    init_graph.eraseArc(arc);
  }

  switch (__selected_algo) {
    // ========================================================================
    case AlgoType::MIIC_THREE_OFF_TWO: {
      BNLearnerListener listener(this, __miic_3off2);
      // create the mixedGraph
      MixedGraph mgraph = this->__prepare_miic_3off2();

      // recreate the corrected mutual information
      __createCorrectedMutualInformation();

      return __miic_3off2.learnStructure(*__mutual_info, mgraph);
    }

    // ========================================================================
    case AlgoType::GREEDY_HILL_CLIMBING: {
      BNLearnerListener listener(this, __greedy_hill_climbing);
      StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                     StructuralConstraintForbiddenArcs,
                                     StructuralConstraintSliceOrder >
         gen_constraint;
      static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
         __constraint_MandatoryArcs;
      static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
         __constraint_ForbiddenArcs;
      static_cast< StructuralConstraintSliceOrder& >(gen_constraint) =
         __constraint_SliceOrder;

      GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(
         gen_constraint);

      StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                     StructuralConstraintDAG >
         sel_constraint;
      static_cast< StructuralConstraintIndegree& >(sel_constraint) =
         __constraint_Indegree;

      GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                    decltype(op_set) >
         selector(*__score, sel_constraint, op_set);

      return __greedy_hill_climbing.learnStructure(selector, init_graph);
    }

    // ========================================================================
    case AlgoType::LOCAL_SEARCH_WITH_TABU_LIST: {
      BNLearnerListener listener(this, __local_search_with_tabu_list);
      StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                     StructuralConstraintForbiddenArcs,
                                     StructuralConstraintSliceOrder >
         gen_constraint;
      static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
         __constraint_MandatoryArcs;
      static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
         __constraint_ForbiddenArcs;
      static_cast< StructuralConstraintSliceOrder& >(gen_constraint) =
         __constraint_SliceOrder;

      GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(
         gen_constraint);

      StructuralConstraintSetStatic< StructuralConstraintTabuList,
                                     StructuralConstraintIndegree,
                                     StructuralConstraintDAG >
         sel_constraint;
      static_cast< StructuralConstraintTabuList& >(sel_constraint) =
         __constraint_TabuList;
      static_cast< StructuralConstraintIndegree& >(sel_constraint) =
         __constraint_Indegree;

      GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                    decltype(op_set) >
         selector(*__score, sel_constraint, op_set);

      return __local_search_with_tabu_list.learnStructure(selector,
                                                          init_graph);
    }

    // ========================================================================
    case AlgoType::K2: {
      BNLearnerListener listener(this, __K2.approximationScheme());
      StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                     StructuralConstraintForbiddenArcs >
         gen_constraint;
      static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
         __constraint_MandatoryArcs;
      static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
         __constraint_ForbiddenArcs;

      GraphChangesGenerator4K2< decltype(gen_constraint) > op_set(
         gen_constraint);

      // if some mandatory arcs are incompatible with the order, use a DAG
      // constraint instead of a DiGraph constraint to avoid cycles
      const ArcSet& mandatory_arcs =
         static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
            .arcs();
      const Sequence< NodeId >& order = __K2.order();
      bool order_compatible = true;

      for (const auto& arc : mandatory_arcs) {
        if (order.pos(arc.tail()) >= order.pos(arc.head())) {
          order_compatible = false;
          break;
        }
      }

      if (order_compatible) {
        StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                       StructuralConstraintDiGraph >
           sel_constraint;
        static_cast< StructuralConstraintIndegree& >(sel_constraint) =
           __constraint_Indegree;

        GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                      decltype(op_set) >
           selector(*__score, sel_constraint, op_set);

        return __K2.learnStructure(selector, init_graph);
      } else {
        StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                       StructuralConstraintDAG >
           sel_constraint;
        static_cast< StructuralConstraintIndegree& >(sel_constraint) =
           __constraint_Indegree;

        GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                      decltype(op_set) >
           selector(*__score, sel_constraint, op_set);

        return __K2.learnStructure(selector, init_graph);
      }
    }

    // ========================================================================
    default:
      GUM_ERROR(OperationNotAllowed,
                "the learnDAG method has not been implemented for this "
                "learning algorithm");
  }
}

◆ __prepare_miic_3off2()

MixedGraph gum::learning::genericBNLearner::__prepare_miic_3off2 ( )
protected

prepares the initial graph for 3off2 or miic

Definition at line 601 of file genericBNLearner.cpp.

References __constraint_ForbiddenArcs, __constraint_MandatoryArcs, __miic_3off2, __mutual_info, __score_database, gum::learning::Miic::addConstraints(), gum::UndiGraph::addEdge(), gum::NodeGraphPart::addNodeWithId(), gum::learning::StructuralConstraintForbiddenArcs::arcs(), gum::learning::StructuralConstraintMandatoryArcs::arcs(), gum::learning::genericBNLearner::Database::databaseTable(), gum::HashTable< Key, Val, Alloc >::insert(), gum::learning::IDatabaseTable< T_DATA, ALLOC >::nbVariables(), and useNML().

Referenced by __learnDAG(), and learnMixedStructure().

{
  // Initialize the mixed graph to the fully connected graph
  MixedGraph mgraph;
  for (Size i = 0; i < __score_database.databaseTable().nbVariables(); ++i) {
    mgraph.addNodeWithId(i);
    for (Size j = 0; j < i; ++j) {
      mgraph.addEdge(j, i);
    }
  }

  // translating the constraints for 3off2 or miic
  HashTable< std::pair< NodeId, NodeId >, char > initial_marks;
  const ArcSet& mandatory_arcs = __constraint_MandatoryArcs.arcs();
  for (const auto& arc : mandatory_arcs) {
    initial_marks.insert({arc.tail(), arc.head()}, '>');
  }

  const ArcSet& forbidden_arcs = __constraint_ForbiddenArcs.arcs();
  for (const auto& arc : forbidden_arcs) {
    initial_marks.insert({arc.tail(), arc.head()}, '-');
  }
  __miic_3off2.addConstraints(initial_marks);
  // create the mutual entropy object
  if (__mutual_info == nullptr) { this->useNML(); }

  return mgraph;
}

◆ __readFile()

DatabaseTable<> gum::learning::genericBNLearner::__readFile ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols 
)
staticprotected

reads a file and returns a databaseVectInRam

Definition at line 429 of file genericBNLearner.cpp.

References __checkFileName(), database(), gum::learning::IDBInitializer< ALLOC >::fillDatabase(), gum::learning::DBTranslatorSet< ALLOC >::insertTranslator(), gum::learning::DatabaseTable< ALLOC >::reorder(), gum::learning::DatabaseTable< ALLOC >::setVariableNames(), and gum::learning::IDBInitializer< ALLOC >::variableNames().

{
  // get the extension of the file
  __checkFileName(filename);

  DBInitializerFromCSV<> initializer(filename);

  const auto& var_names = initializer.variableNames();
  const std::size_t nb_vars = var_names.size();

  DBTranslatorSet<> translator_set;
  DBTranslator4LabelizedVariable<> translator(missing_symbols);
  for (std::size_t i = 0; i < nb_vars; ++i) {
    translator_set.insertTranslator(translator, i);
  }

  DatabaseTable<> database(missing_symbols, translator_set);
  database.setVariableNames(initializer.variableNames());
  initializer.fillDatabase(database);

  database.reorder();

  return database;
}

◆ addForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const Arc &  arc)

Definition at line 264 of file genericBNLearner_inl.h.

References __constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::addArc().

Referenced by addForbiddenArc().

{
  __constraint_ForbiddenArcs.addArc(arc);
}

◆ addForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const NodeId  tail,
const NodeId  head 
)

Definition at line 274 of file genericBNLearner_inl.h.

References addForbiddenArc().

{
  addForbiddenArc(Arc(tail, head));
}

◆ addForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const std::string &  tail,
const std::string &  head 
)

Definition at line 286 of file genericBNLearner_inl.h.

References addForbiddenArc(), and gum::learning::genericBNLearner::Database::idFromName().

{
  addForbiddenArc(Arc(idFromName(tail), idFromName(head)));
}

◆ addMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const Arc &  arc)

Definition at line 303 of file genericBNLearner_inl.h.

References __constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::addArc().

Referenced by addMandatoryArc().

{
  __constraint_MandatoryArcs.addArc(arc);
}

◆ addMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const NodeId  tail,
const NodeId  head 
)

Definition at line 325 of file genericBNLearner_inl.h.

References addMandatoryArc().

{
  addMandatoryArc(Arc(tail, head));
}

◆ addMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const std::string &  tail,
const std::string &  head 
)

Definition at line 313 of file genericBNLearner_inl.h.

References addMandatoryArc(), and gum::learning::genericBNLearner::Database::idFromName().

{
  addMandatoryArc(Arc(idFromName(tail), idFromName(head)));
}

◆ checkScoreAprioriCompatibility()

std::string gum::learning::genericBNLearner::checkScoreAprioriCompatibility ( )

checks whether the current score and apriori are compatible

Returns
a non-empty string describing the incompatibility if the apriori is not compatible with the score; the empty string otherwise.

Definition at line 852 of file genericBNLearner.cpp.

References __apriori_weight, __getAprioriType(), __score_type, AIC, BD, BDeu, BIC, gum::learning::ScoreAIC< ALLOC >::isAprioriCompatible(), gum::learning::ScoreBIC< ALLOC >::isAprioriCompatible(), gum::learning::ScoreLog2Likelihood< ALLOC >::isAprioriCompatible(), gum::learning::ScoreBDeu< ALLOC >::isAprioriCompatible(), gum::learning::ScoreK2< ALLOC >::isAprioriCompatible(), gum::learning::ScoreBD< ALLOC >::isAprioriCompatible(), K2, and LOG2LIKELIHOOD.

Referenced by setAprioriWeight(), useAprioriBDeu(), useAprioriDirichlet(), useAprioriSmoothing(), useNoApriori(), useScoreAIC(), useScoreBD(), useScoreBDeu(), useScoreBIC(), useScoreK2(), and useScoreLog2Likelihood().

{
  const std::string& apriori = __getAprioriType();

  switch (__score_type) {
    case ScoreType::AIC:
      return ScoreAIC<>::isAprioriCompatible(apriori, __apriori_weight);

    case ScoreType::BD:
      return ScoreBD<>::isAprioriCompatible(apriori, __apriori_weight);

    case ScoreType::BDeu:
      return ScoreBDeu<>::isAprioriCompatible(apriori, __apriori_weight);

    case ScoreType::BIC:
      return ScoreBIC<>::isAprioriCompatible(apriori, __apriori_weight);

    case ScoreType::K2:
      return ScoreK2<>::isAprioriCompatible(apriori, __apriori_weight);

    case ScoreType::LOG2LIKELIHOOD:
      return ScoreLog2Likelihood<>::isAprioriCompatible(apriori,
                                                        __apriori_weight);

    default: return "genericBNLearner does not support yet this score";
  }
}

◆ chi2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)

Return the <statistic,pvalue> pair for the BNLearner.

Parameters
    id1	first variable
    id2	second variable
    knowing	list of observed variables
Returns
    a std::pair<double,double>

Definition at line 927 of file genericBNLearner.cpp.

References __apriori, __createApriori(), __score_database, databaseRanges(), gum::learning::genericBNLearner::Database::databaseTable(), gum::learning::IDatabaseTable< T_DATA, ALLOC >::handler(), and gum::learning::IndepTestChi2< ALLOC >::statistics().

Referenced by chi2().

{
  __createApriori();
  DBRowGeneratorParser<> parser(__score_database.databaseTable().handler(),
                                DBRowGeneratorSet<>());
  IndepTestChi2<> chi2score(parser, *__apriori, databaseRanges());

  return chi2score.statistics(id1, id2, knowing);
}

◆ chi2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)

Return the <statistic,pvalue> pair for the BNLearner.

Parameters
    name1	first variable
    name2	second variable
    knowing	list of observed variables
Returns
    a std::pair<double,double>

Definition at line 939 of file genericBNLearner.cpp.

References chi2(), and idFromName().

{
  std::vector< NodeId > knowingIds;
  std::transform(
     knowing.begin(),
     knowing.end(),
     std::back_inserter(knowingIds),
     [this](const std::string& c) -> NodeId { return this->idFromName(c); });
  return chi2(idFromName(name1), idFromName(name2), knowingIds);
}

◆ clearDatabaseRanges()

INLINE void gum::learning::genericBNLearner::clearDatabaseRanges ( )

reset the ranges to the one range corresponding to the whole database

Definition at line 448 of file genericBNLearner_inl.h.

References __ranges.

{ __ranges.clear(); }

◆ currentTime()

double gum::learning::genericBNLearner::currentTime ( ) const
inlinevirtual

get the current running time in second (double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 947 of file genericBNLearner.h.

References __current_algorithm, gum::ApproximationScheme::currentTime(), and GUM_ERROR.

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->currentTime();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ database()

INLINE const DatabaseTable & gum::learning::genericBNLearner::database ( ) const

returns the database used by the BNLearner

Definition at line 451 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::databaseTable().

Referenced by __readFile(), and gum::learning::readFile().

{
  return __score_database.databaseTable();
}

◆ databaseRanges()

INLINE const std::vector< std::pair< std::size_t, std::size_t > > & gum::learning::genericBNLearner::databaseRanges ( ) const

returns the current database rows' ranges used for learning

Returns
The method returns a vector of pairs [Xi,Yi) of row indices in the database. Learning is performed on this set of rows.
Warning
an empty set of ranges means the whole database.

Definition at line 443 of file genericBNLearner_inl.h.

References __ranges.

Referenced by chi2(), and logLikelihood().

{
  return __ranges;
}

◆ disableEpsilon()

void gum::learning::genericBNLearner::disableEpsilon ( )
inlinevirtual

Disable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 812 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableEpsilon().

{
  __K2.approximationScheme().disableEpsilon();
  __greedy_hill_climbing.disableEpsilon();
  __local_search_with_tabu_list.disableEpsilon();
  __Dag2BN.disableEpsilon();
};

◆ disableMaxIter()

void gum::learning::genericBNLearner::disableMaxIter ( )
inlinevirtual

Disable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 902 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMaxIter().

{
  __K2.approximationScheme().disableMaxIter();
  __greedy_hill_climbing.disableMaxIter();
  __local_search_with_tabu_list.disableMaxIter();
  __Dag2BN.disableMaxIter();
};

◆ disableMaxTime()

void gum::learning::genericBNLearner::disableMaxTime ( )
inlinevirtual

Disable stopping criterion on timeout.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 955 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMaxTime().

{
  __K2.approximationScheme().disableMaxTime();
  __greedy_hill_climbing.disableMaxTime();
  __local_search_with_tabu_list.disableMaxTime();
  __Dag2BN.disableMaxTime();
};

◆ disableMinEpsilonRate()

void gum::learning::genericBNLearner::disableMinEpsilonRate ( )
inlinevirtual

Disable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 858 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMinEpsilonRate().

{
  __K2.approximationScheme().disableMinEpsilonRate();
  __greedy_hill_climbing.disableMinEpsilonRate();
  __local_search_with_tabu_list.disableMinEpsilonRate();
  __Dag2BN.disableMinEpsilonRate();
};

◆ distributeProgress()

INLINE void gum::learning::genericBNLearner::distributeProgress ( const ApproximationScheme approximationScheme,
Size  pourcent,
double  error,
double  time 
)
inline

distribute signals

Definition at line 774 of file genericBNLearner.h.

References GUM_EMIT3, gum::IApproximationSchemeConfiguration::onProgress, and setCurrentApproximationScheme().

Referenced by gum::learning::BNLearnerListener::whenProgress().

{
  setCurrentApproximationScheme(approximationScheme);

  if (onProgress.hasListener()) GUM_EMIT3(onProgress, pourcent, error, time);
};
INLINE void setCurrentApproximationScheme(const ApproximationScheme *approximationScheme)
Signaler3< Size, double, double > onProgress
Progression, error and time.
#define GUM_EMIT3(signal, arg1, arg2, arg3)
Definition: signaler3.h:40

◆ distributeStop()

INLINE void gum::learning::genericBNLearner::distributeStop ( const ApproximationScheme approximationScheme,
std::string  message 
)
inline

distribute signals

Definition at line 784 of file genericBNLearner.h.

References GUM_EMIT1, gum::IApproximationSchemeConfiguration::onStop, and setCurrentApproximationScheme().

Referenced by gum::learning::BNLearnerListener::whenStop().

{
  setCurrentApproximationScheme(approximationScheme);

  if (onStop.hasListener()) GUM_EMIT1(onStop, message);
};
#define GUM_EMIT1(signal, arg1)
Definition: signaler1.h:40
Signaler1< std::string > onStop
Criteria message of the approximation scheme.
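The relay pattern used by distributeProgress and distributeStop can be sketched without aGrUM's Signaler/GUM_EMIT machinery: the learner re-emits events from whichever internal ApproximationScheme is currently running, so a single listener can follow any algorithm. (Self-contained illustration; `ProgressSignal` and `Relay` are hypothetical stand-ins.)

```cpp
#include <cstddef>
#include <functional>
#include <vector>

// Minimal stand-in for Signaler3<Size, double, double>.
struct ProgressSignal {
  std::vector<std::function<void(std::size_t, double, double)>> listeners;
  bool hasListener() const { return !listeners.empty(); }
  void emit(std::size_t percent, double error, double time) {
    for (auto& l : listeners) l(percent, error, time);
  }
};

// Minimal stand-in for the learner's signal relay.
struct Relay {
  ProgressSignal onProgress;
  void distributeProgress(std::size_t percent, double error, double time) {
    // only emit when someone is listening, as in the original code
    if (onProgress.hasListener()) onProgress.emit(percent, error, time);
  }
};
```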

◆ domainSizes()

INLINE const std::vector< std::size_t > & gum::learning::genericBNLearner::domainSizes ( ) const

returns the domain sizes of the variables in the database

Definition at line 437 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::domainSizes().

{
  return __score_database.domainSizes();
}
const std::vector< std::size_t > & domainSizes() const
returns the domain sizes of the variables

◆ enableEpsilon()

void gum::learning::genericBNLearner::enableEpsilon ( )
inlinevirtual

Enable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 820 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableEpsilon().

{
  __K2.approximationScheme().enableEpsilon();
  __greedy_hill_climbing.enableEpsilon();
  __local_search_with_tabu_list.enableEpsilon();
  __Dag2BN.enableEpsilon();
};

◆ enableMaxIter()

void gum::learning::genericBNLearner::enableMaxIter ( )
inlinevirtual

Enable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 909 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMaxIter().

{
  __K2.approximationScheme().enableMaxIter();
  __greedy_hill_climbing.enableMaxIter();
  __local_search_with_tabu_list.enableMaxIter();
  __Dag2BN.enableMaxIter();
};

◆ enableMaxTime()

void gum::learning::genericBNLearner::enableMaxTime ( )
inlinevirtual

Enable stopping criterion on timeout. If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound	if timeout <= 0.0 (timeout is a time in seconds, as a double).

Implements gum::IApproximationSchemeConfiguration.

Definition at line 961 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMaxTime().

{
  __K2.approximationScheme().enableMaxTime();
  __greedy_hill_climbing.enableMaxTime();
  __local_search_with_tabu_list.enableMaxTime();
  __Dag2BN.enableMaxTime();
};
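The enable/disable methods all follow the same forwarding pattern: genericBNLearner propagates each call to every internal search algorithm, so whichever one is later selected obeys the same stopping criteria. A self-contained sketch (the `Scheme`/`Learner` names below are placeholders, not the aGrUM classes):

```cpp
// Stand-in for one ApproximationScheme with a timeout criterion flag.
struct Scheme {
  bool maxTimeEnabled = true;
  void enableMaxTime()  { maxTimeEnabled = true; }
  void disableMaxTime() { maxTimeEnabled = false; }
};

// Stand-in for genericBNLearner: forwards to every algorithm it owns.
struct Learner {
  Scheme k2, greedy, tabu, em;  // like __K2, __greedy_hill_climbing, ...
  void enableMaxTime() {
    k2.enableMaxTime();
    greedy.enableMaxTime();
    tabu.enableMaxTime();
    em.enableMaxTime();
  }
  void disableMaxTime() {
    k2.disableMaxTime();
    greedy.disableMaxTime();
    tabu.disableMaxTime();
    em.disableMaxTime();
  }
};
```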

◆ enableMinEpsilonRate()

void gum::learning::genericBNLearner::enableMinEpsilonRate ( )
inlinevirtual

Enable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 865 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMinEpsilonRate().

{
  __K2.approximationScheme().enableMinEpsilonRate();
  __greedy_hill_climbing.enableMinEpsilonRate();
  __local_search_with_tabu_list.enableMinEpsilonRate();
  __Dag2BN.enableMinEpsilonRate();
};

◆ epsilon()

double gum::learning::genericBNLearner::epsilon ( ) const
inlinevirtual

Get the value of epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 804 of file genericBNLearner.h.

References __current_algorithm, gum::ApproximationScheme::epsilon(), and GUM_ERROR.

Referenced by useEM().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->epsilon();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};
const ApproximationScheme * __current_algorithm
#define GUM_ERROR(type, msg)
Definition: exceptions.h:52
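The same null-guard appears in epsilon(), history(), maxIter(), maxTime() and the isEnabled* accessors: they all dereference __current_algorithm and raise an error when no learning algorithm has been run yet. A self-contained sketch of the pattern (plain C++ exception instead of GUM_ERROR; `Algo`/`LearnerGuard` are hypothetical):

```cpp
#include <stdexcept>

struct Algo {
  double eps;
  double epsilon() const { return eps; }
};

struct LearnerGuard {
  const Algo* current = nullptr;  // stand-in for __current_algorithm
  double epsilon() const {
    // forward to the currently selected algorithm, or fail loudly
    if (current != nullptr) return current->epsilon();
    throw std::logic_error("No chosen algorithm for learning");
  }
};
```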

◆ eraseForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const Arc arc)

Definition at line 269 of file genericBNLearner_inl.h.

References __constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::eraseArc().

Referenced by eraseForbiddenArc().

{
  __constraint_ForbiddenArcs.eraseArc(arc);
}
void eraseArc(const Arc &arc)
remove a forbidden arc
StructuralConstraintForbiddenArcs __constraint_ForbiddenArcs
the constraint on forbidden arcs

◆ eraseForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const NodeId  tail,
const NodeId  head 
)

Definition at line 280 of file genericBNLearner_inl.h.

References eraseForbiddenArc().

{
  eraseForbiddenArc(Arc(tail, head));
}

◆ eraseForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const std::string &  tail,
const std::string &  head 
)

Definition at line 292 of file genericBNLearner_inl.h.

References eraseForbiddenArc(), and gum::learning::genericBNLearner::Database::idFromName().

{
  eraseForbiddenArc(Arc(idFromName(tail), idFromName(head)));
}
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name
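The three overloads layer onto one another: arcs are stored as (tail, head) id pairs, and the string overload resolves names to ids before delegating, exactly as eraseForbiddenArc(tail, head) builds an Arc. A self-contained sketch (the `ForbiddenArcs` struct is a hypothetical stand-in, not the aGrUM constraint class):

```cpp
#include <set>
#include <stdexcept>
#include <string>
#include <unordered_map>
#include <utility>

struct ForbiddenArcs {
  std::unordered_map<std::string, int> idOfName;  // stand-in for idFromName()
  std::set<std::pair<int, int>> arcs;

  int idFromName(const std::string& name) const {
    auto it = idOfName.find(name);
    if (it == idOfName.end())
      throw std::out_of_range("unknown variable: " + name);
    return it->second;
  }
  void addForbiddenArc(int tail, int head) { arcs.insert({tail, head}); }
  void eraseForbiddenArc(int tail, int head) { arcs.erase({tail, head}); }
  // string overload: resolve names to ids, then delegate
  void eraseForbiddenArc(const std::string& tail, const std::string& head) {
    eraseForbiddenArc(idFromName(tail), idFromName(head));
  }
};
```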

◆ eraseMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const Arc arc)

Definition at line 308 of file genericBNLearner_inl.h.

References __constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::eraseArc().

Referenced by eraseMandatoryArc().

{
  __constraint_MandatoryArcs.eraseArc(arc);
}
StructuralConstraintMandatoryArcs __constraint_MandatoryArcs
the constraint on mandatory arcs
void eraseArc(const Arc &arc)
remove a mandatory arc

◆ eraseMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const NodeId  tail,
const NodeId  head 
)

Definition at line 331 of file genericBNLearner_inl.h.

References eraseMandatoryArc().

{
  eraseMandatoryArc(Arc(tail, head));
}

◆ eraseMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const std::string &  tail,
const std::string &  head 
)

Definition at line 319 of file genericBNLearner_inl.h.

References eraseMandatoryArc(), and gum::learning::genericBNLearner::Database::idFromName().

{
  eraseMandatoryArc(Arc(idFromName(tail), idFromName(head)));
}

◆ hasMissingValues()

INLINE bool gum::learning::genericBNLearner::hasMissingValues ( ) const

returns true if the learner's database has missing values

Definition at line 254 of file genericBNLearner_inl.h.

References __score_database, gum::learning::genericBNLearner::Database::databaseTable(), and gum::learning::IDatabaseTable< T_DATA, ALLOC >::hasMissingValues().

{
  return __score_database.databaseTable().hasMissingValues();
}
bool hasMissingValues() const
indicates whether the database contains some missing values
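What hasMissingValues() reports can be sketched directly: whether any cell of the database matches one of the declared missing-value symbols (the missing_symbols vector passed to the genericBNLearner constructor). A self-contained illustration, not the aGrUM DatabaseTable logic:

```cpp
#include <string>
#include <vector>

// True iff some cell equals one of the declared missing-value symbols.
bool hasMissingValues(const std::vector<std::vector<std::string>>& rows,
                      const std::vector<std::string>& missing_symbols) {
  for (const auto& row : rows)
    for (const auto& cell : row)
      for (const auto& symbol : missing_symbols)
        if (cell == symbol) return true;
  return false;
}
```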

◆ history()

const std::vector< double >& gum::learning::genericBNLearner::history ( ) const
inlinevirtual
Exceptions
OperationNotAllowed	if the scheme has not been performed or verbosity=false

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1031 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::history().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->history();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};
const std::vector< double > & history() const
Returns the scheme history.

◆ idFromName()

INLINE NodeId gum::learning::genericBNLearner::idFromName ( const std::string &  var_name) const

returns the node id corresponding to a variable name

Exceptions
MissingVariableInDatabase	if a variable of the BN is not found in the database.

Definition at line 111 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::idFromName().

Referenced by chi2(), and logLikelihood().

{
  return __score_database.idFromName(var_name);
}

◆ isEnabledEpsilon()

bool gum::learning::genericBNLearner::isEnabledEpsilon ( ) const
inlinevirtual
Returns
true if stopping criterion on epsilon is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 829 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledEpsilon().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledEpsilon();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ isEnabledMaxIter()

bool gum::learning::genericBNLearner::isEnabledMaxIter ( ) const
inlinevirtual
Returns
true if stopping criterion on max iterations is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 917 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMaxIter().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledMaxIter();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ isEnabledMaxTime()

bool gum::learning::genericBNLearner::isEnabledMaxTime ( ) const
inlinevirtual
Returns
true if stopping criterion on timeout is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 969 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMaxTime().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledMaxTime();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ isEnabledMinEpsilonRate()

bool gum::learning::genericBNLearner::isEnabledMinEpsilonRate ( ) const
inlinevirtual
Returns
true if stopping criterion on epsilon rate is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 873 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMinEpsilonRate().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledMinEpsilonRate();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ latentVariables()

INLINE const std::vector< Arc > gum::learning::genericBNLearner::latentVariables ( ) const

get the list of arcs hiding latent variables

Exceptions
OperationNotAllowed	when 3off2 is not the selected algorithm

Definition at line 214 of file genericBNLearner_inl.h.

References __miic_3off2, __selected_algo, GUM_ERROR, gum::learning::Miic::latentVariables(), and MIIC_THREE_OFF_TWO.

{
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed,
              "You must use the 3off2 algorithm before selecting "
              << "the latentVariables method");
  }
  return __miic_3off2.latentVariables();
}
AlgoType __selected_algo
the selected learning algorithm
Miic __miic_3off2
the 3off2 algorithm
const std::vector< Arc > latentVariables() const
get the list of arcs hiding latent variables
Definition: Miic.cpp:1046

◆ learnDAG()

DAG gum::learning::genericBNLearner::learnDAG ( )

learn a structure from a file (must have read the db before)

Definition at line 645 of file genericBNLearner.cpp.

References __createApriori(), __createScore(), and __learnDAG().

{
  // create the score and the apriori
  __createApriori();
  __createScore();

  return __learnDAG();
}
void __createScore()
create the score used for learning
DAG __learnDAG()
returns the DAG learnt
void __createApriori()
create the apriori used for learning

◆ learnMixedStructure()

MixedGraph gum::learning::genericBNLearner::learnMixedStructure ( )

learn a partial structure from a file (must have read the db before and must have selected miic or 3off2)

Definition at line 629 of file genericBNLearner.cpp.

References __miic_3off2, __mutual_info, __prepare_miic_3off2(), __score_database, __selected_algo, gum::learning::genericBNLearner::Database::databaseTable(), GUM_ERROR, gum::learning::IDatabaseTable< T_DATA, ALLOC >::hasMissingValues(), gum::learning::Miic::learnMixedStructure(), and MIIC_THREE_OFF_TWO.

{
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed, "Must be using the miic/3off2 algorithm");
  }
  // check that the database does not contain any missing value
  if (__score_database.databaseTable().hasMissingValues()) {
    GUM_ERROR(MissingValueInDatabase,
              "For the moment, the BNLearner is unable to learn "
              << "structures with missing values in databases");
  }
  BNLearnerListener listener(this, __miic_3off2);
  // create the mixed graph
  MixedGraph mgraph = this->__prepare_miic_3off2();
  return __miic_3off2.learnMixedStructure(*__mutual_info, mgraph);
}
MixedGraph learnMixedStructure(CorrectedMutualInformation<> &I, MixedGraph graph)
learns the structure of an Essential Graph
Definition: Miic.cpp:110
MixedGraph __prepare_miic_3off2()
prepares the initial graph for 3off2 or miic
CorrectedMutualInformation * __mutual_info
the selected correction for 3off2 and miic

◆ logLikelihood() [1/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< NodeId > &  vars,
const std::vector< NodeId > &  knowing = {} 
)

Return the log-likelihood of vars in the database, conditioned on knowing, for the BNLearner.

Parameters
vars	a vector of NodeIds
knowing	an optional vector of conditioning NodeIds
Returns
the log-likelihood, as a double

Definition at line 951 of file genericBNLearner.cpp.

References __apriori, __createApriori(), __score_database, databaseRanges(), gum::learning::genericBNLearner::Database::databaseTable(), gum::learning::IDatabaseTable< T_DATA, ALLOC >::handler(), and gum::learning::ScoreLog2Likelihood< ALLOC >::score().

Referenced by logLikelihood().

{
  __createApriori();
  DBRowGeneratorParser<> parser(__score_database.databaseTable().handler(),
                                DBRowGeneratorSet<>());
  ScoreLog2Likelihood<> ll2score(parser, *__apriori, databaseRanges());

  std::vector< NodeId > total(vars);
  total.insert(total.end(), knowing.begin(), knowing.end());
  double LLtotal = ll2score.score(IdSet<>(total, false, true));
  if (knowing.size() == (Size)0) {
    return LLtotal;
  } else {
    double LLknw = ll2score.score(IdSet<>(knowing, false, true));
    return LLtotal - LLknw;
  }
}
ScoreLog2Likelihood
the class for computing Log2-likelihood scores
Apriori * __apriori
the apriori used
std::size_t Size
In aGrUM, hashed values are unsigned long int.
Definition: types.h:45
iterator handler() const
returns a new unsafe handler pointing to the 1st record of the database

◆ logLikelihood() [2/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< std::string > &  vars,
const std::vector< std::string > &  knowing = {} 
)

Return the log-likelihood of vars in the database, conditioned on knowing, for the BNLearner.

Parameters
vars	a vector of variable names
knowing	an optional vector of conditioning variable names
Returns
the log-likelihood, as a double

Definition at line 971 of file genericBNLearner.cpp.

References idFromName(), and logLikelihood().

{
  std::vector< NodeId > ids;
  std::vector< NodeId > knowingIds;

  auto mapper = [this](const std::string& c) -> NodeId {
    return this->idFromName(c);
  };

  std::transform(vars.begin(), vars.end(), std::back_inserter(ids), mapper);
  std::transform(
     knowing.begin(), knowing.end(), std::back_inserter(knowingIds), mapper);

  return logLikelihood(ids, knowingIds);
}
Size NodeId
Type for node ids.
Definition: graphElements.h:97
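The identity the implementation relies on is LL(vars | knowing) = LL(vars + knowing) - LL(knowing). It can be checked with empirical counts over a toy dataset (self-contained sketch; aGrUM instead uses ScoreLog2Likelihood over IdSets, and the function names here are hypothetical):

```cpp
#include <cmath>
#include <map>
#include <vector>

// Empirical log2-likelihood of the projection of the data onto vars.
double logLikelihood(const std::vector<std::vector<int>>& rows,
                     const std::vector<int>& vars) {
  std::map<std::vector<int>, int> counts;
  for (const auto& row : rows) {
    std::vector<int> key;
    for (int v : vars) key.push_back(row[v]);
    ++counts[key];
  }
  double ll = 0.0;
  const double n = double(rows.size());
  for (const auto& kv : counts) ll += kv.second * std::log2(kv.second / n);
  return ll;
}

// Conditional version: LL(total) - LL(knowing), as in the snippet above.
double conditionalLL(const std::vector<std::vector<int>>& rows,
                     std::vector<int> vars, const std::vector<int>& knowing) {
  vars.insert(vars.end(), knowing.begin(), knowing.end());
  return logLikelihood(rows, vars) - logLikelihood(rows, knowing);
}
```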

◆ maxIter()

Size gum::learning::genericBNLearner::maxIter ( ) const
inlinevirtual
Returns
the criterion on number of iterations

Implements gum::IApproximationSchemeConfiguration.

Definition at line 894 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::maxIter().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->maxIter();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ maxTime()

double gum::learning::genericBNLearner::maxTime ( ) const
inlinevirtual

returns the timeout (in seconds)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 939 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::maxTime().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->maxTime();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ messageApproximationScheme()

INLINE std::string gum::IApproximationSchemeConfiguration::messageApproximationScheme ( ) const
inherited

Returns the approximation scheme message.

Returns
Returns the approximation scheme message.

Definition at line 38 of file IApproximationSchemeConfiguration_inl.h.

References gum::IApproximationSchemeConfiguration::Continue, gum::IApproximationSchemeConfiguration::Epsilon, gum::IApproximationSchemeConfiguration::epsilon(), gum::IApproximationSchemeConfiguration::Limit, gum::IApproximationSchemeConfiguration::maxIter(), gum::IApproximationSchemeConfiguration::maxTime(), gum::IApproximationSchemeConfiguration::minEpsilonRate(), gum::IApproximationSchemeConfiguration::Rate, gum::IApproximationSchemeConfiguration::stateApproximationScheme(), gum::IApproximationSchemeConfiguration::Stopped, gum::IApproximationSchemeConfiguration::TimeLimit, and gum::IApproximationSchemeConfiguration::Undefined.

Referenced by gum::ApproximationScheme::_stopScheme(), gum::ApproximationScheme::continueApproximationScheme(), and gum::credal::InferenceEngine< GUM_SCALAR >::getApproximationSchemeMsg().

{
  std::stringstream s;

  switch (stateApproximationScheme()) {
    case ApproximationSchemeSTATE::Continue: s << "in progress"; break;

    case ApproximationSchemeSTATE::Epsilon:
      s << "stopped with epsilon=" << epsilon();
      break;

    case ApproximationSchemeSTATE::Rate:
      s << "stopped with rate=" << minEpsilonRate();
      break;

    case ApproximationSchemeSTATE::Limit:
      s << "stopped with max iteration=" << maxIter();
      break;

    case ApproximationSchemeSTATE::TimeLimit:
      s << "stopped with timeout=" << maxTime();
      break;

    case ApproximationSchemeSTATE::Stopped: s << "stopped on request"; break;

    case ApproximationSchemeSTATE::Undefined: s << "undefined state"; break;
  };

  return s.str();
}
virtual double epsilon() const =0
Returns the value of epsilon.
virtual ApproximationSchemeSTATE stateApproximationScheme() const =0
Returns the approximation scheme state.
virtual double maxTime() const =0
Returns the timeout (in seconds).
virtual Size maxIter() const =0
Returns the criterion on number of iterations.
virtual double minEpsilonRate() const =0
Returns the value of the minimal epsilon rate.

◆ minEpsilonRate()

double gum::learning::genericBNLearner::minEpsilonRate ( ) const
inlinevirtual

Get the value of the minimal epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 850 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::minEpsilonRate().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->minEpsilonRate();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ nameFromId()

INLINE const std::string & gum::learning::genericBNLearner::nameFromId ( NodeId  id) const

returns the variable name corresponding to a given node id

Definition at line 116 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::nameFromId().

{
  return __score_database.nameFromId(id);
}
const std::string & nameFromId(NodeId id) const
returns the variable name corresponding to a given node id

◆ names()

INLINE const std::vector< std::string > & gum::learning::genericBNLearner::names ( ) const

returns the names of the variables in the database

Definition at line 431 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::names().

{
  return __score_database.names();
}

◆ nbCols()

INLINE Size gum::learning::genericBNLearner::nbCols ( ) const
Returns
the number of cols in the database

Definition at line 455 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::domainSizes().

{
  return __score_database.domainSizes().size();
}

◆ nbrIterations()

Size gum::learning::genericBNLearner::nbrIterations ( ) const
inlinevirtual
Exceptions
OperationNotAllowed	if the scheme has not been performed

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1023 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::nbrIterations().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->nbrIterations();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};
Size nbrIterations() const
Returns the number of iterations.

◆ nbRows()

INLINE Size gum::learning::genericBNLearner::nbRows ( ) const
Returns
the number of rows in the database

Definition at line 459 of file genericBNLearner_inl.h.

References __score_database, gum::learning::genericBNLearner::Database::databaseTable(), and gum::learning::IDatabaseTable< T_DATA, ALLOC >::size().

{
  return __score_database.databaseTable().size();
}
std::size_t size() const noexcept
returns the number of records (rows) in the database

◆ operator=() [1/2]

genericBNLearner & gum::learning::genericBNLearner::operator= ( const genericBNLearner from)

copy operator

Definition at line 270 of file genericBNLearner.cpp.

References __3off2_kmode, __apriori, __apriori_database, __apriori_dbname, __apriori_type, __apriori_weight, __constraint_ForbiddenArcs, __constraint_Indegree, __constraint_MandatoryArcs, __constraint_SliceOrder, __constraint_TabuList, __current_algorithm, __EMepsilon, __greedy_hill_climbing, __initial_dag, __K2, __local_search_with_tabu_list, __miic_3off2, __mutual_info, __param_estimator_type, __ranges, __score, __score_database, __score_type, and __selected_algo.

{
  if (this != &from) {
    if (__score) {
      delete __score;
      __score = nullptr;
    }

    if (__apriori) {
      delete __apriori;
      __apriori = nullptr;
    }

    if (__apriori_database) {
      delete __apriori_database;
      __apriori_database = nullptr;
    }

    if (__mutual_info) {
      delete __mutual_info;
      __mutual_info = nullptr;
    }

    __score_type = from.__score_type;
    __param_estimator_type = from.__param_estimator_type;
    __EMepsilon = from.__EMepsilon;
    __apriori_type = from.__apriori_type;
    __apriori_weight = from.__apriori_weight;
    __constraint_SliceOrder = from.__constraint_SliceOrder;
    __constraint_Indegree = from.__constraint_Indegree;
    __constraint_TabuList = from.__constraint_TabuList;
    __constraint_ForbiddenArcs = from.__constraint_ForbiddenArcs;
    __constraint_MandatoryArcs = from.__constraint_MandatoryArcs;
    __selected_algo = from.__selected_algo;
    __K2 = from.__K2;
    __miic_3off2 = from.__miic_3off2;
    __3off2_kmode = from.__3off2_kmode;
    __greedy_hill_climbing = from.__greedy_hill_climbing;
    __local_search_with_tabu_list = from.__local_search_with_tabu_list;
    __score_database = from.__score_database;
    __ranges = from.__ranges;
    __apriori_dbname = from.__apriori_dbname;
    __initial_dag = from.__initial_dag;
    __current_algorithm = nullptr;
  }

  return *this;
}

◆ operator=() [2/2]

genericBNLearner & gum::learning::genericBNLearner::operator= ( genericBNLearner &&  from)

move operator

Definition at line 318 of file genericBNLearner.cpp.

References __3off2_kmode, __apriori, __apriori_database, __apriori_dbname, __apriori_type, __apriori_weight, __constraint_ForbiddenArcs, __constraint_Indegree, __constraint_MandatoryArcs, __constraint_SliceOrder, __constraint_TabuList, __current_algorithm, __EMepsilon, __greedy_hill_climbing, __initial_dag, __K2, __local_search_with_tabu_list, __miic_3off2, __mutual_info, __param_estimator_type, __ranges, __score, __score_database, __score_type, and __selected_algo.

  {
    if (this != &from) {
      if (__score) {
        delete __score;
        __score = nullptr;
      }

      if (__apriori) {
        delete __apriori;
        __apriori = nullptr;
      }

      if (__apriori_database) {
        delete __apriori_database;
        __apriori_database = nullptr;
      }

      if (__mutual_info) {
        delete __mutual_info;
        __mutual_info = nullptr;
      }

      __score_type = from.__score_type;
      __param_estimator_type = from.__param_estimator_type;
      __EMepsilon = from.__EMepsilon;
      __apriori_type = from.__apriori_type;
      __apriori_weight = from.__apriori_weight;
      __constraint_SliceOrder = std::move(from.__constraint_SliceOrder);
      __constraint_Indegree = std::move(from.__constraint_Indegree);
      __constraint_TabuList = std::move(from.__constraint_TabuList);
      __constraint_ForbiddenArcs = std::move(from.__constraint_ForbiddenArcs);
      __constraint_MandatoryArcs = std::move(from.__constraint_MandatoryArcs);
      __selected_algo = from.__selected_algo;
      __K2 = from.__K2;
      __miic_3off2 = std::move(from.__miic_3off2);
      __3off2_kmode = from.__3off2_kmode;
      __greedy_hill_climbing = std::move(from.__greedy_hill_climbing);
      __local_search_with_tabu_list =
         std::move(from.__local_search_with_tabu_list);
      __score_database = std::move(from.__score_database);
      __ranges = std::move(from.__ranges);
      __apriori_dbname = std::move(from.__apriori_dbname);
      __initial_dag = std::move(from.__initial_dag);
      __current_algorithm = nullptr;
    }

    return *this;
  }

◆ periodSize()

Size gum::learning::genericBNLearner::periodSize ( ) const
inlinevirtual

returns the number of samples between two checks of the stopping criteria

Exceptions
OutOfLowerBound  if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 987 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::periodSize().

  {
    if (__current_algorithm != nullptr)
      return __current_algorithm->periodSize();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ setAprioriWeight()

INLINE void gum::learning::genericBNLearner::setAprioriWeight ( double  weight)

sets the apriori weight

Definition at line 356 of file genericBNLearner_inl.h.

References __apriori_weight, checkScoreAprioriCompatibility(), and GUM_ERROR.

Referenced by useAprioriBDeu(), useAprioriDirichlet(), and useAprioriSmoothing().

  {
    if (weight < 0) {
      GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
    }

    __apriori_weight = weight;
    checkScoreAprioriCompatibility();
  }

◆ setCurrentApproximationScheme()

INLINE void gum::learning::genericBNLearner::setCurrentApproximationScheme ( const ApproximationScheme approximationScheme)
inline

distribute signals

Definition at line 768 of file genericBNLearner.h.

References __current_algorithm.

Referenced by gum::learning::BNLearnerListener::BNLearnerListener(), distributeProgress(), and distributeStop().

  {
    __current_algorithm = approximationScheme;
  }

◆ setDatabaseWeight()

INLINE void gum::learning::genericBNLearner::setDatabaseWeight ( const double  new_weight)

assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight


Definition at line 121 of file genericBNLearner_inl.h.

References __score_database, and gum::learning::genericBNLearner::Database::setDatabaseWeight().

  {
    __score_database.setDatabaseWeight(new_weight);
  }
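As a standalone illustration of the rule above (not aGrUM code; the helper name perRowWeight is hypothetical), each row simply receives new_weight divided by the number of rows, so that the row weights sum to new_weight:

```cpp
#include <cstddef>

// Hypothetical helper (not part of aGrUM) illustrating the weighting
// rule of setDatabaseWeight: every row receives new_weight / nb_rows,
// so that the sum of the row weights equals new_weight.
double perRowWeight(double new_weight, std::size_t nb_rows) {
  return new_weight / static_cast<double>(nb_rows);
}
```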

◆ setEpsilon()

void gum::learning::genericBNLearner::setEpsilon ( double  eps)
inlinevirtual

Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)| If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound  if eps < 0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 796 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setEpsilon().

  {
    __K2.approximationScheme().setEpsilon(eps);
    __greedy_hill_climbing.approximationScheme().setEpsilon(eps);
    __local_search_with_tabu_list.approximationScheme().setEpsilon(eps);
    __Dag2BN.setEpsilon(eps);
  };

◆ setForbiddenArcs()

INLINE void gum::learning::genericBNLearner::setForbiddenArcs ( const ArcSet set)

assign a set of forbidden arcs

Definition at line 259 of file genericBNLearner_inl.h.

References __constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::setArcs().

  {
    __constraint_ForbiddenArcs.setArcs(set);
  }

◆ setInitialDAG()

INLINE void gum::learning::genericBNLearner::setInitialDAG ( const DAG dag)

sets an initial DAG structure

Definition at line 126 of file genericBNLearner_inl.h.

References __initial_dag.

  {
    __initial_dag = dag;
  }

◆ setMandatoryArcs()

INLINE void gum::learning::genericBNLearner::setMandatoryArcs ( const ArcSet set)

assign a set of mandatory arcs

Definition at line 298 of file genericBNLearner_inl.h.

References __constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::setArcs().

  {
    __constraint_MandatoryArcs.setArcs(set);
  }

◆ setMaxIndegree()

INLINE void gum::learning::genericBNLearner::setMaxIndegree ( Size  max_indegree)

sets the max indegree

Definition at line 167 of file genericBNLearner_inl.h.

References __constraint_Indegree, and gum::learning::StructuralConstraintIndegree::setMaxIndegree().

  {
    __constraint_Indegree.setMaxIndegree(max_indegree);
  }

◆ setMaxIter()

void gum::learning::genericBNLearner::setMaxIter ( Size  max)
inlinevirtual

stopping criterion on the number of iterations. If the criterion was disabled it will be enabled.

Parameters
max  The maximum number of iterations
Exceptions
OutOfLowerBound  if max <= 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 886 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMaxIter().

  {
    __K2.approximationScheme().setMaxIter(max);
    __greedy_hill_climbing.approximationScheme().setMaxIter(max);
    __local_search_with_tabu_list.approximationScheme().setMaxIter(max);
    __Dag2BN.setMaxIter(max);
  };

◆ setMaxTime()

void gum::learning::genericBNLearner::setMaxTime ( double  timeout)
inlinevirtual

stopping criterion on timeout. If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound  if timeout <= 0.0; timeout is a duration in seconds (double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 931 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMaxTime().

  {
    __K2.approximationScheme().setMaxTime(timeout);
    __greedy_hill_climbing.approximationScheme().setMaxTime(timeout);
    __local_search_with_tabu_list.approximationScheme().setMaxTime(timeout);
    __Dag2BN.setMaxTime(timeout);
  }

◆ setMinEpsilonRate()

void gum::learning::genericBNLearner::setMinEpsilonRate ( double  rate)
inlinevirtual

Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|) If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound  if rate < 0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 842 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMinEpsilonRate().

  {
    __K2.approximationScheme().setMinEpsilonRate(rate);
    __greedy_hill_climbing.approximationScheme().setMinEpsilonRate(rate);
    __local_search_with_tabu_list.approximationScheme().setMinEpsilonRate(rate);
    __Dag2BN.setMinEpsilonRate(rate);
  };

◆ setPeriodSize()

void gum::learning::genericBNLearner::setPeriodSize ( Size  p)
inlinevirtual

sets the number of samples between two checks of the stopping criteria

Exceptions
OutOfLowerBound  if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 980 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setPeriodSize().

  {
    __K2.approximationScheme().setPeriodSize(p);
    __greedy_hill_climbing.approximationScheme().setPeriodSize(p);
    __local_search_with_tabu_list.approximationScheme().setPeriodSize(p);
    __Dag2BN.setPeriodSize(p);
  };

◆ setSliceOrder() [1/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const NodeProperty< NodeId > &  slice_order)

sets a partial order on the nodes

Parameters
slice_order  a NodeProperty giving the rank (priority) of each node in the partial order

Definition at line 338 of file genericBNLearner_inl.h.

References __constraint_SliceOrder.

Referenced by setSliceOrder().

  {
    __constraint_SliceOrder = StructuralConstraintSliceOrder(slice_order);
  }

◆ setSliceOrder() [2/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const std::vector< std::vector< std::string > > &  slices)

sets a partial order on the nodes

Parameters
slices  the list of lists of variable names

Definition at line 342 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::idFromName(), gum::HashTable< Key, Val, Alloc >::insert(), and setSliceOrder().

  {
    NodeProperty< NodeId > slice_order;
    NodeId rank = 0;
    for (const auto& slice : slices) {
      for (const auto& name : slice) {
        slice_order.insert(idFromName(name), rank);
      }
      rank++;
    }
    setSliceOrder(slice_order);
  }
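The rank assignment performed by this overload can be sketched standalone (not aGrUM code; a std::map over names stands in for aGrUM's NodeProperty over node ids, and idFromName is replaced by the name itself):

```cpp
#include <map>
#include <string>
#include <vector>

// Standalone sketch (not part of aGrUM) of the rank assignment done by
// setSliceOrder: every variable of the first slice gets rank 0, every
// variable of the second slice gets rank 1, and so on.
std::map<std::string, int>
sliceRanks(const std::vector<std::vector<std::string>>& slices) {
  std::map<std::string, int> ranks;
  int rank = 0;
  for (const auto& slice : slices) {
    for (const auto& name : slice) { ranks[name] = rank; }
    ++rank;
  }
  return ranks;
}
```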

◆ setVerbosity()

void gum::learning::genericBNLearner::setVerbosity ( bool  v)
inlinevirtual

verbosity

Implements gum::IApproximationSchemeConfiguration.

Definition at line 997 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setVerbosity().

  {
    __K2.approximationScheme().setVerbosity(v);
    __greedy_hill_climbing.approximationScheme().setVerbosity(v);
    __local_search_with_tabu_list.approximationScheme().setVerbosity(v);
    __Dag2BN.setVerbosity(v);
  };

◆ stateApproximationScheme()

ApproximationSchemeSTATE gum::learning::genericBNLearner::stateApproximationScheme ( ) const
inlinevirtual

returns the current approximation scheme state

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1015 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::stateApproximationScheme().

  {
    if (__current_algorithm != nullptr)
      return __current_algorithm->stateApproximationScheme();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ use3off2()

INLINE void gum::learning::genericBNLearner::use3off2 ( )

indicate that we wish to use 3off2

Definition at line 172 of file genericBNLearner_inl.h.

References __miic_3off2, __selected_algo, MIIC_THREE_OFF_TWO, and gum::learning::Miic::set3off2Behaviour().

  {
    __selected_algo = AlgoType::MIIC_THREE_OFF_TWO;
    __miic_3off2.set3off2Behaviour();
  }

◆ useAprioriBDeu()

INLINE void gum::learning::genericBNLearner::useAprioriBDeu ( double  weight = 1)

use the BDeu apriori

The BDeu apriori adds weight/(ri * qi) to all the cells of the counting tables of a variable with domain size ri and qi parent configurations. In other words, it is equivalent to adding weight rows to the database with equally probable values.

Definition at line 399 of file genericBNLearner_inl.h.

References __apriori_type, BDEU, checkScoreAprioriCompatibility(), GUM_ERROR, and setAprioriWeight().

  {
    if (weight < 0) {
      GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
    }

    __apriori_type = AprioriType::BDEU;
    setAprioriWeight(weight);

    checkScoreAprioriCompatibility();
  }
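The pseudo-count arithmetic described above can be sketched standalone (not aGrUM code; the helper name bdeuPseudoCount is hypothetical): a counting table with r * q cells receives weight/(r * q) per cell, so the total added mass is exactly weight.

```cpp
#include <cstddef>

// Hypothetical sketch (not part of aGrUM) of the BDeu pseudo-count:
// a counting table for a variable with domain size r and q parent
// configurations has r * q cells, each receiving weight / (r * q),
// so the mass added over the whole table equals `weight`.
double bdeuPseudoCount(double weight, std::size_t r, std::size_t q) {
  return weight / (static_cast<double>(r) * static_cast<double>(q));
}
```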

◆ useAprioriDirichlet()

INLINE void gum::learning::genericBNLearner::useAprioriDirichlet ( const std::string &  filename,
double  weight = 1 
)

use the Dirichlet apriori

Definition at line 384 of file genericBNLearner_inl.h.

References __apriori_dbname, __apriori_type, checkScoreAprioriCompatibility(), DIRICHLET_FROM_DATABASE, GUM_ERROR, and setAprioriWeight().

  {
    if (weight < 0) {
      GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
    }

    __apriori_dbname = filename;
    __apriori_type = AprioriType::DIRICHLET_FROM_DATABASE;
    setAprioriWeight(weight);

    checkScoreAprioriCompatibility();
  }

◆ useAprioriSmoothing()

INLINE void gum::learning::genericBNLearner::useAprioriSmoothing ( double  weight = 1)

use the apriori smoothing

Parameters
weight  pass a weight if you wish to assign a specific weight to the smoothing; otherwise the current weight of the genericBNLearner will be used

Definition at line 372 of file genericBNLearner_inl.h.

References __apriori_type, checkScoreAprioriCompatibility(), GUM_ERROR, setAprioriWeight(), and SMOOTHING.

  {
    if (weight < 0) {
      GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
    }

    __apriori_type = AprioriType::SMOOTHING;
    setAprioriWeight(weight);

    checkScoreAprioriCompatibility();
  }

◆ useCrossValidationFold()

std::pair< std::size_t, std::size_t > gum::learning::genericBNLearner::useCrossValidationFold ( const std::size_t  learning_fold,
const std::size_t  k_fold 
)

sets the ranges of rows to be used for cross-validation learning

When applied on (x,k), the method indicates to the subsequent learnings that they should be performed on the xth fold in a k-fold cross-validation context. For instance, if a database has 1000 rows, and if we perform a 10-fold cross-validation, then, the first learning fold (learning_fold=0) corresponds to rows interval [100,1000) and the test dataset corresponds to [0,100). The second learning fold (learning_fold=1) is [0,100) U [200,1000) and the corresponding test dataset is [100,200).

Parameters
learning_fold  a number indicating the set of rows used for learning. If N denotes the size of the database, and k_fold represents the number of folds in the cross validation, then the set of rows used for testing is [learning_fold * N / k_fold, (learning_fold+1) * N / k_fold) and the learning database is its complement in the database
k_fold  the value of "k" in k-fold cross validation
Returns
a pair [x,y) of rows' indices that corresponds to the indices of rows in the original database that constitute the test dataset
Exceptions
OutOfBounds  is raised if k_fold is equal to 0, or learning_fold is greater than or equal to k_fold, or k_fold is greater than or equal to the size of the database

Definition at line 882 of file genericBNLearner.cpp.

References __ranges, __score_database, gum::learning::genericBNLearner::Database::databaseTable(), GUM_ERROR, and gum::learning::IDatabaseTable< T_DATA, ALLOC >::nbRows().

  {
    if (k_fold == 0) {
      GUM_ERROR(OutOfBounds, "K-fold cross validation with k=0 is forbidden");
    }

    if (learning_fold >= k_fold) {
      GUM_ERROR(OutOfBounds,
                "In " << k_fold << "-fold cross validation, the learning "
                      << "fold should be strictly lower than " << k_fold
                      << " but, here, it is equal to " << learning_fold);
    }

    const std::size_t db_size = __score_database.databaseTable().nbRows();
    if (k_fold >= db_size) {
      GUM_ERROR(OutOfBounds,
                "In " << k_fold << "-fold cross validation, the database's "
                      << "size should be strictly greater than " << k_fold
                      << " but, here, the database has only " << db_size
                      << " rows");
    }

    // create the ranges of rows of the test database
    const std::size_t foldSize = db_size / k_fold;
    const std::size_t unfold_deb = learning_fold * foldSize;
    const std::size_t unfold_end = unfold_deb + foldSize;

    __ranges.clear();
    if (learning_fold == std::size_t(0)) {
      __ranges.push_back(
         std::pair< std::size_t, std::size_t >(unfold_end, db_size));
    } else {
      __ranges.push_back(
         std::pair< std::size_t, std::size_t >(std::size_t(0), unfold_deb));

      if (learning_fold != k_fold - 1) {
        __ranges.push_back(
           std::pair< std::size_t, std::size_t >(unfold_end, db_size));
      }
    }

    return std::pair< std::size_t, std::size_t >(unfold_deb, unfold_end);
  }
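The fold arithmetic above can be sketched standalone (not aGrUM code; the helper name testFoldRange is hypothetical): the test dataset of fold x is the half-open interval [x * N/k, (x+1) * N/k), and the learning ranges are its complement in [0, N).

```cpp
#include <cstddef>
#include <utility>

// Standalone sketch (not part of aGrUM) of the fold arithmetic of
// useCrossValidationFold: for a database of db_size rows split into
// k_fold folds, returns the half-open interval [x, y) of rows that
// form the test dataset of the given learning_fold.
std::pair<std::size_t, std::size_t> testFoldRange(std::size_t db_size,
                                                  std::size_t k_fold,
                                                  std::size_t learning_fold) {
  const std::size_t foldSize = db_size / k_fold;
  const std::size_t deb = learning_fold * foldSize;
  return {deb, deb + foldSize};
}
```

With db_size = 1000 and k_fold = 10, fold 0 yields the test interval [0, 100) and fold 1 yields [100, 200), matching the worked example in the description.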

◆ useDatabaseRanges()

template<template< typename > class XALLOC>
void gum::learning::genericBNLearner::useDatabaseRanges ( const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &  new_ranges)

use a new set of database rows' ranges to perform learning

Parameters
new_ranges  a set of pairs {(X1,Y1),...,(Xn,Yn)} of database row indices. The subsequent learnings are then performed only on the union of the rows [Xi,Yi), i in {1,...,n}. This is useful, e.g., when performing cross validation tasks, in which part of the database should be ignored. An empty set of ranges is equivalent to an interval [X,Y) ranging over the whole database.

Definition at line 100 of file genericBNLearner_tpl.h.

References __no_apriori, __ranges, __score_database, gum::learning::genericBNLearner::Database::parser(), and gum::learning::Score< ALLOC >::setRanges().

  {
    // use a score to detect whether the ranges are ok
    ScoreLog2Likelihood<> score(__score_database.parser(), *__no_apriori);
    score.setRanges(new_ranges);
    __ranges = score.ranges();
  }
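The row-selection semantics described above can be illustrated standalone (not aGrUM code; the helper name rowIsUsed is hypothetical): a row takes part in learning iff its index lies in the union of the half-open intervals, and an empty range set means the whole database.

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Hypothetical helper (not part of aGrUM) illustrating the semantics
// of useDatabaseRanges: a row is used for learning iff its index lies
// in the union of the half-open intervals [Xi, Yi); an empty set of
// ranges means that every row of the database is used.
bool rowIsUsed(std::size_t row,
               const std::vector<std::pair<std::size_t, std::size_t>>& ranges) {
  if (ranges.empty()) { return true; }
  for (const auto& r : ranges) {
    if (row >= r.first && row < r.second) { return true; }
  }
  return false;
}
```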

◆ useEM()

INLINE void gum::learning::genericBNLearner::useEM ( const double  epsilon)

use the EM algorithm to learn parameters

if epsilon=0.0, EM is not used

Definition at line 249 of file genericBNLearner_inl.h.

References __EMepsilon, and epsilon().

  {
    __EMepsilon = epsilon;
  }

◆ useGreedyHillClimbing()

INLINE void gum::learning::genericBNLearner::useGreedyHillClimbing ( )

indicate that we wish to use a greedy hill climbing algorithm

Definition at line 236 of file genericBNLearner_inl.h.

References __selected_algo, and GREEDY_HILL_CLIMBING.

◆ useK2() [1/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const Sequence< NodeId > &  order)

indicate that we wish to use K2

Definition at line 224 of file genericBNLearner_inl.h.

References __K2, __selected_algo, K2, and gum::learning::K2::setOrder().

  {
    __selected_algo = AlgoType::K2;
    __K2.setOrder(order);
  }

◆ useK2() [2/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const std::vector< NodeId > &  order)

indicate that we wish to use K2

Definition at line 230 of file genericBNLearner_inl.h.

References __K2, __selected_algo, K2, and gum::learning::K2::setOrder().

  {
    __selected_algo = AlgoType::K2;
    __K2.setOrder(order);
  }

◆ useLocalSearchWithTabuList()

INLINE void gum::learning::genericBNLearner::useLocalSearchWithTabuList ( Size  tabu_size = 100,
Size  nb_decrease = 2 
)

indicate that we wish to use a local search with tabu list

Parameters
tabu_size  the size of the tabu list
nb_decrease  the max number of consecutive score-decreasing changes that we allow to apply

Definition at line 241 of file genericBNLearner_inl.h.

References __constraint_TabuList, __local_search_with_tabu_list, __selected_algo, LOCAL_SEARCH_WITH_TABU_LIST, gum::learning::LocalSearchWithTabuList::setMaxNbDecreasingChanges(), and gum::learning::StructuralConstraintTabuList::setTabuListSize().

  {
    __selected_algo = AlgoType::LOCAL_SEARCH_WITH_TABU_LIST;
    __constraint_TabuList.setTabuListSize(tabu_size);
    __local_search_with_tabu_list.setMaxNbDecreasingChanges(nb_decrease);
  }

◆ useMDL()

INLINE void gum::learning::genericBNLearner::useMDL ( )

indicate that we wish to use the MDL correction for 3off2

Exceptions
OperationNotAllowed  when 3off2 is not the selected algorithm

Definition at line 194 of file genericBNLearner_inl.h.

References __3off2_kmode, __selected_algo, GUM_ERROR, and MIIC_THREE_OFF_TWO.

  {
    if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
      GUM_ERROR(OperationNotAllowed,
                "You must use the 3off2 algorithm before selecting "
                   << "the MDL score");
    }
    __3off2_kmode = CorrectedMutualInformation<>::KModeTypes::MDL;
  }

◆ useMIIC()

INLINE void gum::learning::genericBNLearner::useMIIC ( )

indicate that we wish to use MIIC

Definition at line 178 of file genericBNLearner_inl.h.

References __miic_3off2, __selected_algo, MIIC_THREE_OFF_TWO, and gum::learning::Miic::setMiicBehaviour().

  {
    __selected_algo = AlgoType::MIIC_THREE_OFF_TWO;
    __miic_3off2.setMiicBehaviour();
  }

◆ useNML()

INLINE void gum::learning::genericBNLearner::useNML ( )

indicate that we wish to use the NML correction for 3off2

Exceptions
OperationNotAllowed  when 3off2 is not the selected algorithm

Definition at line 184 of file genericBNLearner_inl.h.

References __3off2_kmode, __selected_algo, GUM_ERROR, and MIIC_THREE_OFF_TWO.

Referenced by __prepare_miic_3off2().

184  {
185    if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
186      GUM_ERROR(OperationNotAllowed,
187                "You must use the 3off2 algorithm before selecting "
188                << "the NML score");
189    }
190    __3off2_kmode = CorrectedMutualInformation<>::KModeTypes::NML;
191  }
AlgoType __selected_algo
the selected learning algorithm
CorrectedMutualInformation<>::KModeTypes __3off2_kmode
the penalty used in 3off2
#define GUM_ERROR(type, msg)
Definition: exceptions.h:52

◆ useNoApriori()

INLINE void gum::learning::genericBNLearner::useNoApriori ( )

use no apriori

Definition at line 366 of file genericBNLearner_inl.h.

References __apriori_type, checkScoreAprioriCompatibility(), and NO_APRIORI.

366  {
367    __apriori_type = AprioriType::NO_APRIORI;
368    checkScoreAprioriCompatibility();
369  }
AprioriType __apriori_type
the a priori selected for the score and parameters
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible

◆ useNoCorr()

INLINE void gum::learning::genericBNLearner::useNoCorr ( )

indicate that we wish to use the NoCorr correction for 3off2

Exceptions
OperationNotAllowed  when 3off2 is not the selected algorithm

Definition at line 204 of file genericBNLearner_inl.h.

References __3off2_kmode, __selected_algo, GUM_ERROR, and MIIC_THREE_OFF_TWO.

204  {
205    if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
206      GUM_ERROR(OperationNotAllowed,
207                "You must use the 3off2 algorithm before selecting "
208                << "the NoCorr score");
209    }
210    __3off2_kmode = CorrectedMutualInformation<>::KModeTypes::NoCorr;
211  }
AlgoType __selected_algo
the selected learning algorithm
CorrectedMutualInformation<>::KModeTypes __3off2_kmode
the penalty used in 3off2
#define GUM_ERROR(type, msg)
Definition: exceptions.h:52
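Taken together, use3off2() must be selected before any of the correction selectors above (useMDL(), useNML(), useNoCorr()). A minimal usage sketch through the public BNLearner<> front-end — the file name is illustrative:

```cpp
#include <agrum/learning/BNLearner.h>

int main() {
  gum::learning::BNLearner< double > learner("data.csv");
  learner.use3off2();   // select 3off2 first...
  learner.useNML();     // ...otherwise the correction selectors raise OperationNotAllowed
  gum::MixedGraph g = learner.learnMixedStructure();
}
```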

◆ useScoreAIC()

INLINE void gum::learning::genericBNLearner::useScoreAIC ( )

indicate that we wish to use an AIC score

Definition at line 131 of file genericBNLearner_inl.h.

References __score_type, AIC, and checkScoreAprioriCompatibility().

131  {
132    __score_type = ScoreType::AIC;
133    checkScoreAprioriCompatibility();
134  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning

◆ useScoreBD()

INLINE void gum::learning::genericBNLearner::useScoreBD ( )

indicate that we wish to use a BD score

Definition at line 137 of file genericBNLearner_inl.h.

References __score_type, BD, and checkScoreAprioriCompatibility().

137  {
138    __score_type = ScoreType::BD;
139    checkScoreAprioriCompatibility();
140  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning

◆ useScoreBDeu()

INLINE void gum::learning::genericBNLearner::useScoreBDeu ( )

indicate that we wish to use a BDeu score

Definition at line 143 of file genericBNLearner_inl.h.

References __score_type, BDeu, and checkScoreAprioriCompatibility().

143  {
144    __score_type = ScoreType::BDeu;
145    checkScoreAprioriCompatibility();
146  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning

◆ useScoreBIC()

INLINE void gum::learning::genericBNLearner::useScoreBIC ( )

indicate that we wish to use a BIC score

Definition at line 149 of file genericBNLearner_inl.h.

References __score_type, BIC, and checkScoreAprioriCompatibility().

149  {
150    __score_type = ScoreType::BIC;
151    checkScoreAprioriCompatibility();
152  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning

◆ useScoreK2()

INLINE void gum::learning::genericBNLearner::useScoreK2 ( )

indicate that we wish to use a K2 score

Definition at line 155 of file genericBNLearner_inl.h.

References __score_type, checkScoreAprioriCompatibility(), and K2.

155  {
156    __score_type = ScoreType::K2;
157    checkScoreAprioriCompatibility();
158  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning

◆ useScoreLog2Likelihood()

INLINE void gum::learning::genericBNLearner::useScoreLog2Likelihood ( )

indicate that we wish to use a Log2Likelihood score

Definition at line 161 of file genericBNLearner_inl.h.

References __score_type, checkScoreAprioriCompatibility(), and LOG2LIKELIHOOD.

161  {
162    __score_type = ScoreType::LOG2LIKELIHOOD;
163    checkScoreAprioriCompatibility();
164  }
std::string checkScoreAprioriCompatibility()
checks whether the current score and apriori are compatible
ScoreType __score_type
the score selected for learning
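All the useScoreXXX() selectors follow the same pattern: set __score_type, then check that the score is compatible with the current a priori. A hedged sketch of switching away from the default BDeu score (file name illustrative):

```cpp
#include <agrum/learning/BNLearner.h>

int main() {
  gum::learning::BNLearner< double > learner("data.csv");
  learner.useScoreBIC();            // replaces the default BDeu score
  learner.useGreedyHillClimbing();  // a score-based structure learning algorithm
  gum::BayesNet< double > bn = learner.learnBN();
}
```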

◆ verbosity()

bool gum::learning::genericBNLearner::verbosity ( ) const
inline virtual

verbosity

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1004 of file genericBNLearner.h.

References __current_algorithm, GUM_ERROR, and gum::ApproximationScheme::verbosity().

1004  {
1005  if (__current_algorithm != nullptr)
1006  return __current_algorithm->verbosity();
1007  else
1008  GUM_ERROR(FatalError, "No chosen algorithm for learning");
1009  };
const ApproximationScheme * __current_algorithm
bool verbosity() const
Returns true if verbosity is enabled.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:52

Member Data Documentation

◆ __3off2_kmode

CorrectedMutualInformation<>::KModeTypes gum::learning::genericBNLearner::__3off2_kmode
protected
Initial value:

the penalty used in 3off2

Definition at line 696 of file genericBNLearner.h.

Referenced by __createCorrectedMutualInformation(), operator=(), useMDL(), useNML(), and useNoCorr().

◆ __apriori

Apriori* gum::learning::genericBNLearner::__apriori {nullptr}
protected

◆ __apriori_database

Database* gum::learning::genericBNLearner::__apriori_database {nullptr}
protected

the database used by the Dirichlet a priori

Definition at line 715 of file genericBNLearner.h.

Referenced by __createApriori(), __learnDAG(), operator=(), and ~genericBNLearner().

◆ __apriori_dbname

std::string gum::learning::genericBNLearner::__apriori_dbname
protected

the filename for the Dirichlet a priori, if any

Definition at line 718 of file genericBNLearner.h.

Referenced by __createApriori(), operator=(), and useAprioriDirichlet().

◆ __apriori_type

AprioriType gum::learning::genericBNLearner::__apriori_type {AprioriType::NO_APRIORI}
protected

the a priori selected for the score and parameters

Definition at line 661 of file genericBNLearner.h.

Referenced by __createApriori(), __getAprioriType(), __learnDAG(), operator=(), useAprioriBDeu(), useAprioriDirichlet(), useAprioriSmoothing(), and useNoApriori().

◆ __apriori_weight

double gum::learning::genericBNLearner::__apriori_weight {1.0f}
protected

the weight of the apriori

Definition at line 669 of file genericBNLearner.h.

Referenced by __createApriori(), checkScoreAprioriCompatibility(), operator=(), and setAprioriWeight().

◆ __constraint_ForbiddenArcs

StructuralConstraintForbiddenArcs gum::learning::genericBNLearner::__constraint_ForbiddenArcs
protected

the constraint on forbidden arcs

Definition at line 681 of file genericBNLearner.h.

Referenced by __learnDAG(), __prepare_miic_3off2(), addForbiddenArc(), eraseForbiddenArc(), operator=(), and setForbiddenArcs().

◆ __constraint_Indegree

StructuralConstraintIndegree gum::learning::genericBNLearner::__constraint_Indegree
protected

the constraint for indegrees

Definition at line 675 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and setMaxIndegree().

◆ __constraint_MandatoryArcs

StructuralConstraintMandatoryArcs gum::learning::genericBNLearner::__constraint_MandatoryArcs
protected

the constraint on mandatory arcs

Definition at line 684 of file genericBNLearner.h.

Referenced by __learnDAG(), __prepare_miic_3off2(), addMandatoryArc(), eraseMandatoryArc(), operator=(), and setMandatoryArcs().

◆ __constraint_SliceOrder

StructuralConstraintSliceOrder gum::learning::genericBNLearner::__constraint_SliceOrder
protected

the constraint for 2TBNs

Definition at line 672 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and setSliceOrder().

◆ __constraint_TabuList

StructuralConstraintTabuList gum::learning::genericBNLearner::__constraint_TabuList
protected

the constraint for tabu lists

Definition at line 678 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and useLocalSearchWithTabuList().

◆ __current_algorithm

◆ __Dag2BN

DAG2BNLearner gum::learning::genericBNLearner::__Dag2BN
protected

the parametric EM

Definition at line 700 of file genericBNLearner.h.

◆ __EMepsilon

double gum::learning::genericBNLearner::__EMepsilon {0.0}
protected

epsilon for EM; if epsilon = 0.0, EM is not used

Definition at line 655 of file genericBNLearner.h.

Referenced by operator=(), and useEM().
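A sketch of how this member is set through useEM(), which the "Referenced by" list above names as its writer — the epsilon argument and missing-value symbol are assumptions:

```cpp
#include <agrum/learning/BNLearner.h>

int main() {
  // "?" marks missing cells in the database
  gum::learning::BNLearner< double > learner("data.csv", {"?"});
  learner.useEM(1e-3);    // epsilon > 0 enables EM for parameter learning
  // learner.useEM(0.0);  // epsilon == 0 disables EM (the default)
}
```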

◆ __greedy_hill_climbing

GreedyHillClimbing gum::learning::genericBNLearner::__greedy_hill_climbing
protected

the greedy hill climbing algorithm

Definition at line 703 of file genericBNLearner.h.

Referenced by __learnDAG(), and operator=().

◆ __initial_dag

DAG gum::learning::genericBNLearner::__initial_dag
protected

an initial DAG given to learners

Definition at line 721 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and setInitialDAG().

◆ __K2

K2 gum::learning::genericBNLearner::__K2
protected

the K2 algorithm

Definition at line 690 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and useK2().

◆ __local_search_with_tabu_list

LocalSearchWithTabuList gum::learning::genericBNLearner::__local_search_with_tabu_list
protected

the local search with tabu list algorithm

Definition at line 706 of file genericBNLearner.h.

Referenced by __learnDAG(), operator=(), and useLocalSearchWithTabuList().

◆ __miic_3off2

Miic gum::learning::genericBNLearner::__miic_3off2
protected

the 3off2 algorithm

Definition at line 693 of file genericBNLearner.h.

Referenced by __learnDAG(), __prepare_miic_3off2(), latentVariables(), learnMixedStructure(), operator=(), use3off2(), and useMIIC().

◆ __mutual_info

CorrectedMutualInformation* gum::learning::genericBNLearner::__mutual_info {nullptr}
protected

the selected correction for 3off2 and miic

Definition at line 658 of file genericBNLearner.h.

Referenced by __createCorrectedMutualInformation(), __learnDAG(), __prepare_miic_3off2(), learnMixedStructure(), operator=(), and ~genericBNLearner().

◆ __no_apriori

AprioriNoApriori* gum::learning::genericBNLearner::__no_apriori {nullptr}
protected

◆ __param_estimator_type

ParamEstimatorType gum::learning::genericBNLearner::__param_estimator_type {ParamEstimatorType::ML}
protected

the type of the parameter estimator

Definition at line 652 of file genericBNLearner.h.

Referenced by __createParamEstimator(), and operator=().

◆ __ranges

std::vector< std::pair< std::size_t, std::size_t > > gum::learning::genericBNLearner::__ranges
protected

the set of rows' ranges within the database in which learning is done

Definition at line 712 of file genericBNLearner.h.

Referenced by __createCorrectedMutualInformation(), __createParamEstimator(), __createScore(), clearDatabaseRanges(), databaseRanges(), operator=(), useCrossValidationFold(), and useDatabaseRanges().
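For instance, the ranges can be filled for k-fold cross-validation through useCrossValidationFold(), listed above; the signature (learning fold index, number of folds) is an assumption:

```cpp
#include <agrum/learning/BNLearner.h>

int main() {
  gum::learning::BNLearner< double > learner("data.csv");
  // learn on every row except those of fold 2 (out of 10);
  // the held-out fold can then serve as a test set
  learner.useCrossValidationFold(2, 10);
}
```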

◆ __score

Score* gum::learning::genericBNLearner::__score {nullptr}
protected

the score used

Definition at line 649 of file genericBNLearner.h.

Referenced by __createParamEstimator(), __createScore(), __learnDAG(), operator=(), and ~genericBNLearner().

◆ __score_database

◆ __score_type

ScoreType gum::learning::genericBNLearner::__score_type {ScoreType::BDeu}
protected

◆ __selected_algo

AlgoType gum::learning::genericBNLearner::__selected_algo {AlgoType::GREEDY_HILL_CLIMBING}
protected

◆ onProgress

◆ onStop

Signaler1< std::string > gum::IApproximationSchemeConfiguration::onStop
inherited

Criteria messageApproximationScheme.

Definition at line 60 of file IApproximationSchemeConfiguration.h.

Referenced by gum::ApproximationScheme::_stopScheme(), and distributeStop().


The documentation for this class was generated from the following files: