aGrUM  0.20.2
a C++ library for (probabilistic) graphical models
gum::learning::BNLearner< GUM_SCALAR > Class Template Reference

A pack of learning algorithms that can easily be used. More...

#include <BNLearner.h>


Public Attributes

Signaler3< Size, double, double > onProgress
 Progression, error and time. More...
 
Signaler1< std::string > onStop
 Criteria message (see messageApproximationScheme). More...
 

Public Member Functions

BayesNet< GUM_SCALAR > learnBN ()
 learn a Bayes Net from a file (must have read the db before) More...
 
BayesNet< GUM_SCALAR > learnParameters (const DAG &dag, bool take_into_account_score=true)
 learns a BN (its parameters) when its structure is known More...
 
BayesNet< GUM_SCALAR > learnParameters (bool take_into_account_score=true)
 
void setAprioriWeight__ (double weight)
 sets the apriori weight More...
 
void setMandatoryArcs (const ArcSet &set)
 assign a set of mandatory arcs More...
 
Constructors / Destructors
 BNLearner (const std::string &filename, const std::vector< std::string > &missing_symbols={"?"})
 default constructor More...
 
 BNLearner (const DatabaseTable<> &db)
 default constructor More...
 
 BNLearner (const std::string &filename, const gum::BayesNet< GUM_SCALAR > &src, const std::vector< std::string > &missing_symbols={"?"})
 Read the database file for the score / parameter estimation and var names. More...
 
 BNLearner (const BNLearner &)
 copy constructor More...
 
 BNLearner (BNLearner &&)
 move constructor More...
 
virtual ~BNLearner ()
 destructor More...
 
Operators
BNLearner & operator= (const BNLearner &)
 copy operator More...
 
BNLearner & operator= (BNLearner &&)
 move operator More...
 
Accessors / Modifiers
DAG learnDAG ()
 learn a structure from a file (must have read the db before) More...
 
MixedGraph learnMixedStructure ()
 learn a partial structure from a file (must have read the db before and must have selected miic or 3off2) More...
 
void setInitialDAG (const DAG &)
 sets an initial DAG structure More...
 
const std::vector< std::string > & names () const
 returns the names of the variables in the database More...
 
const std::vector< std::size_t > & domainSizes () const
 returns the domain sizes of the variables in the database More...
 
Size domainSize (NodeId var) const
 returns the domain size of variable var More...
 
Size domainSize (const std::string &var) const
 returns the domain size of variable var More...
 
NodeId idFromName (const std::string &var_name) const
 returns the node id corresponding to a variable name More...
 
const DatabaseTable<> & database () const
 returns the database used by the BNLearner More...
 
void setDatabaseWeight (const double new_weight)
 assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight More...
 
void setRecordWeight (const std::size_t i, const double weight)
 sets the weight of the ith record of the database More...
 
double recordWeight (const std::size_t i) const
 returns the weight of the ith record More...
 
double databaseWeight () const
 returns the weight of the whole database More...
 
const std::string & nameFromId (NodeId id) const
 returns the variable name corresponding to a given node id More...
 
template<template< typename > class XALLOC>
void useDatabaseRanges (const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &new_ranges)
 use a new set of database rows' ranges to perform learning More...
 
void clearDatabaseRanges ()
 reset the ranges to the one range corresponding to the whole database More...
 
const std::vector< std::pair< std::size_t, std::size_t > > & databaseRanges () const
 returns the current database rows' ranges used for learning More...
 
std::pair< std::size_t, std::size_t > useCrossValidationFold (const std::size_t learning_fold, const std::size_t k_fold)
 sets the ranges of rows to be used for cross-validation learning More...
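As a sketch, a k-fold cross-validation loop built on this function might look as follows (the `learner` variable is assumed to have already read its database, and the exact meaning of the returned row range is an assumption based on the description above):

```cpp
// Hypothetical sketch: k-fold cross-validation with a BNLearner `learner`.
const std::size_t k_fold = 10;
for (std::size_t fold = 0; fold < k_fold; ++fold) {
  // restrict learning to all rows except those of the current fold;
  // the returned pair is assumed to delimit the held-out rows
  std::pair< std::size_t, std::size_t > test_rows
     = learner.useCrossValidationFold(fold, k_fold);

  gum::BayesNet< double > bn = learner.learnBN();
  // ... evaluate bn on rows [test_rows.first, test_rows.second) ...
}
learner.clearDatabaseRanges();  // back to the whole database
```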
 
std::pair< double, double > chi2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair for chi2 test in the database. More...
 
std::pair< double, double > chi2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair for the chi2 test in the database. More...
 
std::pair< double, double > G2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair for the G2 test in the database. More...
 
std::pair< double, double > G2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair for the G2 test in the database. More...
 
double logLikelihood (const std::vector< NodeId > &vars, const std::vector< NodeId > &knowing={})
 Return the log-likelihood of vars in the database, conditioned on knowing. More...
 
double logLikelihood (const std::vector< std::string > &vars, const std::vector< std::string > &knowing={})
 Return the log-likelihood of vars in the database, conditioned on knowing. More...
 
std::vector< double > rawPseudoCount (const std::vector< NodeId > &vars)
 Return the pseudo-counts of the NodeId vars in the database, as a raw array. More...
 
std::vector< double > rawPseudoCount (const std::vector< std::string > &vars)
 Return the pseudo-counts of vars in the database, as a raw array. More...
 
Size nbCols () const
 
Size nbRows () const
 
void useEM (const double epsilon)
 use the EM algorithm to learn parameters More...
 
bool hasMissingValues () const
 returns true if the learner's database has missing values More...
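For instance, when the database contains missing values, parameter learning can fall back on EM. A sketch (the filename, the "?" missing symbol, and `known_dag` are illustrative):

```cpp
// Hypothetical sketch: EM-based parameter learning on an incomplete CSV.
gum::learning::BNLearner< double > learner("incomplete.csv");  // "?" marks holes

if (learner.hasMissingValues()) {
  learner.useEM(1e-4);  // stop EM once the stopping criterion drops below 1e-4
}

// known_dag is an illustrative, pre-existing structure
gum::BayesNet< double > bn = learner.learnParameters(known_dag);
```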
 
Score selection
void useScoreAIC ()
 indicate that we wish to use an AIC score More...
 
void useScoreBD ()
 indicate that we wish to use a BD score More...
 
void useScoreBDeu ()
 indicate that we wish to use a BDeu score More...
 
void useScoreBIC ()
 indicate that we wish to use a BIC score More...
 
void useScoreK2 ()
 indicate that we wish to use a K2 score More...
 
void useScoreLog2Likelihood ()
 indicate that we wish to use a Log2Likelihood score More...
 
A priori selection / parameterization
void useNoApriori ()
 use no apriori More...
 
void useAprioriBDeu (double weight=1)
 use the BDeu apriori More...
 
void useAprioriSmoothing (double weight=1)
 use the apriori smoothing More...
 
void useAprioriDirichlet (const std::string &filename, double weight=1)
 use the Dirichlet apriori More...
 
std::string checkScoreAprioriCompatibility ()
 checks whether the current score and apriori are compatible More...
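A sketch of combining score selection with an apriori and checking the combination before learning (the filename is illustrative; interpreting a non-empty return as a description of a potential issue is an assumption based on this page):

```cpp
// Hypothetical sketch: select a score and an apriori, then verify that
// the combination is meaningful before learning.
gum::learning::BNLearner< double > learner("data.csv");  // illustrative file
learner.useScoreBDeu();
learner.useAprioriBDeu(1.0);

std::string msg = learner.checkScoreAprioriCompatibility();
if (!msg.empty()) {
  std::cerr << msg << std::endl;  // message describing a potential issue
}
```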
 
Learning algorithm selection
void useGreedyHillClimbing ()
 indicate that we wish to use a greedy hill climbing algorithm More...
 
void useLocalSearchWithTabuList (Size tabu_size=100, Size nb_decrease=2)
 indicate that we wish to use a local search with tabu list More...
 
void useK2 (const Sequence< NodeId > &order)
 indicate that we wish to use K2 More...
 
void useK2 (const std::vector< NodeId > &order)
 indicate that we wish to use K2 More...
 
void use3off2 ()
 indicate that we wish to use 3off2 More...
 
void useMIIC ()
 indicate that we wish to use MIIC More...
 
3off2/MIIC parameterization and specific results
void useNML ()
 indicate that we wish to use the NML correction for 3off2 More...
 
void useMDL ()
 indicate that we wish to use the MDL correction for 3off2 More...
 
void useNoCorr ()
 indicate that we wish to use the NoCorr correction for 3off2 More...
 
const std::vector< Arc > & latentVariables () const
 get the list of arcs hiding latent variables More...
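A sketch of the 3off2/MIIC workflow, combining the selectors above (the filename and variable layout are illustrative):

```cpp
// Hypothetical sketch: constraint-based learning with MIIC + NML correction,
// then inspection of arcs that hint at latent variables.
gum::learning::BNLearner< double > learner("survey.csv");  // illustrative file
learner.useMIIC();
learner.useNML();

gum::MixedGraph g = learner.learnMixedStructure();
for (const auto& arc: learner.latentVariables()) {
  std::cout << arc << std::endl;  // dependency best explained by a hidden cause
}
```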
 
Accessors / Modifiers for adding constraints on learning
void setMaxIndegree (Size max_indegree)
 sets the max indegree More...
 
void setSliceOrder (const NodeProperty< NodeId > &slice_order)
 sets a partial order on the nodes More...
 
void setSliceOrder (const std::vector< std::vector< std::string > > &slices)
 sets a partial order on the nodes More...
 
void setForbiddenArcs (const ArcSet &set)
 assign a set of forbidden arcs More...
 
assign a new forbidden arc
void addForbiddenArc (const Arc &arc)
 
void addForbiddenArc (const NodeId tail, const NodeId head)
 
void addForbiddenArc (const std::string &tail, const std::string &head)
 
assign a new mandatory arc
void addMandatoryArc (const Arc &arc)
 
void addMandatoryArc (const NodeId tail, const NodeId head)
 
void addMandatoryArc (const std::string &tail, const std::string &head)
 
remove a forbidden arc
void eraseForbiddenArc (const Arc &arc)
 
void eraseForbiddenArc (const NodeId tail, const NodeId head)
 
void eraseForbiddenArc (const std::string &tail, const std::string &head)
 
remove a mandatory arc
void eraseMandatoryArc (const Arc &arc)
 
void eraseMandatoryArc (const NodeId tail, const NodeId head)
 
void eraseMandatoryArc (const std::string &tail, const std::string &head)
 
void setPossibleEdges (const EdgeSet &set)
 assign a set of possible edges More...
 
void setPossibleSkeleton (const UndiGraph &skeleton)
 assign a skeleton of possible edges More...
 
assign a new possible edge
Warning
Once at least one possible edge is defined, all other edges are not possible anymore
void addPossibleEdge (const Edge &edge)
 
void addPossibleEdge (const NodeId tail, const NodeId head)
 
void addPossibleEdge (const std::string &tail, const std::string &head)
 
remove a possible edge
void erasePossibleEdge (const Edge &edge)
 
void erasePossibleEdge (const NodeId tail, const NodeId head)
 
void erasePossibleEdge (const std::string &tail, const std::string &head)
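A sketch of constraining structure search with the functions above (the filename and variable names are illustrative):

```cpp
// Hypothetical sketch: constrained structure learning.
gum::learning::BNLearner< double > learner("alarm.csv");  // illustrative file

learner.addMandatoryArc("smoking", "cancer");  // must be in the learnt DAG
learner.addForbiddenArc("cancer", "smoking");  // can never be in it

// beware: as soon as one possible edge is declared, all undeclared
// edges become forbidden
learner.addPossibleEdge("smoking", "cancer");
learner.addPossibleEdge("cancer", "xray");

gum::BayesNet< double > bn = learner.learnBN();
```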
 
redistribute signals AND implementation of interface
INLINE void setCurrentApproximationScheme (const ApproximationScheme *approximationScheme)
 distribute signals More...
 
INLINE void distributeProgress (const ApproximationScheme *approximationScheme, Size pourcent, double error, double time)
 distribute signals More...
 
INLINE void distributeStop (const ApproximationScheme *approximationScheme, std::string message)
 distribute signals More...
 
void setEpsilon (double eps)
 Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled it will be enabled. More...
 
double epsilon () const
 Get the value of epsilon. More...
 
void disableEpsilon ()
 Disable stopping criterion on epsilon. More...
 
void enableEpsilon ()
 Enable stopping criterion on epsilon. More...
 
bool isEnabledEpsilon () const
 
void setMinEpsilonRate (double rate)
 Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled it will be enabled. More...
 
double minEpsilonRate () const
 Get the value of the minimal epsilon rate. More...
 
void disableMinEpsilonRate ()
 Disable stopping criterion on epsilon rate. More...
 
void enableMinEpsilonRate ()
 Enable stopping criterion on epsilon rate. More...
 
bool isEnabledMinEpsilonRate () const
 
void setMaxIter (Size max)
 stopping criterion on number of iterations. If the criterion was disabled it will be enabled. More...
 
Size maxIter () const
 
void disableMaxIter ()
 Disable stopping criterion on max iterations. More...
 
void enableMaxIter ()
 Enable stopping criterion on max iterations. More...
 
bool isEnabledMaxIter () const
 
void setMaxTime (double timeout)
 stopping criterion on timeout. If the criterion was disabled it will be enabled. More...
 
double maxTime () const
 returns the timeout (in seconds) More...
 
double currentTime () const
 get the current running time in second (double) More...
 
void disableMaxTime ()
 Disable stopping criterion on timeout. More...
 
void enableMaxTime ()
 Enable stopping criterion on timeout. More...
 
bool isEnabledMaxTime () const
 
void setPeriodSize (Size p)
 number of samples between two tests of the stopping criteria More...
 
Size periodSize () const
 number of samples between two tests of the stopping criteria More...
 
void setVerbosity (bool v)
 verbosity More...
 
bool verbosity () const
 verbosity More...
 
ApproximationSchemeSTATE stateApproximationScheme () const
 Returns the current state of the approximation scheme. More...
 
Size nbrIterations () const
 
const std::vector< double > & history () const
 
Getters and setters
std::string messageApproximationScheme () const
 Returns the approximation scheme message. More...
 

Public Types

enum  ScoreType {
  ScoreType::AIC, ScoreType::BD, ScoreType::BDeu, ScoreType::BIC,
  ScoreType::K2, ScoreType::LOG2LIKELIHOOD
}
 an enumeration enabling to select easily the score we wish to use More...
 
enum  ParamEstimatorType { ParamEstimatorType::ML }
 an enumeration to select the type of parameter estimation we shall apply More...
 
enum  AprioriType { AprioriType::NO_APRIORI, AprioriType::SMOOTHING, AprioriType::DIRICHLET_FROM_DATABASE, AprioriType::BDEU }
 an enumeration to select the apriori More...
 
enum  AlgoType { AlgoType::K2, AlgoType::GREEDY_HILL_CLIMBING, AlgoType::LOCAL_SEARCH_WITH_TABU_LIST, AlgoType::MIIC_THREE_OFF_TWO }
 an enumeration to select easily the learning algorithm to use More...
 
enum  ApproximationSchemeSTATE : char {
  ApproximationSchemeSTATE::Undefined, ApproximationSchemeSTATE::Continue, ApproximationSchemeSTATE::Epsilon, ApproximationSchemeSTATE::Rate,
  ApproximationSchemeSTATE::Limit, ApproximationSchemeSTATE::TimeLimit, ApproximationSchemeSTATE::Stopped
}
 The different state of an approximation scheme. More...
 

Protected Attributes

ScoreType score_type__ {ScoreType::BDeu}
 the score selected for learning More...
 
Score * score__ {nullptr}
 the score used More...
 
ParamEstimatorType param_estimator_type__ {ParamEstimatorType::ML}
 the type of the parameter estimator More...
 
double EMepsilon__ {0.0}
 epsilon for EM. If epsilon=0.0, no EM. More...
 
CorrectedMutualInformation * mutual_info__ {nullptr}
 the selected correction for 3off2 and miic More...
 
AprioriType apriori_type__ {AprioriType::NO_APRIORI}
 the a priori selected for the score and parameters More...
 
Apriori * apriori__ {nullptr}
 the apriori used More...
 
AprioriNoApriori * no_apriori__ {nullptr}
 
double apriori_weight__ {1.0f}
 the weight of the apriori More...
 
StructuralConstraintSliceOrder constraint_SliceOrder__
 the constraint for 2TBNs More...
 
StructuralConstraintIndegree constraint_Indegree__
 the constraint for indegrees More...
 
StructuralConstraintTabuList constraint_TabuList__
 the constraint for tabu lists More...
 
StructuralConstraintForbiddenArcs constraint_ForbiddenArcs__
 the constraint on forbidden arcs More...
 
StructuralConstraintPossibleEdges constraint_PossibleEdges__
 the constraint on possible Edges More...
 
StructuralConstraintMandatoryArcs constraint_MandatoryArcs__
 the constraint on mandatory arcs More...
 
AlgoType selected_algo__ {AlgoType::GREEDY_HILL_CLIMBING}
 the selected learning algorithm More...
 
K2 K2__
 the K2 algorithm More...
 
Miic miic_3off2__
 the 3off2 algorithm More...
 
CorrectedMutualInformation<>::KModeTypes kmode_3off2__
 the penalty used in 3off2 More...
 
DAG2BNLearner Dag2BN__
 the parametric EM More...
 
GreedyHillClimbing greedy_hill_climbing__
 the greedy hill climbing algorithm More...
 
LocalSearchWithTabuList local_search_with_tabu_list__
 the local search with tabu list algorithm More...
 
Database score_database__
 the database to be used by the scores and parameter estimators More...
 
std::vector< std::pair< std::size_t, std::size_t > > ranges__
 the set of rows' ranges within the database in which learning is done More...
 
Database * apriori_database__ {nullptr}
 the database used by the Dirichlet a priori More...
 
std::string apriori_dbname__
 the filename for the Dirichlet a priori, if any More...
 
DAG initial_dag__
 an initial DAG given to learners More...
 
const ApproximationScheme * current_algorithm__ {nullptr}
 

Protected Member Functions

void createApriori__ ()
 create the apriori used for learning More...
 
void createScore__ ()
 create the score used for learning More...
 
ParamEstimator * createParamEstimator__ (DBRowGeneratorParser<> &parser, bool take_into_account_score=true)
 create the parameter estimator used for learning More...
 
DAG learnDAG__ ()
 returns the DAG learnt More...
 
MixedGraph prepare_miic_3off2__ ()
 prepares the initial graph for 3off2 or miic More...
 
const std::string & getAprioriType__ () const
 returns the type (as a string) of a given apriori More...
 
void createCorrectedMutualInformation__ ()
 create the Corrected Mutual Information instance for Miic/3off2 More...
 

Static Protected Member Functions

static DatabaseTable readFile__ (const std::string &filename, const std::vector< std::string > &missing_symbols)
 reads a file and returns a DatabaseTable More...
 
static void checkFileName__ (const std::string &filename)
 checks whether the extension of a CSV filename is correct More...
 

Detailed Description

template<typename GUM_SCALAR>
class gum::learning::BNLearner< GUM_SCALAR >

A pack of learning algorithms that can easily be used.

The pack currently contains K2, GreedyHillClimbing and LocalSearchWithTabuList

Definition at line 59 of file BNLearner.h.
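A minimal sketch of the typical workflow (the CSV filename and the include path are assumptions that depend on your installation):

```cpp
// Hypothetical sketch: learn a Bayes net from a CSV database.
// "asia.csv" and the include path are illustrative assumptions.
#include <iostream>
#include <agrum/BN/learning/BNLearner.h>

int main() {
  gum::learning::BNLearner< double > learner("asia.csv");

  learner.useGreedyHillClimbing();   // one of the packed algorithms
  learner.useScoreBIC();             // score used by the search
  learner.useAprioriSmoothing(1.0);  // smoothing apriori with weight 1

  gum::BayesNet< double > bn = learner.learnBN();
  std::cout << bn.dag().sizeArcs() << " arcs learnt" << std::endl;
  return 0;
}
```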

Member Enumeration Documentation

◆ AlgoType

an enumeration to select easily the learning algorithm to use

Enumerator
K2 
GREEDY_HILL_CLIMBING 
LOCAL_SEARCH_WITH_TABU_LIST 
MIIC_THREE_OFF_TWO 

Definition at line 135 of file genericBNLearner.h.

{
  K2,
  GREEDY_HILL_CLIMBING,
  LOCAL_SEARCH_WITH_TABU_LIST,
  MIIC_THREE_OFF_TWO
};

◆ ApproximationSchemeSTATE

The different state of an approximation scheme.

Enumerator
Undefined 
Continue 
Epsilon 
Rate 
Limit 
TimeLimit 
Stopped 

Definition at line 64 of file IApproximationSchemeConfiguration.h.

: char
{
  Undefined,
  Continue,
  Epsilon,
  Rate,
  Limit,
  TimeLimit,
  Stopped
};

◆ AprioriType

an enumeration to select the apriori

Enumerator
NO_APRIORI 
SMOOTHING 
DIRICHLET_FROM_DATABASE 
BDEU 

Definition at line 126 of file genericBNLearner.h.

{
  NO_APRIORI,
  SMOOTHING,
  DIRICHLET_FROM_DATABASE,
  BDEU
};

◆ ParamEstimatorType

an enumeration to select the type of parameter estimation we shall apply

Enumerator
ML 

Definition at line 122 of file genericBNLearner.h.

{ ML };

◆ ScoreType

an enumeration enabling to select easily the score we wish to use

Enumerator
AIC 
BD 
BDeu 
BIC 
K2 
LOG2LIKELIHOOD 

Definition at line 110 of file genericBNLearner.h.

{
  AIC,
  BD,
  BDeu,
  BIC,
  K2,
  LOG2LIKELIHOOD
};

Constructor & Destructor Documentation

◆ BNLearner() [1/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols = {"?"} 
)

default constructor

read the database file for the score / parameter estimation and var names

◆ BNLearner() [2/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const DatabaseTable<> &  db)

default constructor

use the given database table for the score / parameter estimation and var names

◆ BNLearner() [3/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const std::string &  filename,
const gum::BayesNet< GUM_SCALAR > &  src,
const std::vector< std::string > &  missing_symbols = {"?"} 
)

Read the database file for the score / parameter estimation and var names.

If modalities = { 1 -> {True, False, Big} }, then the node of id 1 in the BN will have 3 modalities, the first one being True, the second one being False, and the third one being Big.

Parsing the database makes it possible to determine which modalities are really necessary, and keeps them in the order specified by the user (NodeProperty modalities). If parse_database is set to false (the default), then the modalities specified by the user are considered to be exactly those of the variables of the BN (as a consequence, if other values are found in the database, an exception is raised during learning).

Parameters
filename: The file to learn from.
modalities: indicates, for some nodes (not necessarily all the nodes of the BN), which modalities they should have and in which order these modalities should be stored into the nodes.
parse_database: if true, the modalities specified by the user are considered as a superset of the modalities of the variables.

This constructor is a wrapper for BNLearner(filename, modalities, parse_database), using a BN to find those modalities and node ids.
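A sketch of this constructor in use (the filename, the missing symbols, and the template BN are illustrative):

```cpp
// Hypothetical sketch: use an existing BN as a template so that the
// learner reuses its variables and their modality order when parsing the CSV.
gum::BayesNet< double > templ;  // assumed to already hold the variables
// ... variables added to templ elsewhere ...
gum::learning::BNLearner< double > learner("asia.csv", templ, {"?", "N/A"});
gum::BayesNet< double > bn = learner.learnBN();
```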

◆ BNLearner() [4/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const BNLearner< GUM_SCALAR > &  )

copy constructor

◆ BNLearner() [5/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( BNLearner< GUM_SCALAR > &&  )

move constructor

◆ ~BNLearner()

template<typename GUM_SCALAR >
virtual gum::learning::BNLearner< GUM_SCALAR >::~BNLearner ( )
virtual

destructor

Member Function Documentation

◆ addForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const Arc & arc)
inherited

Definition at line 358 of file genericBNLearner_inl.h.

{
  constraint_ForbiddenArcs__.addArc(arc);
}

◆ addForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 368 of file genericBNLearner_inl.h.

{
  addForbiddenArc(Arc(tail, head));
}

◆ addForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 380 of file genericBNLearner_inl.h.

{
  addForbiddenArc(Arc(idFromName(tail), idFromName(head)));
}

◆ addMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const Arc & arc)
inherited

Definition at line 397 of file genericBNLearner_inl.h.

{
  constraint_MandatoryArcs__.addArc(arc);
}

◆ addMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 419 of file genericBNLearner_inl.h.

{
  addMandatoryArc(Arc(tail, head));
}

◆ addMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 407 of file genericBNLearner_inl.h.

{
  addMandatoryArc(Arc(idFromName(tail), idFromName(head)));
}

◆ addPossibleEdge() [1/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const Edge & edge)
inherited

Definition at line 319 of file genericBNLearner_inl.h.

{
  constraint_PossibleEdges__.addEdge(edge);
}

◆ addPossibleEdge() [2/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 329 of file genericBNLearner_inl.h.

{
  addPossibleEdge(Edge(tail, head));
}

◆ addPossibleEdge() [3/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 341 of file genericBNLearner_inl.h.

{
  addPossibleEdge(Edge(idFromName(tail), idFromName(head)));
}

◆ checkFileName__()

void gum::learning::genericBNLearner::checkFileName__ ( const std::string &  filename)
staticprotectedinherited

checks whether the extension of a CSV filename is correct

Definition at line 414 of file genericBNLearner.cpp.

{
  // get the extension of the file
  Size filename_size = Size(filename.size());

  if (filename_size < 4) {
    GUM_ERROR(FormatNotFound,
              "genericBNLearner could not determine the "
              "file type of the database");
  }

  std::string extension = filename.substr(filename.size() - 4);
  std::transform(extension.begin(),
                 extension.end(),
                 extension.begin(),
                 ::tolower);

  if (extension != ".csv") {
    GUM_ERROR(
       OperationNotAllowed,
       "genericBNLearner does not support yet this type of database file");
  }
}

◆ checkScoreAprioriCompatibility()

std::string gum::learning::genericBNLearner::checkScoreAprioriCompatibility ( )
inherited

checks whether the current score and apriori are compatible

Returns
an empty string if the apriori is fully compatible with the score, and a message describing the issue otherwise.

Definition at line 871 of file genericBNLearner.cpp.

{
  const std::string& apriori = getAprioriType__();

  switch (score_type__) {
    case ScoreType::AIC:
      return ScoreAIC<>::isAprioriCompatible(apriori, apriori_weight__);

    case ScoreType::BD:
      return ScoreBD<>::isAprioriCompatible(apriori, apriori_weight__);

    case ScoreType::BDeu:
      return ScoreBDeu<>::isAprioriCompatible(apriori, apriori_weight__);

    case ScoreType::BIC:
      return ScoreBIC<>::isAprioriCompatible(apriori, apriori_weight__);

    case ScoreType::K2:
      return ScoreK2<>::isAprioriCompatible(apriori, apriori_weight__);

    case ScoreType::LOG2LIKELIHOOD:
      return ScoreLog2Likelihood<>::isAprioriCompatible(apriori,
                                                        apriori_weight__);

    default:
      return "genericBNLearner does not support yet this score";
  }
}

◆ chi2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for chi2 test in the database.

Parameters
id1: first variable
id2: second variable
knowing: list of observed variables
Returns
a std::pair<double,double>

Definition at line 948 of file genericBNLearner.cpp.

{
  createApriori__();
  IndepTestChi2<> chi2score(score_database__.parser(),
                            *apriori__,
                            databaseRanges());

  return chi2score.statistics(id1, id2, knowing);
}

◆ chi2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the chi2 test in the database, using variable names.

Parameters
name1: first variable
name2: second variable
knowing: list of observed variables
Returns
a std::pair<double,double>

Definition at line 960 of file genericBNLearner.cpp.

{
  std::vector< NodeId > knowingIds;
  std::transform(
     knowing.begin(),
     knowing.end(),
     std::back_inserter(knowingIds),
     [this](const std::string& c) -> NodeId { return this->idFromName(c); });
  return chi2(idFromName(name1), idFromName(name2), knowingIds);
}
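A sketch of this overload in use (the `learner` variable and the variable names are illustrative; the 5% threshold is a conventional choice, not part of the API):

```cpp
// Hypothetical sketch: conditional chi2 independence test by variable names.
std::pair< double, double > res
   = learner.chi2("smoking", "cancer", {"genetics"});
double statistic = res.first;
double pvalue    = res.second;
if (pvalue > 0.05) {
  // independence of smoking and cancer given genetics is not rejected
}
```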

◆ clearDatabaseRanges()

INLINE void gum::learning::genericBNLearner::clearDatabaseRanges ( )
inherited

reset the ranges to the one range corresponding to the whole database

Definition at line 554 of file genericBNLearner_inl.h.

{ ranges__.clear(); }

◆ createApriori__()

void gum::learning::genericBNLearner::createApriori__ ( )
protectedinherited

create the apriori used for learning

Definition at line 465 of file genericBNLearner.cpp.

  {
    // first, save the old apriori, to be deleted if everything is ok
    Apriori<>* old_apriori = apriori__;

    // create the new apriori
    switch (apriori_type__) {
      case AprioriType::NO_APRIORI:
        apriori__ = new AprioriNoApriori<>(score_database__.databaseTable(),
                                           score_database__.nodeId2Columns());
        break;

      case AprioriType::SMOOTHING:
        apriori__ = new AprioriSmoothing<>(score_database__.databaseTable(),
                                           score_database__.nodeId2Columns());
        break;

      case AprioriType::DIRICHLET_FROM_DATABASE:
        if (apriori_database__ != nullptr) {
          delete apriori_database__;
          apriori_database__ = nullptr;
        }

        apriori_database__ = new Database(apriori_dbname__,
                                          score_database__,
                                          score_database__.missingSymbols());

        apriori__ = new AprioriDirichletFromDatabase<>(
           score_database__.databaseTable(),
           apriori_database__->parser(),
           score_database__.nodeId2Columns());
        break;

      case AprioriType::BDEU:
        apriori__ = new AprioriBDeu<>(score_database__.databaseTable(),
                                      score_database__.nodeId2Columns());
        break;

      default:
        GUM_ERROR(OperationNotAllowed,
                  "The BNLearner does not support yet this apriori");
    }

    // do not forget to assign a weight to the apriori
    apriori__->setWeight(apriori_weight__);

    // remove the old apriori, if any
    if (old_apriori != nullptr) delete old_apriori;
  }
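createApriori__ follows a save / build / delete order: the old apriori is saved first, the replacement is constructed, and only then is the old object deleted, so a throwing constructor leaves the learner with its previous, valid apriori. A self-contained sketch of that pattern (plain C++; `Prior` and `replacePrior` are illustrative stand-ins, not aGrUM types):

```cpp
#include <stdexcept>
#include <string>

// Illustrative stand-in for an Apriori object: just a tagged type name.
struct Prior {
  std::string name;
};

// Replace *slot by a newly built prior, deleting the old one only once the
// new one exists, mirroring createApriori__'s save / build / delete order.
void replacePrior(Prior** slot, const std::string& type) {
  Prior* old_prior = *slot;  // first, save the old prior
  if (type != "no-apriori" && type != "smoothing")
    throw std::invalid_argument("unsupported apriori: " + type);
  *slot = new Prior{type};   // build the replacement (may throw)
  delete old_prior;          // only now discard the old prior
}
```

If construction fails, the slot still points at the previous prior, which is exactly why the real method delays `delete old_apriori` to the end.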

◆ createCorrectedMutualInformation__()

void gum::learning::genericBNLearner::createCorrectedMutualInformation__ ( )
protectedinherited

create the Corrected Mutual Information instance for Miic/3off2

Definition at line 666 of file genericBNLearner.cpp.

  {
    if (mutual_info__ != nullptr) delete mutual_info__;

    mutual_info__
       = new CorrectedMutualInformation<>(score_database__.parser(),
                                          *no_apriori__,
                                          ranges__,
                                          score_database__.nodeId2Columns());
    switch (kmode_3off2__) {
      case CorrectedMutualInformation<>::KModeTypes::MDL:
        mutual_info__->useMDL();
        break;

      case CorrectedMutualInformation<>::KModeTypes::NML:
        mutual_info__->useNML();
        break;

      case CorrectedMutualInformation<>::KModeTypes::NoCorr:
        mutual_info__->useNoCorr();
        break;

      default:
        GUM_ERROR(NotImplementedYet,
                  "The BNLearner's corrected mutual information class does "
                  << "not support yet penalty mode " << int(kmode_3off2__));
    }
  }

◆ createParamEstimator__()

ParamEstimator * gum::learning::genericBNLearner::createParamEstimator__ ( DBRowGeneratorParser<> &  parser,
bool  take_into_account_score = true 
)
protectedinherited

create the parameter estimator used for learning

Definition at line 572 of file genericBNLearner.cpp.

  {
    ParamEstimator<>* param_estimator = nullptr;

    // create the new estimator
    switch (param_estimator_type__) {
      case ParamEstimatorType::ML:
        if (take_into_account_score && (score__ != nullptr)) {
          param_estimator
             = new ParamEstimatorML<>(parser,
                                      *apriori__,
                                      score__->internalApriori(),
                                      ranges__,
                                      score_database__.nodeId2Columns());
        } else {
          param_estimator
             = new ParamEstimatorML<>(parser,
                                      *apriori__,
                                      *no_apriori__,
                                      ranges__,
                                      score_database__.nodeId2Columns());
        }

        break;

      default:
        GUM_ERROR(OperationNotAllowed,
                  "genericBNLearner does not support "
                  << "yet this parameter estimator");
    }

    // assign the set of ranges
    param_estimator->setRanges(ranges__);

    return param_estimator;
  }

◆ createScore__()

void gum::learning::genericBNLearner::createScore__ ( )
protectedinherited

create the score used for learning

Definition at line 514 of file genericBNLearner.cpp.

  {
    // first, save the old score, to be deleted if everything is ok
    Score<>* old_score = score__;

    // create the new scoring function
    switch (score_type__) {
      case ScoreType::AIC:
        score__ = new ScoreAIC<>(score_database__.parser(),
                                 *apriori__,
                                 ranges__,
                                 score_database__.nodeId2Columns());
        break;

      case ScoreType::BD:
        score__ = new ScoreBD<>(score_database__.parser(),
                                *apriori__,
                                ranges__,
                                score_database__.nodeId2Columns());
        break;

      case ScoreType::BDeu:
        score__ = new ScoreBDeu<>(score_database__.parser(),
                                  *apriori__,
                                  ranges__,
                                  score_database__.nodeId2Columns());
        break;

      case ScoreType::BIC:
        score__ = new ScoreBIC<>(score_database__.parser(),
                                 *apriori__,
                                 ranges__,
                                 score_database__.nodeId2Columns());
        break;

      case ScoreType::K2:
        score__ = new ScoreK2<>(score_database__.parser(),
                                *apriori__,
                                ranges__,
                                score_database__.nodeId2Columns());
        break;

      case ScoreType::LOG2LIKELIHOOD:
        score__ = new ScoreLog2Likelihood<>(score_database__.parser(),
                                            *apriori__,
                                            ranges__,
                                            score_database__.nodeId2Columns());
        break;

      default:
        GUM_ERROR(OperationNotAllowed,
                  "genericBNLearner does not support yet this score");
    }

    // remove the old score, if any
    if (old_score != nullptr) delete old_score;
  }

◆ currentTime()

double gum::learning::genericBNLearner::currentTime ( ) const
inlinevirtualinherited

get the current running time in seconds (double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1059 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->currentTime();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };
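currentTime, epsilon, history and the isEnabled* accessors all share one delegate-or-throw shape: forward the call to current_algorithm__ when a learning algorithm has been run, otherwise fail loudly. A self-contained sketch of that shape (plain C++; `Scheme` and `std::logic_error` are illustrative stand-ins for aGrUM's ApproximationScheme and FatalError):

```cpp
#include <stdexcept>

// Illustrative stand-in for ApproximationScheme: only what the sketch needs.
struct Scheme {
  double time;
  double currentTime() const { return time; }
};

// Delegate to the algorithm currently attached to the learner, or fail
// loudly when no learning algorithm has been run yet.
double currentTime(const Scheme* current_algorithm) {
  if (current_algorithm != nullptr) return current_algorithm->currentTime();
  throw std::logic_error("No chosen algorithm for learning");
}
```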

◆ database()

INLINE const DatabaseTable & gum::learning::genericBNLearner::database ( ) const
inherited

returns the database used by the BNLearner

Definition at line 557 of file genericBNLearner_inl.h.

  {
    return score_database__.databaseTable();
  }

◆ databaseRanges()

INLINE const std::vector< std::pair< std::size_t, std::size_t > > & gum::learning::genericBNLearner::databaseRanges ( ) const
inherited

returns the current database rows' ranges used for learning

Returns
The method returns a vector of pairs [Xi,Yi) of row indices in the database. Learning is performed on this set of rows.
Warning
an empty set of ranges means the whole database.

Definition at line 549 of file genericBNLearner_inl.h.

  {
    return ranges__;
  }
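The [Xi,Yi) pairs above are half-open: row Xi is included, row Yi is not, and an empty vector selects the whole database. That convention can be checked with a small standalone helper (plain C++, not part of the aGrUM API):

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// Number of database rows selected by a set of half-open [Xi, Yi) ranges.
// An empty set of ranges means the whole database (db_size rows).
std::size_t selectedRows(
   const std::vector<std::pair<std::size_t, std::size_t>>& ranges,
   std::size_t db_size) {
  if (ranges.empty()) return db_size;  // whole database
  std::size_t n = 0;
  for (const auto& r: ranges) n += r.second - r.first;  // Yi is excluded
  return n;
}
```

For instance, the ranges {[0,1000), [2000,2500)} of a 10000-row database select 1500 rows, while an empty range set selects all 10000.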

◆ databaseWeight()

INLINE double gum::learning::genericBNLearner::databaseWeight ( ) const
inherited

returns the weight of the whole database

Definition at line 172 of file genericBNLearner_inl.h.

  {
    return score_database__.weight();
  }

◆ disableEpsilon()

void gum::learning::genericBNLearner::disableEpsilon ( )
inlinevirtualinherited

Disable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 924 of file genericBNLearner.h.

  {
    K2__.approximationScheme().disableEpsilon();
    greedy_hill_climbing__.disableEpsilon();
    local_search_with_tabu_list__.disableEpsilon();
    Dag2BN__.disableEpsilon();
  };

◆ disableMaxIter()

void gum::learning::genericBNLearner::disableMaxIter ( )
inlinevirtualinherited

Disable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1014 of file genericBNLearner.h.

  {
    K2__.approximationScheme().disableMaxIter();
    greedy_hill_climbing__.disableMaxIter();
    local_search_with_tabu_list__.disableMaxIter();
    Dag2BN__.disableMaxIter();
  };

◆ disableMaxTime()

void gum::learning::genericBNLearner::disableMaxTime ( )
inlinevirtualinherited

Disable stopping criterion on timeout.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1067 of file genericBNLearner.h.

  {
    K2__.approximationScheme().disableMaxTime();
    greedy_hill_climbing__.disableMaxTime();
    local_search_with_tabu_list__.disableMaxTime();
    Dag2BN__.disableMaxTime();
  };

◆ disableMinEpsilonRate()

void gum::learning::genericBNLearner::disableMinEpsilonRate ( )
inlinevirtualinherited

Disable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 970 of file genericBNLearner.h.

  {
    K2__.approximationScheme().disableMinEpsilonRate();
    greedy_hill_climbing__.disableMinEpsilonRate();
    local_search_with_tabu_list__.disableMinEpsilonRate();
    Dag2BN__.disableMinEpsilonRate();
  };

◆ distributeProgress()

INLINE void gum::learning::genericBNLearner::distributeProgress ( const ApproximationScheme approximationScheme,
Size  pourcent,
double  error,
double  time 
)
inlineinherited

distribute signals

Definition at line 886 of file genericBNLearner.h.

  {
    setCurrentApproximationScheme(approximationScheme);

    if (onProgress.hasListener()) GUM_EMIT3(onProgress, pourcent, error, time);
  };

◆ distributeStop()

INLINE void gum::learning::genericBNLearner::distributeStop ( const ApproximationScheme approximationScheme,
std::string  message 
)
inlineinherited

distribute signals

Definition at line 896 of file genericBNLearner.h.

  {
    setCurrentApproximationScheme(approximationScheme);

    if (onStop.hasListener()) GUM_EMIT1(onStop, message);
  };
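distributeStop and distributeProgress forward the running algorithm's signals to the learner's own onStop/onProgress signalers, emitting only when someone is actually listening. A self-contained sketch of that emit-only-if-listened pattern (plain C++; `StopSignal` is an illustrative stand-in, not aGrUM's Signaler1):

```cpp
#include <functional>
#include <string>
#include <vector>

// Minimal stand-in for a Signaler1<std::string>: a list of listeners.
struct StopSignal {
  std::vector<std::function<void(const std::string&)>> listeners;
  bool hasListener() const { return !listeners.empty(); }
  void emit(const std::string& msg) {
    for (auto& l: listeners) l(msg);
  }
};

// Forward a stop message to the signal, skipping the emit when nobody listens.
void distributeStop(StopSignal& onStop, const std::string& message) {
  if (onStop.hasListener()) onStop.emit(message);
}
```

Checking hasListener() before emitting avoids formatting and dispatch work when no observer is attached, which is why the real methods guard the GUM_EMIT macros the same way.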

◆ domainSize() [1/2]

INLINE Size gum::learning::genericBNLearner::domainSize ( NodeId  var) const
inherited

returns the domain size of the variable var (must have read the db before)

Definition at line 539 of file genericBNLearner_inl.h.

  {
    return score_database__.domainSizes()[var];
  }

◆ domainSize() [2/2]

INLINE Size gum::learning::genericBNLearner::domainSize ( const std::string &  var) const
inherited

returns the domain size of the variable named var (must have read the db before)

Definition at line 543 of file genericBNLearner_inl.h.

  {
    return score_database__.domainSizes()[idFromName(var)];
  }

◆ domainSizes()

INLINE const std::vector< std::size_t > & gum::learning::genericBNLearner::domainSizes ( ) const
inherited

returns the domain sizes of the variables in the database

Definition at line 534 of file genericBNLearner_inl.h.

  {
    return score_database__.domainSizes();
  }

◆ enableEpsilon()

void gum::learning::genericBNLearner::enableEpsilon ( )
inlinevirtualinherited

Enable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 932 of file genericBNLearner.h.

  {
    K2__.approximationScheme().enableEpsilon();
    greedy_hill_climbing__.enableEpsilon();
    local_search_with_tabu_list__.enableEpsilon();
    Dag2BN__.enableEpsilon();
  };

◆ enableMaxIter()

void gum::learning::genericBNLearner::enableMaxIter ( )
inlinevirtualinherited

Enable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1021 of file genericBNLearner.h.

  {
    K2__.approximationScheme().enableMaxIter();
    greedy_hill_climbing__.enableMaxIter();
    local_search_with_tabu_list__.enableMaxIter();
    Dag2BN__.enableMaxIter();
  };

◆ enableMaxTime()

void gum::learning::genericBNLearner::enableMaxTime ( )
inlinevirtualinherited

Enable stopping criterion on timeout. If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound: if timeout <= 0.0 (timeout is a time in seconds, double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1073 of file genericBNLearner.h.

  {
    K2__.approximationScheme().enableMaxTime();
    greedy_hill_climbing__.enableMaxTime();
    local_search_with_tabu_list__.enableMaxTime();
    Dag2BN__.enableMaxTime();
  };

◆ enableMinEpsilonRate()

void gum::learning::genericBNLearner::enableMinEpsilonRate ( )
inlinevirtualinherited

Enable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 977 of file genericBNLearner.h.

  {
    K2__.approximationScheme().enableMinEpsilonRate();
    greedy_hill_climbing__.enableMinEpsilonRate();
    local_search_with_tabu_list__.enableMinEpsilonRate();
    Dag2BN__.enableMinEpsilonRate();
  };

◆ epsilon()

double gum::learning::genericBNLearner::epsilon ( ) const
inlinevirtualinherited

Get the value of epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 916 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->epsilon();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ eraseForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const Arc arc)
inherited

Definition at line 363 of file genericBNLearner_inl.h.

  {
    constraint_ForbiddenArcs__.eraseArc(arc);
  }

◆ eraseForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 374 of file genericBNLearner_inl.h.

  {
    eraseForbiddenArc(Arc(tail, head));
  }

◆ eraseForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 386 of file genericBNLearner_inl.h.

  {
    eraseForbiddenArc(Arc(idFromName(tail), idFromName(head)));
  }

◆ eraseMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const Arc arc)
inherited

Definition at line 402 of file genericBNLearner_inl.h.

  {
    constraint_MandatoryArcs__.eraseArc(arc);
  }

◆ eraseMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 425 of file genericBNLearner_inl.h.

  {
    eraseMandatoryArc(Arc(tail, head));
  }

◆ eraseMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 413 of file genericBNLearner_inl.h.

  {
    eraseMandatoryArc(Arc(idFromName(tail), idFromName(head)));
  }

◆ erasePossibleEdge() [1/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const Edge edge)
inherited

Definition at line 324 of file genericBNLearner_inl.h.

  {
    constraint_PossibleEdges__.eraseEdge(edge);
  }

◆ erasePossibleEdge() [2/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 335 of file genericBNLearner_inl.h.

  {
    erasePossibleEdge(Edge(tail, head));
  }

◆ erasePossibleEdge() [3/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 347 of file genericBNLearner_inl.h.

  {
    erasePossibleEdge(Edge(idFromName(tail), idFromName(head)));
  }

◆ G2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::G2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the G2 test in the database.

Parameters
id1: first variable
id2: second variable
knowing: list of observed variables
Returns
a std::pair<double,double>

Definition at line 973 of file genericBNLearner.cpp.

  {
    createApriori__();
    IndepTestG2<> g2score(score_database__.parser(),
                          *apriori__,
                          databaseRanges());
    return g2score.statistics(id1, id2, knowing);
  }

◆ G2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::G2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the G2 test in the database.

Parameters
name1: first variable
name2: second variable
knowing: list of observed variables
Returns
a std::pair<double,double>

Definition at line 984 of file genericBNLearner.cpp.

  {
    std::vector< NodeId > knowingIds;
    std::transform(
       knowing.begin(),
       knowing.end(),
       std::back_inserter(knowingIds),
       [this](const std::string& c) -> NodeId { return this->idFromName(c); });
    return G2(idFromName(name1), idFromName(name2), knowingIds);
  }
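This overload maps variable names to node ids with std::transform before delegating to the NodeId overload. The same mapping step can be sketched standalone (plain C++; the map-based idFromName below is an illustrative stand-in for the learner's own lookup):

```cpp
#include <algorithm>
#include <cstddef>
#include <iterator>
#include <map>
#include <string>
#include <vector>

using NodeId = std::size_t;

// Illustrative stand-in for genericBNLearner::idFromName.
NodeId idFromName(const std::map<std::string, NodeId>& cols,
                  const std::string& name) {
  return cols.at(name);  // throws std::out_of_range for unknown names
}

// Convert a list of variable names into node ids, as the G2 overload does.
std::vector<NodeId> toIds(const std::map<std::string, NodeId>& cols,
                          const std::vector<std::string>& names) {
  std::vector<NodeId> ids;
  std::transform(names.begin(), names.end(), std::back_inserter(ids),
                 [&](const std::string& n) { return idFromName(cols, n); });
  return ids;
}
```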

◆ getAprioriType__()

INLINE const std::string & gum::learning::genericBNLearner::getAprioriType__ ( ) const
protectedinherited

returns the type (as a string) of a given apriori

Definition at line 506 of file genericBNLearner_inl.h.

  {
    switch (apriori_type__) {
      case AprioriType::NO_APRIORI:
        return AprioriNoAprioriType::type;

      case AprioriType::SMOOTHING:
        return AprioriSmoothingType::type;

      case AprioriType::DIRICHLET_FROM_DATABASE:
        return AprioriDirichletType::type;

      case AprioriType::BDEU:
        return AprioriBDeuType::type;

      default:
        GUM_ERROR(OperationNotAllowed,
                  "genericBNLearner getAprioriType does "
                  "not support yet this apriori");
    }
  }

◆ hasMissingValues()

INLINE bool gum::learning::genericBNLearner::hasMissingValues ( ) const
inherited

returns true if the learner's database has missing values

Definition at line 305 of file genericBNLearner_inl.h.

  {
    return score_database__.databaseTable().hasMissingValues();
  }

◆ history()

const std::vector< double >& gum::learning::genericBNLearner::history ( ) const
inlinevirtualinherited
Exceptions
OperationNotAllowed: if the scheme was not performed or verbosity is false

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1143 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->history();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ idFromName()

INLINE NodeId gum::learning::genericBNLearner::idFromName ( const std::string &  var_name) const
inherited

returns the node id corresponding to a variable name

Exceptions
MissingVariableInDatabase: if a variable of the BN is not found in the database

Definition at line 146 of file genericBNLearner_inl.h.

  {
    return score_database__.idFromName(var_name);
  }

◆ isEnabledEpsilon()

bool gum::learning::genericBNLearner::isEnabledEpsilon ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on epsilon is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 941 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->isEnabledEpsilon();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ isEnabledMaxIter()

bool gum::learning::genericBNLearner::isEnabledMaxIter ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on max iterations is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1029 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->isEnabledMaxIter();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ isEnabledMaxTime()

bool gum::learning::genericBNLearner::isEnabledMaxTime ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on timeout is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1081 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->isEnabledMaxTime();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ isEnabledMinEpsilonRate()

bool gum::learning::genericBNLearner::isEnabledMinEpsilonRate ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on epsilon rate is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 985 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->isEnabledMinEpsilonRate();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ labelsFromBN__()

template<typename GUM_SCALAR >
NodeProperty< Sequence< std::string > > gum::learning::BNLearner< GUM_SCALAR >::labelsFromBN__ ( const std::string &  filename,
const BayesNet< GUM_SCALAR > &  src 
)
private

read the first line of a file to find column names

◆ latentVariables()

INLINE const std::vector< Arc > gum::learning::genericBNLearner::latentVariables ( ) const
inherited

get the list of arcs hiding latent variables

Exceptions
OperationNotAllowed: when 3off2 is not the selected algorithm

Definition at line 265 of file genericBNLearner_inl.h.

  {
    if (selected_algo__ != AlgoType::MIIC_THREE_OFF_TWO) {
      GUM_ERROR(OperationNotAllowed,
                "You must use the 3off2 algorithm before selecting "
                << "the latentVariables method");
    }
    return miic_3off2__.latentVariables();
  }

◆ learnBN()

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnBN ( )

learn a Bayes Net from a file (must have read the db before)

◆ learnDAG()

DAG gum::learning::genericBNLearner::learnDAG ( )
inherited

learn a structure from a file (must have read the db before)

Definition at line 658 of file genericBNLearner.cpp.

  {
    // create the score and the apriori
    createApriori__();
    createScore__();

    return learnDAG__();
  }

◆ learnDAG__()

DAG gum::learning::genericBNLearner::learnDAG__ ( )
protectedinherited

returns the DAG learnt

Definition at line 694 of file genericBNLearner.cpp.

  {
    // check that the database does not contain any missing value
    if (score_database__.databaseTable().hasMissingValues()
        || ((apriori_database__ != nullptr)
            && apriori_database__->databaseTable().hasMissingValues())) {
      GUM_ERROR(MissingValueInDatabase,
                "For the moment, the BNLearner is unable to cope "
                "with missing values in databases");
    }

    // add the mandatory arcs to the initial dag and remove the forbidden ones
    // from the initial graph
    DAG init_graph = initial_dag__;

    const ArcSet& mandatory_arcs = constraint_MandatoryArcs__.arcs();

    for (const auto& arc: mandatory_arcs) {
      if (!init_graph.exists(arc.tail())) init_graph.addNodeWithId(arc.tail());
      if (!init_graph.exists(arc.head())) init_graph.addNodeWithId(arc.head());
      init_graph.addArc(arc.tail(), arc.head());
    }

    const ArcSet& forbidden_arcs = constraint_ForbiddenArcs__.arcs();

    for (const auto& arc: forbidden_arcs) {
      init_graph.eraseArc(arc);
    }

    switch (selected_algo__) {
      // ======================================================================
      case AlgoType::MIIC_THREE_OFF_TWO: {
        BNLearnerListener listener(this, miic_3off2__);
        // create the mixedGraph and the corrected mutual information
        MixedGraph mgraph = this->prepare_miic_3off2__();

        return miic_3off2__.learnStructure(*mutual_info__, mgraph);
      }

      // ======================================================================
      case AlgoType::GREEDY_HILL_CLIMBING: {
        BNLearnerListener listener(this, greedy_hill_climbing__);
        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                       StructuralConstraintForbiddenArcs,
                                       StructuralConstraintPossibleEdges,
                                       StructuralConstraintSliceOrder >
           gen_constraint;
        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
           = constraint_MandatoryArcs__;
        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
           = constraint_ForbiddenArcs__;
        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
           = constraint_PossibleEdges__;
        static_cast< StructuralConstraintSliceOrder& >(gen_constraint)
           = constraint_SliceOrder__;

        GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(
           gen_constraint);

        StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                       StructuralConstraintDAG >
           sel_constraint;
        static_cast< StructuralConstraintIndegree& >(sel_constraint)
           = constraint_Indegree__;

        GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                      decltype(op_set) >
           selector(*score__, sel_constraint, op_set);

        return greedy_hill_climbing__.learnStructure(selector, init_graph);
      }

      // ======================================================================
      case AlgoType::LOCAL_SEARCH_WITH_TABU_LIST: {
        BNLearnerListener listener(this, local_search_with_tabu_list__);
        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                       StructuralConstraintForbiddenArcs,
                                       StructuralConstraintPossibleEdges,
                                       StructuralConstraintSliceOrder >
           gen_constraint;
        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
           = constraint_MandatoryArcs__;
        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
           = constraint_ForbiddenArcs__;
        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
           = constraint_PossibleEdges__;
        static_cast< StructuralConstraintSliceOrder& >(gen_constraint)
           = constraint_SliceOrder__;

        GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(
           gen_constraint);

        StructuralConstraintSetStatic< StructuralConstraintTabuList,
                                       StructuralConstraintIndegree,
                                       StructuralConstraintDAG >
           sel_constraint;
        static_cast< StructuralConstraintTabuList& >(sel_constraint)
           = constraint_TabuList__;
        static_cast< StructuralConstraintIndegree& >(sel_constraint)
           = constraint_Indegree__;

        GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                      decltype(op_set) >
           selector(*score__, sel_constraint, op_set);

        return local_search_with_tabu_list__.learnStructure(selector,
                                                            init_graph);
      }

      // ======================================================================
      case AlgoType::K2: {
        BNLearnerListener listener(this, K2__.approximationScheme());
        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
                                       StructuralConstraintForbiddenArcs,
                                       StructuralConstraintPossibleEdges >
           gen_constraint;
        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
           = constraint_MandatoryArcs__;
        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
           = constraint_ForbiddenArcs__;
        static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
           = constraint_PossibleEdges__;

        GraphChangesGenerator4K2< decltype(gen_constraint) > op_set(
           gen_constraint);

        // if some mandatory arcs are incompatible with the order, use a DAG
        // constraint instead of a DiGraph constraint to avoid cycles
        const ArcSet& mandatory_arcs
           = static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
                .arcs();
        const Sequence< NodeId >& order = K2__.order();
        bool order_compatible = true;

        for (const auto& arc: mandatory_arcs) {
          if (order.pos(arc.tail()) >= order.pos(arc.head())) {
            order_compatible = false;
            break;
          }
        }

        if (order_compatible) {
          StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                         StructuralConstraintDiGraph >
             sel_constraint;
          static_cast< StructuralConstraintIndegree& >(sel_constraint)
             = constraint_Indegree__;

          GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                        decltype(op_set) >
             selector(*score__, sel_constraint, op_set);

          return K2__.learnStructure(selector, init_graph);
        } else {
          StructuralConstraintSetStatic< StructuralConstraintIndegree,
                                         StructuralConstraintDAG >
             sel_constraint;
          static_cast< StructuralConstraintIndegree& >(sel_constraint)
             = constraint_Indegree__;

          GraphChangesSelector4DiGraph< decltype(sel_constraint),
                                        decltype(op_set) >
             selector(*score__, sel_constraint, op_set);

          return K2__.learnStructure(selector, init_graph);
        }
      }

      // ======================================================================
      default:
        GUM_ERROR(OperationNotAllowed,
                  "the learnDAG method has not been implemented for this "
                  "learning algorithm");
    }
  }
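The K2 branch of learnDAG__ checks whether the mandatory arcs are compatible with the K2 node order (every tail strictly before its head) and falls back to a cycle-proof DAG constraint when they are not. A minimal standalone sketch of that check, with illustrative names:

```cpp
#include <algorithm>
#include <cassert>
#include <utility>
#include <vector>

// Returns true when every mandatory arc (tail, head) is compatible with the
// K2 order, i.e. the tail appears strictly before the head. learnDAG__ uses
// this test to decide between a DiGraph selection constraint and a DAG one.
bool orderCompatible(const std::vector<int>& order,
                     const std::vector<std::pair<int, int>>& mandatory_arcs) {
    auto pos = [&](int node) {
        return std::find(order.begin(), order.end(), node) - order.begin();
    };
    for (const auto& arc : mandatory_arcs) {
        if (pos(arc.first) >= pos(arc.second)) return false;  // head not after tail
    }
    return true;
}
```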

◆ learnMixedStructure()

MixedGraph gum::learning::genericBNLearner::learnMixedStructure ( )
inherited

learn a partial structure from a file (must have read the db before and must have selected miic or 3off2)

Definition at line 640 of file genericBNLearner.cpp.

  {
    if (selected_algo__ != AlgoType::MIIC_THREE_OFF_TWO) {
      GUM_ERROR(OperationNotAllowed, "Must be using the miic/3off2 algorithm");
    }
    // check that the database does not contain any missing value
    if (score_database__.databaseTable().hasMissingValues()) {
      GUM_ERROR(MissingValueInDatabase,
                "For the moment, the BNLearner is unable to learn "
                << "structures with missing values in databases");
    }
    BNLearnerListener listener(this, miic_3off2__);

    // create the mixedGraph
    MixedGraph mgraph = this->prepare_miic_3off2__();

    return miic_3off2__.learnMixedStructure(*mutual_info__, mgraph);
  }

◆ learnParameters() [1/2]

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnParameters ( const DAG &  dag,
bool  take_into_account_score = true 
)

learns a BN (its parameters) when its structure is known

Parameters
dag: the structure of the Bayesian network
take_into_account_score: The DAG passed as argument may have been learnt through structure learning. In that case, if the score used to learn the structure has an implicit apriori (such as K2, which has a 1-smoothing apriori), it is important to take this implicit apriori into account for parameter learning as well. By default, if a score exists, the parameters are learnt by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is used.

◆ learnParameters() [2/2]

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnParameters ( bool  take_into_account_score = true)
Parameters
take_into_account_score: The DAG of the BN that was passed to the BNLearner may have been learnt through structure learning. In that case, if the score used to learn the structure has an implicit apriori (such as K2, which has a 1-smoothing apriori), it is important to take this implicit apriori into account for parameter learning as well. By default, if a score exists, the parameters are learnt by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is used.
Exceptions
MissingVariableInDatabase: if a variable of the BN is not found in the database.
UnknownLabelInDatabase: if a label that does not correspond to the variable is found in the database.
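The implicit apriori mentioned above behaves like additive smoothing: with a 1-smoothing (Laplace) apriori, the estimate of a conditional probability becomes (N(x, pa) + 1) / (N(pa) + r), where r is the child's domain size. A small sketch of that computation, not aGrUM code:

```cpp
#include <cassert>
#include <cstddef>
#include <vector>

// Laplace-smoothed parameter estimate for one conditional distribution:
// theta[x] = (counts[x] + weight) / (sum(counts) + weight * r),
// where r is the number of values of the child variable. With weight = 1
// this matches the implicit 1-smoothing apriori of scores such as K2.
std::vector<double> smoothedParameters(const std::vector<double>& counts,
                                       double weight) {
    double total = 0.0;
    for (double c : counts) total += c;
    const double denom = total + weight * static_cast<double>(counts.size());
    std::vector<double> theta;
    for (double c : counts) theta.push_back((c + weight) / denom);
    return theta;
}
```

Note how the smoothing keeps every parameter strictly positive even for value/parent configurations never seen in the database.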

◆ logLikelihood() [1/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< NodeId > &  vars,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Returns the log-likelihood of vars in the database, conditioned on knowing, for the BNLearner.

Parameters
vars: a vector of NodeIds
knowing: an optional vector of conditioning NodeIds
Returns
the log-likelihood (a double)

Definition at line 996 of file genericBNLearner.cpp.

  {
    createApriori__();
    ScoreLog2Likelihood<> ll2score(score_database__.parser(),
                                   *apriori__,
                                   databaseRanges());

    std::vector< NodeId > total(vars);
    total.insert(total.end(), knowing.begin(), knowing.end());
    double LLtotal = ll2score.score(IdCondSet<>(total, false, true));
    if (knowing.size() == (Size)0) {
      return LLtotal;
    } else {
      double LLknw = ll2score.score(IdCondSet<>(knowing, false, true));
      return LLtotal - LLknw;
    }
  }
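The listing above implements the identity LL(vars | knowing) = LL(vars and knowing) - LL(knowing). A toy verification on raw counts, using the empirical log2-likelihood rather than aGrUM's scoring classes:

```cpp
#include <cassert>
#include <cmath>
#include <map>
#include <vector>

// Log2-likelihood of a sample under its empirical distribution:
// LL = sum_x N(x) * log2(N(x) / N). The conditional log-likelihood used by
// logLikelihood() is then LL(joint sample) - LL(conditioning sample).
double log2Likelihood(const std::vector<int>& sample) {
    std::map<int, int> counts;
    for (int x : sample) ++counts[x];
    const double n = static_cast<double>(sample.size());
    double ll = 0.0;
    for (const auto& kv : counts) ll += kv.second * std::log2(kv.second / n);
    return ll;
}
```

Here each joint configuration is encoded as a single integer (e.g. tens digit = first variable, units digit = second), which is enough to exercise the decomposition.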

◆ logLikelihood() [2/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< std::string > &  vars,
const std::vector< std::string > &  knowing = {} 
)
inherited

Returns the log-likelihood of vars in the database, conditioned on knowing, for the BNLearner.

Parameters
vars: a vector of variable names
knowing: an optional vector of conditioning variable names
Returns
the log-likelihood (a double)

Definition at line 1015 of file genericBNLearner.cpp.

  {
    std::vector< NodeId > ids;
    std::vector< NodeId > knowingIds;

    auto mapper = [this](const std::string& c) -> NodeId {
      return this->idFromName(c);
    };

    std::transform(vars.begin(), vars.end(), std::back_inserter(ids), mapper);
    std::transform(knowing.begin(),
                   knowing.end(),
                   std::back_inserter(knowingIds),
                   mapper);

    return logLikelihood(ids, knowingIds);
  }

◆ maxIter()

Size gum::learning::genericBNLearner::maxIter ( ) const
inline virtual inherited
Returns
the criterion on number of iterations

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1006 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->maxIter();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ maxTime()

double gum::learning::genericBNLearner::maxTime ( ) const
inline virtual inherited

returns the timeout (in seconds)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1051 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->maxTime();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ messageApproximationScheme()

INLINE std::string gum::IApproximationSchemeConfiguration::messageApproximationScheme ( ) const
inherited

Returns the approximation scheme message.

Returns
Returns the approximation scheme message.

Definition at line 39 of file IApproximationSchemeConfiguration_inl.h.

References gum::Set< Key, Alloc >::emplace().

  {
    std::stringstream s;

    switch (stateApproximationScheme()) {
      case ApproximationSchemeSTATE::Continue: s << "in progress"; break;

      case ApproximationSchemeSTATE::Epsilon:
        s << "stopped with epsilon=" << epsilon();
        break;

      case ApproximationSchemeSTATE::Rate:
        s << "stopped with rate=" << minEpsilonRate();
        break;

      case ApproximationSchemeSTATE::Limit:
        s << "stopped with max iteration=" << maxIter();
        break;

      case ApproximationSchemeSTATE::TimeLimit:
        s << "stopped with timeout=" << maxTime();
        break;

      case ApproximationSchemeSTATE::Stopped: s << "stopped on request"; break;

      case ApproximationSchemeSTATE::Undefined: s << "undefined state"; break;
    };

    return s.str();
  }

◆ minEpsilonRate()

double gum::learning::genericBNLearner::minEpsilonRate ( ) const
inline virtual inherited

Get the value of the minimal epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 962 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->minEpsilonRate();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ nameFromId()

INLINE const std::string & gum::learning::genericBNLearner::nameFromId ( NodeId  id) const
inherited

returns the variable name corresponding to a given node id

Definition at line 151 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    return score_database__.nameFromId(id);
  }

◆ names()

INLINE const std::vector< std::string > & gum::learning::genericBNLearner::names ( ) const
inherited

returns the names of the variables in the database

Definition at line 528 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    return score_database__.names();
  }

◆ nbCols()

INLINE Size gum::learning::genericBNLearner::nbCols ( ) const
inherited
Returns
the number of cols in the database

Definition at line 561 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    return score_database__.domainSizes().size();
  }

◆ nbrIterations()

Size gum::learning::genericBNLearner::nbrIterations ( ) const
inline virtual inherited
Exceptions
OperationNotAllowed: if the scheme has not been performed

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1135 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->nbrIterations();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ nbRows()

INLINE Size gum::learning::genericBNLearner::nbRows ( ) const
inherited
Returns
the number of rows in the database

Definition at line 565 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    return score_database__.databaseTable().size();
  }

◆ operator=() [1/2]

template<typename GUM_SCALAR >
BNLearner& gum::learning::BNLearner< GUM_SCALAR >::operator= ( const BNLearner< GUM_SCALAR > &  )

copy operator

◆ operator=() [2/2]

template<typename GUM_SCALAR >
BNLearner& gum::learning::BNLearner< GUM_SCALAR >::operator= ( BNLearner< GUM_SCALAR > &&  )

move operator

◆ periodSize()

Size gum::learning::genericBNLearner::periodSize ( ) const
inline virtual inherited

how many samples between two stopping tests

Exceptions
OutOfLowerBound: if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1099 of file genericBNLearner.h.

  {
    if (current_algorithm__ != nullptr)
      return current_algorithm__->periodSize();
    else
      GUM_ERROR(FatalError, "No chosen algorithm for learning");
  };

◆ prepare_miic_3off2__()

MixedGraph gum::learning::genericBNLearner::prepare_miic_3off2__ ( )
protected inherited

prepares the initial graph for 3off2 or miic

Definition at line 610 of file genericBNLearner.cpp.

  {
    // Initialize the mixed graph to the fully connected graph
    MixedGraph mgraph;
    for (Size i = 0; i < score_database__.databaseTable().nbVariables(); ++i) {
      mgraph.addNodeWithId(i);
      for (Size j = 0; j < i; ++j) {
        mgraph.addEdge(j, i);
      }
    }

    // translating the constraints for 3off2 or miic
    HashTable< std::pair< NodeId, NodeId >, char > initial_marks;
    const ArcSet& mandatory_arcs = constraint_MandatoryArcs__.arcs();
    for (const auto& arc: mandatory_arcs) {
      initial_marks.insert({arc.tail(), arc.head()}, '>');
    }

    const ArcSet& forbidden_arcs = constraint_ForbiddenArcs__.arcs();
    for (const auto& arc: forbidden_arcs) {
      initial_marks.insert({arc.tail(), arc.head()}, '-');
    }
    miic_3off2__.addConstraints(initial_marks);

    // create the mutual entropy object
    // if (mutual_info__ == nullptr) { this->useNML(); }
    createCorrectedMutualInformation__();

    return mgraph;
  }
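prepare_miic_3off2__ thus starts MIIC/3off2 from the complete skeleton on n nodes (n(n-1)/2 edges) and translates the arc constraints into orientation marks ('>' for mandatory, '-' for forbidden). A standalone sketch of those two steps, with illustrative names:

```cpp
#include <cassert>
#include <map>
#include <set>
#include <utility>

// Build the complete (undirected) skeleton on n nodes, one edge per
// unordered pair, as prepare_miic_3off2__ does before the search starts.
std::set<std::pair<int, int>> completeSkeleton(int n) {
    std::set<std::pair<int, int>> edges;
    for (int i = 0; i < n; ++i)
        for (int j = 0; j < i; ++j)
            edges.insert({j, i});
    return edges;
}

// Translate arc constraints into MIIC orientation marks: '>' forces the
// orientation of a mandatory arc, '-' forbids one (same convention as the
// listing above).
std::map<std::pair<int, int>, char>
constraintMarks(const std::set<std::pair<int, int>>& mandatory,
                const std::set<std::pair<int, int>>& forbidden) {
    std::map<std::pair<int, int>, char> marks;
    for (const auto& a : mandatory) marks[a] = '>';
    for (const auto& a : forbidden) marks[a] = '-';
    return marks;
}
```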

◆ rawPseudoCount() [1/2]

std::vector< double > gum::learning::genericBNLearner::rawPseudoCount ( const std::vector< NodeId > &  vars)
inherited

Returns the pseudo-counts of the NodeIds vars in the database, as a raw array.

Parameters
vars: a vector of NodeIds
Returns
a std::vector<double> containing the contingency table

Definition at line 1034 of file genericBNLearner.cpp.

  {
    Potential< double > res;

    createApriori__();
    PseudoCount<> count(score_database__.parser(),
                        *apriori__,
                        databaseRanges());
    return count.get(vars);
  }

◆ rawPseudoCount() [2/2]

std::vector< double > gum::learning::genericBNLearner::rawPseudoCount ( const std::vector< std::string > &  vars)
inherited

Returns the pseudo-counts of vars in the database, as a raw array.

Parameters
vars: a vector of variable names
Returns
a std::vector<double> containing the contingency table

Definition at line 1046 of file genericBNLearner.cpp.

  {
    std::vector< NodeId > ids;

    auto mapper = [this](const std::string& c) -> NodeId {
      return this->idFromName(c);
    };

    std::transform(vars.begin(), vars.end(), std::back_inserter(ids), mapper);

    return rawPseudoCount(ids);
  }

◆ readFile__()

DatabaseTable gum::learning::genericBNLearner::readFile__ ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols 
)
static protected inherited

reads a file and returns a databaseVectInRam

Definition at line 438 of file genericBNLearner.cpp.

  {
    // get the extension of the file
    checkFileName__(filename);

    DBInitializerFromCSV<> initializer(filename);

    const auto& var_names = initializer.variableNames();
    const std::size_t nb_vars = var_names.size();

    DBTranslatorSet<> translator_set;
    DBTranslator4LabelizedVariable<> translator(missing_symbols);
    for (std::size_t i = 0; i < nb_vars; ++i) {
      translator_set.insertTranslator(translator, i);
    }

    DatabaseTable<> database(missing_symbols, translator_set);
    database.setVariableNames(initializer.variableNames());
    initializer.fillDatabase(database);

    database.reorder();

    return database;
  }

◆ recordWeight()

INLINE double gum::learning::genericBNLearner::recordWeight ( const std::size_t  i) const
inherited

returns the weight of the ith record

Exceptions
OutOfBounds: if i is outside the set of indices of the records

Definition at line 167 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    return score_database__.weight(i);
  }

◆ setAprioriWeight__()

INLINE void gum::learning::genericBNLearner::setAprioriWeight__ ( double  weight)
inherited

sets the apriori weight

Definition at line 450 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    if (weight < 0) {
      GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
    }

    apriori_weight__ = weight;
    checkScoreAprioriCompatibility();
  }

◆ setCurrentApproximationScheme()

INLINE void gum::learning::genericBNLearner::setCurrentApproximationScheme ( const ApproximationScheme approximationScheme)
inlineinherited

distribute signals

Definition at line 880 of file genericBNLearner.h.

  {
    current_algorithm__ = approximationScheme;
  }

◆ setDatabaseWeight()

INLINE void gum::learning::genericBNLearner::setDatabaseWeight ( const double  new_weight)
inherited

assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight

assign new weight to the rows of the learning database

Definition at line 156 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    score_database__.setDatabaseWeight(new_weight);
  }

◆ setEpsilon()

void gum::learning::genericBNLearner::setEpsilon ( double  eps)
inline virtual inherited

Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBoundif eps<0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 908 of file genericBNLearner.h.

  {
    K2__.approximationScheme().setEpsilon(eps);
    greedy_hill_climbing__.setEpsilon(eps);
    local_search_with_tabu_list__.setEpsilon(eps);
    Dag2BN__.setEpsilon(eps);
  };
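setEpsilon forwards the same stopping rule, |f(t+1) - f(t)| < eps, to every embedded approximation scheme (K2, greedy hill climbing, local search, parametric EM). The rule itself can be sketched over a trace of score values; the helper below is illustrative, not part of aGrUM:

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Returns the first iteration t at which |f(t+1) - f(t)| < eps, or -1 if the
// trace never stabilises. This is the criterion that setEpsilon() configures
// on each learning algorithm's approximation scheme.
int firstStableIteration(const std::vector<double>& trace, double eps) {
    for (std::size_t t = 0; t + 1 < trace.size(); ++t)
        if (std::fabs(trace[t + 1] - trace[t]) < eps)
            return static_cast<int>(t);
    return -1;
}
```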

◆ setForbiddenArcs()

INLINE void gum::learning::genericBNLearner::setForbiddenArcs ( const ArcSet set)
inherited

assign a set of forbidden arcs

Definition at line 353 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    constraint_ForbiddenArcs__.setArcs(set);
  }

◆ setInitialDAG()

INLINE void gum::learning::genericBNLearner::setInitialDAG ( const DAG dag)
inherited

sets an initial DAG structure

Definition at line 177 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    initial_dag__ = dag;
  }

◆ setMandatoryArcs()

INLINE void gum::learning::genericBNLearner::setMandatoryArcs ( const ArcSet set)
inherited

assign a set of mandatory arcs

Definition at line 392 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    constraint_MandatoryArcs__.setArcs(set);
  }

◆ setMaxIndegree()

INLINE void gum::learning::genericBNLearner::setMaxIndegree ( Size  max_indegree)
inherited

sets the max indegree

Definition at line 218 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::Database::Database().

  {
    constraint_Indegree__.setMaxIndegree(max_indegree, true);
  }

◆ setMaxIter()

void gum::learning::genericBNLearner::setMaxIter ( Size  max)
inline virtual inherited

stopping criterion on the number of iterations. If the criterion was disabled, it will be enabled.

Parameters
max: the maximum number of iterations
Exceptions
OutOfLowerBound: if max <= 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 998 of file genericBNLearner.h.

  {
    K2__.approximationScheme().setMaxIter(max);
    greedy_hill_climbing__.setMaxIter(max);
    local_search_with_tabu_list__.setMaxIter(max);
    Dag2BN__.setMaxIter(max);
  };

◆ setMaxTime()

void gum::learning::genericBNLearner::setMaxTime ( double  timeout)
inlinevirtualinherited

Stopping criterion on timeout. If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound  if timeout <= 0.0. The timeout is expressed in seconds (double).

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1043 of file genericBNLearner.h.

{
  K2__.approximationScheme().setMaxTime(timeout);
  greedy_hill_climbing__.setMaxTime(timeout);
  local_search_with_tabu_list__.setMaxTime(timeout);
  Dag2BN__.setMaxTime(timeout);
}

◆ setMinEpsilonRate()

void gum::learning::genericBNLearner::setMinEpsilonRate ( double  rate)
inlinevirtualinherited

Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound  if rate < 0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 954 of file genericBNLearner.h.

{
  K2__.approximationScheme().setMinEpsilonRate(rate);
  greedy_hill_climbing__.setMinEpsilonRate(rate);
  local_search_with_tabu_list__.setMinEpsilonRate(rate);
  Dag2BN__.setMinEpsilonRate(rate);
};

◆ setPeriodSize()

void gum::learning::genericBNLearner::setPeriodSize ( Size  p)
inlinevirtualinherited

how many samples between two checks of the stopping criteria

Exceptions
OutOfLowerBound  if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1092 of file genericBNLearner.h.

{
  K2__.approximationScheme().setPeriodSize(p);
  greedy_hill_climbing__.setPeriodSize(p);
  local_search_with_tabu_list__.setPeriodSize(p);
  Dag2BN__.setPeriodSize(p);
};

◆ setPossibleEdges()

INLINE void gum::learning::genericBNLearner::setPossibleEdges ( const EdgeSet &  set)
inherited

assign the set of possible edges

Warning
Once at least one possible edge has been defined, all edges not in this set become forbidden

Definition at line 310 of file genericBNLearner_inl.h.

{
  constraint_PossibleEdges__.setEdges(set);
}

◆ setPossibleSkeleton()

INLINE void gum::learning::genericBNLearner::setPossibleSkeleton ( const UndiGraph &  skeleton)
inherited

assign a possible skeleton: only the edges of the given undirected graph remain possible

Warning
Once at least one possible edge has been defined, all edges not in this set become forbidden

Definition at line 314 of file genericBNLearner_inl.h.

{
  setPossibleEdges(g.edges());
}

◆ setRecordWeight()

INLINE void gum::learning::genericBNLearner::setRecordWeight ( const std::size_t  i,
const double  weight 
)
inherited

sets the weight of the ith record of the database

assign new weight to the ith row of the learning database

Exceptions
OutOfBounds  if i is outside the set of indices of the records or if the weight is negative

Definition at line 161 of file genericBNLearner_inl.h.

{
  score_database__.setWeight(i, new_weight);
}
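Record weights enter the learning through the sufficient statistics: a row with weight w contributes w times to every count it matches, exactly as if it had been duplicated w times. A minimal self-contained sketch of this counting behaviour (illustration only, not the aGrUM API):

```cpp
#include <cstddef>
#include <vector>

// Weighted counting: each record contributes its weight to the count
// of the value it carries (fractional weights are allowed).
double weightedCount(const std::vector<int>&    values,
                     const std::vector<double>& weights,
                     int                        target) {
  double count = 0.0;
  for (std::size_t i = 0; i < values.size(); ++i)
    if (values[i] == target) count += weights[i];
  return count;
}
```

With values {1, 1, 0, 1} and weights {1.0, 0.5, 1.0, 2.0}, the weighted count of value 1 is 3.5 instead of the unweighted 3.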

◆ setSliceOrder() [1/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const NodeProperty< NodeId > &  slice_order)
inherited

sets a partial order on the nodes

Parameters
slice_order  a NodeProperty giving the rank (priority) of the nodes in the partial order

Definition at line 432 of file genericBNLearner_inl.h.

{
  constraint_SliceOrder__ = StructuralConstraintSliceOrder(slice_order);
}

◆ setSliceOrder() [2/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const std::vector< std::vector< std::string > > &  slices)
inherited

sets a partial order on the nodes

Parameters
slices  the list of lists of variable names

Definition at line 436 of file genericBNLearner_inl.h.

{
  NodeProperty< NodeId > slice_order;
  NodeId                 rank = 0;
  for (const auto& slice: slices) {
    for (const auto& name: slice) {
      slice_order.insert(idFromName(name), rank);
    }
    rank++;
  }
  setSliceOrder(slice_order);
}
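The vector-of-slices overload simply assigns rank 0 to every variable of the first inner vector, rank 1 to the second, and so on, before delegating to the NodeProperty overload. A self-contained sketch of that rank assignment (illustration only; a plain std::map stands in for the NodeProperty):

```cpp
#include <map>
#include <string>
#include <vector>

// Assign to each variable name the index of the slice containing it,
// mirroring the loop in the vector-of-slices setSliceOrder overload.
std::map<std::string, int>
sliceRanks(const std::vector<std::vector<std::string>>& slices) {
  std::map<std::string, int> order;
  int rank = 0;
  for (const auto& slice: slices) {
    for (const auto& name: slice) order[name] = rank;
    ++rank;
  }
  return order;
}
```

For instance, sliceRanks({{"A", "B"}, {"C"}}) maps A and B to rank 0 and C to rank 1; the resulting partial order then constrains arc directions between slices.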

◆ setVerbosity()

void gum::learning::genericBNLearner::setVerbosity ( bool  v)
inlinevirtualinherited

sets the verbosity on (true) or off (false)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1109 of file genericBNLearner.h.

{
  K2__.approximationScheme().setVerbosity(v);
  greedy_hill_climbing__.setVerbosity(v);
  local_search_with_tabu_list__.setVerbosity(v);
  Dag2BN__.setVerbosity(v);
};

◆ stateApproximationScheme()

ApproximationSchemeSTATE gum::learning::genericBNLearner::stateApproximationScheme ( ) const
inlinevirtualinherited

returns the approximation scheme state

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1127 of file genericBNLearner.h.

{
  if (current_algorithm__ != nullptr)
    return current_algorithm__->stateApproximationScheme();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

◆ use3off2()

INLINE void gum::learning::genericBNLearner::use3off2 ( )
inherited

indicate that we wish to use 3off2

Definition at line 223 of file genericBNLearner_inl.h.

{
  selected_algo__ = AlgoType::MIIC_THREE_OFF_TWO;
  miic_3off2__.set3off2Behaviour();
}

◆ useAprioriBDeu()

INLINE void gum::learning::genericBNLearner::useAprioriBDeu ( double  weight = 1)
inherited

use the BDeu apriori

The BDeu apriori adds weight to all the cells of the counting tables. In other words, it is equivalent to adding weight rows with equally probable values to the database.

Definition at line 493 of file genericBNLearner_inl.h.

{
  if (weight < 0) {
    GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
  }

  apriori_type__ = AprioriType::BDEU;
  setAprioriWeight__(weight);

  checkScoreAprioriCompatibility();
}

◆ useAprioriDirichlet()

INLINE void gum::learning::genericBNLearner::useAprioriDirichlet ( const std::string &  filename,
double  weight = 1 
)
inherited

use the Dirichlet apriori

Definition at line 478 of file genericBNLearner_inl.h.

{
  if (weight < 0) {
    GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
  }

  apriori_dbname__ = filename;
  apriori_type__   = AprioriType::DIRICHLET_FROM_DATABASE;
  setAprioriWeight__(weight);

  checkScoreAprioriCompatibility();
}

◆ useAprioriSmoothing()

INLINE void gum::learning::genericBNLearner::useAprioriSmoothing ( double  weight = 1)
inherited

use the apriori smoothing

Parameters
weight  pass a weight in argument if you wish to assign a weight to the smoothing; otherwise the current weight of the genericBNLearner will be used.

Definition at line 466 of file genericBNLearner_inl.h.

{
  if (weight < 0) {
    GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
  }

  apriori_type__ = AprioriType::SMOOTHING;
  setAprioriWeight__(weight);

  checkScoreAprioriCompatibility();
}
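As described above, the smoothing apriori adds its weight to every cell of the counting tables before scores or parameters are computed. A minimal sketch of the effect (illustration only, not the aGrUM API):

```cpp
#include <vector>

// Add a pseudo-count `weight` to every cell of a counting table, so that
// unseen configurations keep non-zero probability mass.
std::vector<double> smoothedCounts(const std::vector<double>& counts,
                                   double                     weight) {
  std::vector<double> out(counts);
  for (double& c: out) c += weight;
  return out;
}
```

With counts {4, 0} and weight 1, the smoothed counts are {5, 1}: the maximum-likelihood estimate 0 for the unseen value becomes 1/6 instead.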

◆ useCrossValidationFold()

std::pair< std::size_t, std::size_t > gum::learning::genericBNLearner::useCrossValidationFold ( const std::size_t  learning_fold,
const std::size_t  k_fold 
)
inherited

sets the ranges of rows to be used for cross-validation learning

When applied on (x,k), the method indicates to the subsequent learnings that they should be performed on the xth fold in a k-fold cross-validation context. For instance, if a database has 1000 rows and we perform a 10-fold cross-validation, the first learning fold (learning_fold=0) corresponds to the row interval [100,1000) and the test dataset corresponds to [0,100). The second learning fold (learning_fold=1) is [0,100) U [200,1000) and the corresponding test dataset is [100,200).

Parameters
learning_fold  a number indicating the set of rows used for learning. If N denotes the size of the database and k_fold the number of folds in the cross-validation, then the set of rows used for testing is [learning_fold * N / k_fold, (learning_fold+1) * N / k_fold) and the learning database is its complement in the database.
k_fold  the value of "k" in k-fold cross-validation
Returns
a pair [x,y) of rows' indices that corresponds to the indices of rows in the original database that constitute the test dataset
Exceptions
OutOfBounds  is raised if k_fold is equal to 0, if learning_fold is greater than or equal to k_fold, or if k_fold is greater than or equal to the size of the database.

Definition at line 902 of file genericBNLearner.cpp.

{
  if (k_fold == 0) {
    GUM_ERROR(OutOfBounds, "K-fold cross validation with k=0 is forbidden");
  }

  if (learning_fold >= k_fold) {
    GUM_ERROR(OutOfBounds,
              "In " << k_fold << "-fold cross validation, the learning "
                    << "fold should be strictly lower than " << k_fold
                    << " but, here, it is equal to " << learning_fold);
  }

  const std::size_t db_size = score_database__.databaseTable().nbRows();
  if (k_fold >= db_size) {
    GUM_ERROR(OutOfBounds,
              "In " << k_fold << "-fold cross validation, the database's "
                    << "size should be strictly greater than " << k_fold
                    << " but, here, the database has only " << db_size
                    << " rows");
  }

  // create the ranges of rows of the test database
  const std::size_t foldSize   = db_size / k_fold;
  const std::size_t unfold_deb = learning_fold * foldSize;
  const std::size_t unfold_end = unfold_deb + foldSize;

  ranges__.clear();
  if (learning_fold == std::size_t(0)) {
    ranges__.push_back(
       std::pair< std::size_t, std::size_t >(unfold_end, db_size));
  } else {
    ranges__.push_back(
       std::pair< std::size_t, std::size_t >(std::size_t(0), unfold_deb));

    if (learning_fold != k_fold - 1) {
      ranges__.push_back(
         std::pair< std::size_t, std::size_t >(unfold_end, db_size));
    }
  }

  return std::pair< std::size_t, std::size_t >(unfold_deb, unfold_end);
}
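The fold arithmetic described above is easy to check in isolation. The sketch below (illustration only, not the aGrUM API; it ignores the corner case where the database size is not a multiple of k) reproduces the computation of the test interval and of the complementary learning ranges for a given fold:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

using Range = std::pair<std::size_t, std::size_t>;

// [begin, end) interval of rows used for testing in fold `learning_fold`
// of a k-fold cross-validation over `db_size` rows.
Range testInterval(std::size_t learning_fold, std::size_t k_fold,
                   std::size_t db_size) {
  const std::size_t fold_size = db_size / k_fold;
  const std::size_t deb       = learning_fold * fold_size;
  return {deb, deb + fold_size};
}

// The learning ranges are the complement of the test interval.
std::vector<Range> learningRanges(std::size_t learning_fold,
                                  std::size_t k_fold, std::size_t db_size) {
  const Range        test = testInterval(learning_fold, k_fold, db_size);
  std::vector<Range> ranges;
  if (test.first > 0) ranges.push_back({0, test.first});
  if (test.second < db_size) ranges.push_back({test.second, db_size});
  return ranges;
}
```

For a 1000-row database and 10 folds, fold 1 is tested on [100, 200) and learned on [0, 100) U [200, 1000), matching the example in the documentation above.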

◆ useDatabaseRanges()

template<template< typename > class XALLOC>
void gum::learning::genericBNLearner::useDatabaseRanges ( const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &  new_ranges)
inherited

use a new set of database rows' ranges to perform learning

Parameters
new_ranges  a set of pairs {(X1,Y1),...,(Xn,Yn)} of database row indices. The subsequent learnings are then performed only on the union of the rows [Xi,Yi), i in {1,...,n}. This is useful, e.g., when performing cross-validation tasks, in which part of the database should be ignored. An empty set of ranges is equivalent to an interval [X,Y) ranging over the whole database.

Definition at line 101 of file genericBNLearner_tpl.h.

{
  // use a score to detect whether the ranges are ok
  ScoreLog2Likelihood<> score(score_database__.parser(), *no_apriori__);
  score.setRanges(new_ranges);
  ranges__ = score.ranges();
}

◆ useEM()

INLINE void gum::learning::genericBNLearner::useEM ( const double  epsilon)
inherited

use the EM algorithm to learn parameters

if epsilon=0, EM is not used

Definition at line 300 of file genericBNLearner_inl.h.

{
  EMepsilon__ = epsilon;
}

◆ useGreedyHillClimbing()

INLINE void gum::learning::genericBNLearner::useGreedyHillClimbing ( )
inherited

indicate that we wish to use a greedy hill climbing algorithm

Definition at line 287 of file genericBNLearner_inl.h.

{
  selected_algo__ = AlgoType::GREEDY_HILL_CLIMBING;
}

◆ useK2() [1/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const Sequence< NodeId > &  order)
inherited

indicate that we wish to use K2

Definition at line 275 of file genericBNLearner_inl.h.

{
  selected_algo__ = AlgoType::K2;
  K2__.setOrder(order);
}

◆ useK2() [2/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const std::vector< NodeId > &  order)
inherited

indicate that we wish to use K2

Definition at line 281 of file genericBNLearner_inl.h.

{
  selected_algo__ = AlgoType::K2;
  K2__.setOrder(order);
}

◆ useLocalSearchWithTabuList()

INLINE void gum::learning::genericBNLearner::useLocalSearchWithTabuList ( Size  tabu_size = 100,
Size  nb_decrease = 2 
)
inherited

indicate that we wish to use a local search with tabu list

Parameters
tabu_size  indicate the size of the tabu list
nb_decrease  indicate the maximum number of consecutive score-decreasing changes that we allow to apply

Definition at line 292 of file genericBNLearner_inl.h.

{
  selected_algo__ = AlgoType::LOCAL_SEARCH_WITH_TABU_LIST;
  constraint_TabuList__.setTabuListSize(tabu_size);
  local_search_with_tabu_list__.setMaxNbDecreasingChanges(nb_decrease);
}

◆ useMDL()

INLINE void gum::learning::genericBNLearner::useMDL ( )
inherited

indicate that we wish to use the MDL correction for 3off2

Exceptions
OperationNotAllowedwhen 3off2 is not the selected algorithm

Definition at line 245 of file genericBNLearner_inl.h.

{
  if (selected_algo__ != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed,
              "You must use the 3off2 algorithm before selecting "
                 << "the MDL score");
  }
  kmode_3off2__ = CorrectedMutualInformation<>::KModeTypes::MDL;
}

◆ useMIIC()

INLINE void gum::learning::genericBNLearner::useMIIC ( )
inherited

indicate that we wish to use MIIC

Definition at line 229 of file genericBNLearner_inl.h.

{
  selected_algo__ = AlgoType::MIIC_THREE_OFF_TWO;
  miic_3off2__.setMiicBehaviour();
}

◆ useNML()

INLINE void gum::learning::genericBNLearner::useNML ( )
inherited

indicate that we wish to use the NML correction for 3off2

Exceptions
OperationNotAllowedwhen 3off2 is not the selected algorithm

Definition at line 235 of file genericBNLearner_inl.h.

{
  if (selected_algo__ != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed,
              "You must use the 3off2 algorithm before selecting "
                 << "the NML score");
  }
  kmode_3off2__ = CorrectedMutualInformation<>::KModeTypes::NML;
}

◆ useNoApriori()

INLINE void gum::learning::genericBNLearner::useNoApriori ( )
inherited

use no apriori

Definition at line 460 of file genericBNLearner_inl.h.

{
  apriori_type__ = AprioriType::NO_APRIORI;
  checkScoreAprioriCompatibility();
}

◆ useNoCorr()

INLINE void gum::learning::genericBNLearner::useNoCorr ( )
inherited

indicate that we wish to use the NoCorr correction for 3off2

Exceptions
OperationNotAllowedwhen 3off2 is not the selected algorithm

Definition at line 255 of file genericBNLearner_inl.h.

{
  if (selected_algo__ != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed,
              "You must use the 3off2 algorithm before selecting "
                 << "the NoCorr score");
  }
  kmode_3off2__ = CorrectedMutualInformation<>::KModeTypes::NoCorr;
}

◆ useScoreAIC()

INLINE void gum::learning::genericBNLearner::useScoreAIC ( )
inherited

indicate that we wish to use an AIC score

Definition at line 182 of file genericBNLearner_inl.h.

{
  score_type__ = ScoreType::AIC;
  checkScoreAprioriCompatibility();
}

◆ useScoreBD()

INLINE void gum::learning::genericBNLearner::useScoreBD ( )
inherited

indicate that we wish to use a BD score

Definition at line 188 of file genericBNLearner_inl.h.

{
  score_type__ = ScoreType::BD;
  checkScoreAprioriCompatibility();
}

◆ useScoreBDeu()

INLINE void gum::learning::genericBNLearner::useScoreBDeu ( )
inherited

indicate that we wish to use a BDeu score

Definition at line 194 of file genericBNLearner_inl.h.

{
  score_type__ = ScoreType::BDeu;
  checkScoreAprioriCompatibility();
}

◆ useScoreBIC()

INLINE void gum::learning::genericBNLearner::useScoreBIC ( )
inherited

indicate that we wish to use a BIC score

Definition at line 200 of file genericBNLearner_inl.h.

{
  score_type__ = ScoreType::BIC;
  checkScoreAprioriCompatibility();
}

◆ useScoreK2()

INLINE void gum::learning::genericBNLearner::useScoreK2 ( )
inherited

indicate that we wish to use a K2 score

Definition at line 206 of file genericBNLearner_inl.h.

{
  score_type__ = ScoreType::K2;
  checkScoreAprioriCompatibility();
}

◆ useScoreLog2Likelihood()

INLINE void gum::learning::genericBNLearner::useScoreLog2Likelihood ( )
inherited

indicate that we wish to use a Log2Likelihood score

Definition at line 212 of file genericBNLearner_inl.h.

{
  score_type__ = ScoreType::LOG2LIKELIHOOD;
  checkScoreAprioriCompatibility();
}

◆ verbosity()

bool gum::learning::genericBNLearner::verbosity ( ) const
inlinevirtualinherited

returns true if verbosity is enabled

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1116 of file genericBNLearner.h.

{
  if (current_algorithm__ != nullptr)
    return current_algorithm__->verbosity();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

Member Data Documentation

◆ apriori__

Apriori* gum::learning::genericBNLearner::apriori__ {nullptr}
protectedinherited

the apriori used

Definition at line 773 of file genericBNLearner.h.

◆ apriori_database__

Database* gum::learning::genericBNLearner::apriori_database__ {nullptr}
protectedinherited

the database used by the Dirichlet a priori

Definition at line 827 of file genericBNLearner.h.

◆ apriori_dbname__

std::string gum::learning::genericBNLearner::apriori_dbname__
protectedinherited

the filename for the Dirichlet a priori, if any

Definition at line 830 of file genericBNLearner.h.

◆ apriori_type__

AprioriType gum::learning::genericBNLearner::apriori_type__ {AprioriType::NO_APRIORI}
protectedinherited

the a priori selected for the score and parameters

Definition at line 770 of file genericBNLearner.h.

◆ apriori_weight__

double gum::learning::genericBNLearner::apriori_weight__ {1.0f}
protectedinherited

the weight of the apriori

Definition at line 778 of file genericBNLearner.h.

◆ constraint_ForbiddenArcs__

StructuralConstraintForbiddenArcs gum::learning::genericBNLearner::constraint_ForbiddenArcs__
protectedinherited

the constraint on forbidden arcs

Definition at line 790 of file genericBNLearner.h.

◆ constraint_Indegree__

StructuralConstraintIndegree gum::learning::genericBNLearner::constraint_Indegree__
protectedinherited

the constraint for indegrees

Definition at line 784 of file genericBNLearner.h.

◆ constraint_MandatoryArcs__

StructuralConstraintMandatoryArcs gum::learning::genericBNLearner::constraint_MandatoryArcs__
protectedinherited

the constraint on mandatory arcs

Definition at line 796 of file genericBNLearner.h.

◆ constraint_PossibleEdges__

StructuralConstraintPossibleEdges gum::learning::genericBNLearner::constraint_PossibleEdges__
protectedinherited

the constraint on possible Edges

Definition at line 793 of file genericBNLearner.h.

◆ constraint_SliceOrder__

StructuralConstraintSliceOrder gum::learning::genericBNLearner::constraint_SliceOrder__
protectedinherited

the constraint for 2TBNs

Definition at line 781 of file genericBNLearner.h.

◆ constraint_TabuList__

StructuralConstraintTabuList gum::learning::genericBNLearner::constraint_TabuList__
protectedinherited

the constraint for tabu lists

Definition at line 787 of file genericBNLearner.h.

◆ current_algorithm__

const ApproximationScheme* gum::learning::genericBNLearner::current_algorithm__ {nullptr}
protectedinherited

Definition at line 836 of file genericBNLearner.h.

◆ Dag2BN__

DAG2BNLearner gum::learning::genericBNLearner::Dag2BN__
protectedinherited

the parametric EM

Definition at line 812 of file genericBNLearner.h.

◆ EMepsilon__

double gum::learning::genericBNLearner::EMepsilon__ {0.0}
protectedinherited

epsilon for EM; if epsilon = 0.0, EM is not used

Definition at line 764 of file genericBNLearner.h.

◆ greedy_hill_climbing__

GreedyHillClimbing gum::learning::genericBNLearner::greedy_hill_climbing__
protectedinherited

the greedy hill climbing algorithm

Definition at line 815 of file genericBNLearner.h.

◆ initial_dag__

DAG gum::learning::genericBNLearner::initial_dag__
protectedinherited

an initial DAG given to learners

Definition at line 833 of file genericBNLearner.h.

◆ K2__

K2 gum::learning::genericBNLearner::K2__
protectedinherited

the K2 algorithm

Definition at line 802 of file genericBNLearner.h.

◆ kmode_3off2__

CorrectedMutualInformation ::KModeTypes gum::learning::genericBNLearner::kmode_3off2__
protectedinherited
Initial value:

the penalty used in 3off2

Definition at line 808 of file genericBNLearner.h.

◆ local_search_with_tabu_list__

LocalSearchWithTabuList gum::learning::genericBNLearner::local_search_with_tabu_list__
protectedinherited

the local search with tabu list algorithm

Definition at line 818 of file genericBNLearner.h.

◆ miic_3off2__

Miic gum::learning::genericBNLearner::miic_3off2__
protectedinherited

the 3off2 algorithm

Definition at line 805 of file genericBNLearner.h.

◆ mutual_info__

CorrectedMutualInformation* gum::learning::genericBNLearner::mutual_info__ {nullptr}
protectedinherited

the selected correction for 3off2 and miic

Definition at line 767 of file genericBNLearner.h.

◆ no_apriori__

AprioriNoApriori* gum::learning::genericBNLearner::no_apriori__ {nullptr}
protectedinherited

Definition at line 775 of file genericBNLearner.h.

◆ onProgress

Signaler3< Size, double, double > gum::IApproximationSchemeConfiguration::onProgress
inherited

Progression, error and time.

Definition at line 58 of file IApproximationSchemeConfiguration.h.

◆ onStop

Signaler1< std::string > gum::IApproximationSchemeConfiguration::onStop
inherited

Criteria messageApproximationScheme.

Definition at line 61 of file IApproximationSchemeConfiguration.h.

◆ param_estimator_type__

ParamEstimatorType gum::learning::genericBNLearner::param_estimator_type__ {ParamEstimatorType::ML}
protectedinherited

the type of the parameter estimator

Definition at line 761 of file genericBNLearner.h.

◆ ranges__

std::vector< std::pair< std::size_t, std::size_t > > gum::learning::genericBNLearner::ranges__
protectedinherited

the set of rows' ranges within the database in which learning is done

Definition at line 824 of file genericBNLearner.h.

◆ score__

Score* gum::learning::genericBNLearner::score__ {nullptr}
protectedinherited

the score used

Definition at line 758 of file genericBNLearner.h.

◆ score_database__

Database gum::learning::genericBNLearner::score_database__
protectedinherited

the database to be used by the scores and parameter estimators

Definition at line 821 of file genericBNLearner.h.

◆ score_type__

ScoreType gum::learning::genericBNLearner::score_type__ {ScoreType::BDeu}
protectedinherited

the score selected for learning

Definition at line 755 of file genericBNLearner.h.

◆ selected_algo__

AlgoType gum::learning::genericBNLearner::selected_algo__ {AlgoType::GREEDY_HILL_CLIMBING}
protectedinherited

the selected learning algorithm

Definition at line 799 of file genericBNLearner.h.


The documentation for this class was generated from the following file: