aGrUM  0.21.0
a C++ library for (probabilistic) graphical models
gum::learning::BNLearner< GUM_SCALAR > Class Template Reference

A pack of learning algorithms that can easily be used. More...

#include <BNLearner.h>


Public Attributes

Signaler3< Size, double, double > onProgress
 Progression, error and time. More...
 
Signaler1< std::string > onStop
 Stopping criterion message (see messageApproximationScheme). More...
 

Public Member Functions

BayesNet< GUM_SCALAR > learnBN ()
 learn a Bayes Net from a file (must have read the db before) More...
 
BayesNet< GUM_SCALAR > learnParameters (const DAG &dag, bool takeIntoAccountScore=true)
 learns a BN (its parameters) when its structure is known More...
 
BayesNet< GUM_SCALAR > learnParameters (bool take_into_account_score=true)
 
std::string toString () const
 
std::vector< std::tuple< std::string, std::string, std::string > > state () const
 
void _setAprioriWeight_ (double weight)
 sets the apriori weight More...
 
void setMandatoryArcs (const ArcSet &set)
 assign a set of mandatory arcs More...
 
Constructors / Destructors
 BNLearner (const std::string &filename, const std::vector< std::string > &missing_symbols={"?"})
 default constructor More...
 
 BNLearner (const DatabaseTable<> &db)
 default constructor More...
 
 BNLearner (const std::string &filename, const gum::BayesNet< GUM_SCALAR > &src, const std::vector< std::string > &missing_symbols={"?"})
 Read the database file for the score / parameter estimation and var names. More...
 
 BNLearner (const BNLearner &)
 copy constructor More...
 
 BNLearner (BNLearner &&)
 move constructor More...
 
virtual ~BNLearner ()
 destructor More...
 
Operators
BNLearner & operator= (const BNLearner &)
 copy operator More...
 
BNLearner & operator= (BNLearner &&)
 move operator More...
 
Accessors / Modifiers
DAG learnDAG ()
 learn a structure from a file (must have read the db before) More...
 
MixedGraph learnMixedStructure ()
 learn a partial structure from a file (must have read the db before and must have selected miic or 3off2) More...
 
void setInitialDAG (const DAG &)
 sets an initial DAG structure More...
 
DAG initialDAG ()
 returns the initial DAG structure More...
 
const std::vector< std::string > & names () const
 returns the names of the variables in the database More...
 
const std::vector< std::size_t > & domainSizes () const
 returns the domain sizes of the variables in the database More...
 
Size domainSize (NodeId var) const
 returns the domain size of the variable var More...
 
Size domainSize (const std::string &var) const
 returns the domain size of the variable var More...
 
NodeId idFromName (const std::string &var_name) const
 returns the node id corresponding to a variable name More...
 
const DatabaseTable<> & database () const
 returns the database used by the BNLearner More...
 
void setDatabaseWeight (const double new_weight)
 assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight More...
 
void setRecordWeight (const std::size_t i, const double weight)
 sets the weight of the ith record of the database More...
 
double recordWeight (const std::size_t i) const
 returns the weight of the ith record More...
 
double databaseWeight () const
 returns the weight of the whole database More...
 
const std::string & nameFromId (NodeId id) const
 returns the variable name corresponding to a given node id More...
 
template<template< typename > class XALLOC>
void useDatabaseRanges (const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &new_ranges)
 use a new set of database rows' ranges to perform learning More...
 
void clearDatabaseRanges ()
 reset the ranges to the one range corresponding to the whole database More...
 
const std::vector< std::pair< std::size_t, std::size_t > > & databaseRanges () const
 returns the current database rows' ranges used for learning More...
 
std::pair< std::size_t, std::size_t > useCrossValidationFold (const std::size_t learning_fold, const std::size_t k_fold)
 sets the ranges of rows to be used for cross-validation learning More...
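As a sketch of how useCrossValidationFold can drive a k-fold loop (this assumes aGrUM is installed and that learner is a BNLearner<double> already bound to a database; the number of folds and the evaluation step are assumptions):

```cpp
const std::size_t k = 10;   // assumed number of folds
for (std::size_t fold = 0; fold < k; ++fold) {
  // learn on every row except those of the current fold;
  // the returned pair delimits the held-out rows [begin, end)
  std::pair< std::size_t, std::size_t > test_rows
     = learner.useCrossValidationFold(fold, k);
  gum::BayesNet< double > bn = learner.learnBN();
  // ... evaluate bn on rows [test_rows.first, test_rows.second) ...
}
learner.clearDatabaseRanges();   // restore the single whole-database range
```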
 
std::pair< double, double > chi2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair for the chi2 test in the database. More...
 
std::pair< double, double > chi2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair for the chi2 test in the database, using variable names. More...
 
std::pair< double, double > G2 (const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
 Return the <statistic,pvalue> pair for the G2 test in the database. More...
 
std::pair< double, double > G2 (const std::string &name1, const std::string &name2, const std::vector< std::string > &knowing={})
 Return the <statistic,pvalue> pair for the G2 test in the database. More...
 
double logLikelihood (const std::vector< NodeId > &vars, const std::vector< NodeId > &knowing={})
 Return the log-likelihood of vars in the database, conditioned on knowing. More...
 
double logLikelihood (const std::vector< std::string > &vars, const std::vector< std::string > &knowing={})
 Return the log-likelihood of vars in the database, conditioned on knowing. More...
 
std::vector< double > rawPseudoCount (const std::vector< NodeId > &vars)
 Return the pseudocounts of the NodeId vars in the database, in a raw array. More...
 
std::vector< double > rawPseudoCount (const std::vector< std::string > &vars)
 Return the pseudocounts of vars in the database, in a raw array. More...
 
Size nbCols () const
 
Size nbRows () const
 
void useEM (const double epsilon)
 use the EM algorithm to learn parameters More...
 
bool hasMissingValues () const
 returns true if the learner's database has missing values More...
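When the database contains missing values, parameter learning can rely on EM; a minimal sketch, assuming aGrUM is installed, a hypothetical file "data.csv", hypothetical missing-value symbols, and a previously obtained DAG knownDag:

```cpp
gum::learning::BNLearner< double > learner("data.csv", {"?", "N/A"});
if (learner.hasMissingValues()) {
  // stop EM once the criterion falls below 1e-4
  learner.useEM(1e-4);
}
// knownDag is an assumption: a DAG whose parameters we want to estimate
gum::BayesNet< double > bn = learner.learnParameters(knownDag);
```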
 
Score selection
void useScoreAIC ()
 indicate that we wish to use an AIC score More...
 
void useScoreBD ()
 indicate that we wish to use a BD score More...
 
void useScoreBDeu ()
 indicate that we wish to use a BDeu score More...
 
void useScoreBIC ()
 indicate that we wish to use a BIC score More...
 
void useScoreK2 ()
 indicate that we wish to use a K2 score More...
 
void useScoreLog2Likelihood ()
 indicate that we wish to use a Log2Likelihood score More...
 
A priori selection / parameterization
void useNoApriori ()
 use no apriori More...
 
void useAprioriBDeu (double weight=1)
 use the BDeu apriori More...
 
void useAprioriSmoothing (double weight=1)
 use the apriori smoothing More...
 
void useAprioriDirichlet (const std::string &filename, double weight=1)
 use the Dirichlet apriori More...
 
std::string checkScoreAprioriCompatibility () const
 checks whether the current score and apriori are compatible More...
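Selecting a score and an apriori, then checking their compatibility, might look like this sketch on a BNLearner<double> named learner (the weight value is an arbitrary assumption):

```cpp
learner.useScoreBDeu();
learner.useAprioriSmoothing(0.5);

// an empty message means the score and the apriori are compatible
std::string msg = learner.checkScoreAprioriCompatibility();
if (!msg.empty()) { std::cerr << msg << std::endl; }
```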
 
Learning algorithm selection
void useGreedyHillClimbing ()
 indicate that we wish to use a greedy hill climbing algorithm More...
 
void useLocalSearchWithTabuList (Size tabu_size=100, Size nb_decrease=2)
 indicate that we wish to use a local search with tabu list More...
 
void useK2 (const Sequence< NodeId > &order)
 indicate that we wish to use K2 More...
 
void useK2 (const std::vector< NodeId > &order)
 indicate that we wish to use K2 More...
 
void use3off2 ()
 indicate that we wish to use 3off2 More...
 
void useMIIC ()
 indicate that we wish to use MIIC More...
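Choosing K2, which additionally requires a topological order on the variables, could be sketched as follows on a BNLearner<double> named learner (variable names are hypothetical):

```cpp
std::vector< gum::NodeId > order;
for (const std::string& name: {"smoking", "cancer", "dyspnoea"})
  order.push_back(learner.idFromName(name));
learner.useK2(order);
```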
 
3off2/MIIC parameterization and specific results
void useNMLCorrection ()
 indicate that we wish to use the NML correction for 3off2 and MIIC More...
 
void useMDLCorrection ()
 indicate that we wish to use the MDL correction for 3off2 and MIIC More...
 
void useNoCorrection ()
 indicate that we wish to use the NoCorr correction for 3off2 and MIIC More...
 
const std::vector< Arc > latentVariables () const
 get the list of arcs hiding latent variables More...
 
Accessors / Modifiers for adding constraints on learning
void setMaxIndegree (Size max_indegree)
 sets the max indegree More...
 
void setSliceOrder (const NodeProperty< NodeId > &slice_order)
 sets a partial order on the nodes More...
 
void setSliceOrder (const std::vector< std::vector< std::string > > &slices)
 sets a partial order on the nodes More...
 
void setForbiddenArcs (const ArcSet &set)
 assign a set of forbidden arcs More...
 
assign a new forbidden arc
void addForbiddenArc (const Arc &arc)
 
void addForbiddenArc (const NodeId tail, const NodeId head)
 
void addForbiddenArc (const std::string &tail, const std::string &head)
 
void addMandatoryArc (const Arc &arc)
 
void addMandatoryArc (const NodeId tail, const NodeId head)
 
void addMandatoryArc (const std::string &tail, const std::string &head)
 
remove a forbidden arc
void eraseForbiddenArc (const Arc &arc)
 
void eraseForbiddenArc (const NodeId tail, const NodeId head)
 
void eraseForbiddenArc (const std::string &tail, const std::string &head)
 
void eraseMandatoryArc (const Arc &arc)
 
void eraseMandatoryArc (const NodeId tail, const NodeId head)
 
void eraseMandatoryArc (const std::string &tail, const std::string &head)
 
void setPossibleEdges (const EdgeSet &set)
 assign the set of possible edges More...
 
void setPossibleSkeleton (const UndiGraph &skeleton)
 assign a skeleton whose edges are the only possible edges More...
 
assign a new possible edge
Warning
Once at least one possible edge is defined, all other edges are not possible anymore
void addPossibleEdge (const Edge &edge)
 
void addPossibleEdge (const NodeId tail, const NodeId head)
 
void addPossibleEdge (const std::string &tail, const std::string &head)
 
remove a possible edge
void erasePossibleEdge (const Edge &edge)
 
void erasePossibleEdge (const NodeId tail, const NodeId head)
 
void erasePossibleEdge (const std::string &tail, const std::string &head)
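A sketch combining several of these constraints on a BNLearner<double> named learner (variable names are hypothetical):

```cpp
learner.setMaxIndegree(3);                      // at most 3 parents per node
learner.addMandatoryArc("smoking", "cancer");   // arc forced into the DAG
learner.addForbiddenArc("cancer", "smoking");   // arc banned from the DAG
// declaring possible edges forbids every undeclared edge
learner.addPossibleEdge("smoking", "cancer");
learner.addPossibleEdge("cancer", "dyspnoea");
```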
 
redistribute signals AND implementation of interface
INLINE void setCurrentApproximationScheme (const ApproximationScheme *approximationScheme)
 distribute signals More...
 
INLINE void distributeProgress (const ApproximationScheme *approximationScheme, Size pourcent, double error, double time)
 distribute signals More...
 
INLINE void distributeStop (const ApproximationScheme *approximationScheme, std::string message)
 distribute signals More...
 
void setEpsilon (double eps)
 Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled, it will be enabled. More...
 
double epsilon () const
 Get the value of epsilon. More...
 
void disableEpsilon ()
 Disable stopping criterion on epsilon. More...
 
void enableEpsilon ()
 Enable stopping criterion on epsilon. More...
 
bool isEnabledEpsilon () const
 
void setMinEpsilonRate (double rate)
 Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled, it will be enabled. More...
 
double minEpsilonRate () const
 Get the value of the minimal epsilon rate. More...
 
void disableMinEpsilonRate ()
 Disable stopping criterion on epsilon rate. More...
 
void enableMinEpsilonRate ()
 Enable stopping criterion on epsilon rate. More...
 
bool isEnabledMinEpsilonRate () const
 
void setMaxIter (Size max)
 stopping criterion on the number of iterations. If the criterion was disabled, it will be enabled. More...
 
Size maxIter () const
 
void disableMaxIter ()
 Disable stopping criterion on max iterations. More...
 
void enableMaxIter ()
 Enable stopping criterion on max iterations. More...
 
bool isEnabledMaxIter () const
 
void setMaxTime (double timeout)
 stopping criterion on timeout. If the criterion was disabled, it will be enabled. More...
 
double maxTime () const
 returns the timeout (in seconds) More...
 
double currentTime () const
 get the current running time in second (double) More...
 
void disableMaxTime ()
 Disable stopping criterion on timeout. More...
 
void enableMaxTime ()
 Enable stopping criterion on timeout. More...
 
bool isEnabledMaxTime () const
 
void setPeriodSize (Size p)
 sets the number of samples between two tests of the stopping criteria More...
 
Size periodSize () const
 returns the number of samples between two tests of the stopping criteria More...
 
void setVerbosity (bool v)
 verbosity More...
 
bool verbosity () const
 verbosity More...
 
ApproximationSchemeSTATE stateApproximationScheme () const
 returns the current state of the approximation scheme More...
 
Size nbrIterations () const
 
const std::vector< double > & history () const
 
Getters and setters
std::string messageApproximationScheme () const
 Returns the approximation scheme message. More...
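The stopping criteria and signals above can be combined; a sketch assuming aGrUM is installed, learner is a BNLearner<double>, MyListener is a hypothetical gum::Listener subclass, and GUM_CONNECT is aGrUM's signal connection macro:

```cpp
learner.setMaxTime(10.0);    // give up after 10 seconds
learner.setMaxIter(500);     // or after 500 iterations
learner.setPeriodSize(50);   // test the criteria every 50 samples

MyListener listener;         // hypothetical listener with a whenProgress slot
GUM_CONNECT(learner, onProgress, listener, MyListener::whenProgress);
gum::BayesNet< double > bn = learner.learnBN();
```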
 

Public Types

enum  ScoreType {
  ScoreType::AIC, ScoreType::BD, ScoreType::BDeu, ScoreType::BIC,
  ScoreType::K2, ScoreType::LOG2LIKELIHOOD
}
 an enumeration to easily select the score we wish to use More...
 
enum  ParamEstimatorType { ParamEstimatorType::ML }
 an enumeration to select the type of parameter estimation we shall apply More...
 
enum  AprioriType { AprioriType::NO_APRIORI, AprioriType::SMOOTHING, AprioriType::DIRICHLET_FROM_DATABASE, AprioriType::BDEU }
 an enumeration to select the apriori More...
 
enum  AlgoType {
  AlgoType::K2, AlgoType::GREEDY_HILL_CLIMBING, AlgoType::LOCAL_SEARCH_WITH_TABU_LIST, AlgoType::MIIC,
  AlgoType::THREE_OFF_TWO
}
 an enumeration to easily select the learning algorithm to use More...
 
enum  ApproximationSchemeSTATE : char {
  ApproximationSchemeSTATE::Undefined, ApproximationSchemeSTATE::Continue, ApproximationSchemeSTATE::Epsilon, ApproximationSchemeSTATE::Rate,
  ApproximationSchemeSTATE::Limit, ApproximationSchemeSTATE::TimeLimit, ApproximationSchemeSTATE::Stopped
}
 The different states of an approximation scheme. More...
 

Protected Attributes

ScoreType scoreType_ {ScoreType::BDeu}
 the score selected for learning More...
 
Score * score_ {nullptr}
 the score used More...
 
ParamEstimatorType paramEstimatorType_ {ParamEstimatorType::ML}
 the type of the parameter estimator More...
 
double epsilonEM_ {0.0}
 epsilon for EM (if epsilon = 0.0, EM is not used) More...
 
CorrectedMutualInformation * mutualInfo_ {nullptr}
 the selected correction for 3off2 and miic More...
 
AprioriType aprioriType_ {AprioriType::NO_APRIORI}
 the a priori selected for the score and parameters More...
 
Apriori * apriori_ {nullptr}
 the apriori used More...
 
AprioriNoApriori * noApriori_ {nullptr}
 
double aprioriWeight_ {1.0f}
 the weight of the apriori More...
 
StructuralConstraintSliceOrder constraintSliceOrder_
 the constraint for 2TBNs More...
 
StructuralConstraintIndegree constraintIndegree_
 the constraint for indegrees More...
 
StructuralConstraintTabuList constraintTabuList_
 the constraint for tabu lists More...
 
StructuralConstraintForbiddenArcs constraintForbiddenArcs_
 the constraint on forbidden arcs More...
 
StructuralConstraintPossibleEdges constraintPossibleEdges_
 the constraint on possible Edges More...
 
StructuralConstraintMandatoryArcs constraintMandatoryArcs_
 the constraint on mandatory arcs More...
 
AlgoType selectedAlgo_ {AlgoType::GREEDY_HILL_CLIMBING}
 the selected learning algorithm More...
 
K2 algoK2_
 the K2 algorithm More...
 
Miic algoMiic3off2_
 the MIIC or 3off2 algorithm More...
 
CorrectedMutualInformation<>::KModeTypes kmode3Off2_
 the penalty used in 3off2 More...
 
DAG2BNLearner Dag2BN_
 the parametric EM More...
 
GreedyHillClimbing greedyHillClimbing_
 the greedy hill climbing algorithm More...
 
LocalSearchWithTabuList localSearchWithTabuList_
 the local search with tabu list algorithm More...
 
Database scoreDatabase_
 the database to be used by the scores and parameter estimators More...
 
std::vector< std::pair< std::size_t, std::size_t > > ranges_
 the set of rows' ranges within the database in which learning is done More...
 
Database * aprioriDatabase_ {nullptr}
 the database used by the Dirichlet a priori More...
 
std::string aprioriDbname_
 the filename for the Dirichlet a priori, if any More...
 
DAG initialDag_
 an initial DAG given to learners More...
 
std::string filename_
 the filename database More...
 
Size nbDecreasingChanges_ {2}
 
const ApproximationScheme * currentAlgorithm_ {nullptr}
 

Protected Member Functions

void createApriori_ ()
 create the apriori used for learning More...
 
void createScore_ ()
 create the score used for learning More...
 
ParamEstimator<> * createParamEstimator_ (DBRowGeneratorParser<> &parser, bool take_into_account_score=true)
 create the parameter estimator used for learning More...
 
DAG learnDag_ ()
 returns the DAG learnt More...
 
MixedGraph prepareMiic3Off2_ ()
 prepares the initial graph for 3off2 or miic More...
 
const std::string & getAprioriType_ () const
 returns the type (as a string) of a given apriori More...
 
void createCorrectedMutualInformation_ ()
 create the Corrected Mutual Information instance for Miic/3off2 More...
 

Static Protected Member Functions

static DatabaseTable<> readFile_ (const std::string &filename, const std::vector< std::string > &missing_symbols)
 reads a file and returns a DatabaseTable More...
 
static void isCSVFileName_ (const std::string &filename)
 checks whether the extension of a CSV filename is correct More...
 

Detailed Description

template<typename GUM_SCALAR>
class gum::learning::BNLearner< GUM_SCALAR >

A pack of learning algorithms that can easily be used.

The pack currently contains K2, GreedyHillClimbing and LocalSearchWithTabuList
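A minimal usage sketch, assuming aGrUM is installed and a CSV file of discrete observations exists (the filename "data.csv" and header path are assumptions):

```cpp
#include <agrum/BN/learning/BNLearner.h>
#include <iostream>

int main() {
  // read the database; "?" encodes missing values by default
  gum::learning::BNLearner< double > learner("data.csv");

  // choose a score and a structure learning algorithm
  learner.useScoreBIC();
  learner.useGreedyHillClimbing();

  // learn the structure and the parameters in one call
  gum::BayesNet< double > bn = learner.learnBN();
  std::cout << bn.toString() << std::endl;
}
```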

Definition at line 59 of file BNLearner.h.

Member Enumeration Documentation

◆ AlgoType

an enumeration to easily select the learning algorithm to use

Enumerator
K2 
GREEDY_HILL_CLIMBING 
LOCAL_SEARCH_WITH_TABU_LIST 
MIIC 
THREE_OFF_TWO 

Definition at line 137 of file genericBNLearner.h.

  {
    K2,
    GREEDY_HILL_CLIMBING,
    LOCAL_SEARCH_WITH_TABU_LIST,
    MIIC,
    THREE_OFF_TWO
  };

◆ ApproximationSchemeSTATE

The different states of an approximation scheme.

Enumerator
Undefined 
Continue 
Epsilon 
Rate 
Limit 
TimeLimit 
Stopped 

Definition at line 64 of file IApproximationSchemeConfiguration.h.

  : char
  {
    Undefined,
    Continue,
    Epsilon,
    Rate,
    Limit,
    TimeLimit,
    Stopped
  };

◆ AprioriType

an enumeration to select the apriori

Enumerator
NO_APRIORI 
SMOOTHING 
DIRICHLET_FROM_DATABASE 
BDEU 

Definition at line 128 of file genericBNLearner.h.

  {
    NO_APRIORI,
    SMOOTHING,
    DIRICHLET_FROM_DATABASE,
    BDEU
  };

◆ ParamEstimatorType

an enumeration to select the type of parameter estimation we shall apply

Enumerator
ML 

Definition at line 122 of file genericBNLearner.h.

  {
    ML
  };

◆ ScoreType

an enumeration to easily select the score we wish to use

Enumerator
AIC 
BD 
BDeu 
BIC 
K2 
LOG2LIKELIHOOD 

Definition at line 110 of file genericBNLearner.h.

  {
    AIC,
    BD,
    BDeu,
    BIC,
    K2,
    LOG2LIKELIHOOD
  };

Constructor & Destructor Documentation

◆ BNLearner() [1/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols = {"?"} 
)

default constructor

read the database file for the score / parameter estimation and var names

◆ BNLearner() [2/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const DatabaseTable<> &  db)

default constructor

read the database file for the score / parameter estimation and var names

◆ BNLearner() [3/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const std::string &  filename,
const gum::BayesNet< GUM_SCALAR > &  src,
const std::vector< std::string > &  missing_symbols = {"?"} 
)

Read the database file for the score / parameter estimation and var names.

If modalities = { 1 -> {True, False, Big} }, then the node of id 1 in the BN will have 3 modalities, the first one being True, the second being False, and the third being Big.

Parsing the database determines which modalities are really necessary, and keeps them in the order specified by the user (NodeProperty modalities). If parse_database is set to false (the default), the modalities specified by the user are considered to be exactly those of the variables of the BN (as a consequence, if other values are found in the database, an exception is raised during learning).

Parameters
filename: the file to learn from.
modalities: indicates, for some nodes (not necessarily all the nodes of the BN), which modalities they should have and in which order these modalities should be stored into the nodes.
parse_database: if true, the modalities specified by the user are considered as a superset of the modalities of the variables.

This constructor is a wrapper for BNLearner(filename, modalities, parse_database), using a BN to find those modalities and node ids.

◆ BNLearner() [4/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const BNLearner< GUM_SCALAR > &  )

copy constructor

◆ BNLearner() [5/5]

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( BNLearner< GUM_SCALAR > &&  )

move constructor

◆ ~BNLearner()

template<typename GUM_SCALAR >
virtual gum::learning::BNLearner< GUM_SCALAR >::~BNLearner ( )
virtual

destructor

Member Function Documentation

◆ _labelsFromBN_()

template<typename GUM_SCALAR >
NodeProperty< Sequence< std::string > > gum::learning::BNLearner< GUM_SCALAR >::_labelsFromBN_ ( const std::string &  filename,
const BayesNet< GUM_SCALAR > &  src 
)
private

read the first line of a file to find column names

◆ _setAprioriWeight_()

INLINE void gum::learning::genericBNLearner::_setAprioriWeight_ ( double  weight)
inherited

sets the apriori weight

Definition at line 400 of file genericBNLearner_inl.h.

  {
    if (weight < 0) { GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive") }

    aprioriWeight_ = weight;
  }

◆ addForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const Arc arc)
inherited

Definition at line 313 of file genericBNLearner_inl.h.

  { constraintForbiddenArcs_.addArc(arc); }

◆ addForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 323 of file genericBNLearner_inl.h.

  { addForbiddenArc(Arc(tail, head)); }

◆ addForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 333 of file genericBNLearner_inl.h.

  { addForbiddenArc(Arc(idFromName(tail), idFromName(head))); }

◆ addMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const Arc arc)
inherited

Definition at line 350 of file genericBNLearner_inl.h.

  { constraintMandatoryArcs_.addArc(arc); }

◆ addMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 372 of file genericBNLearner_inl.h.

  { addMandatoryArc(Arc(tail, head)); }

◆ addMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 360 of file genericBNLearner_inl.h.

  { addMandatoryArc(Arc(idFromName(tail), idFromName(head))); }

◆ addPossibleEdge() [1/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const Edge edge)
inherited

Definition at line 276 of file genericBNLearner_inl.h.

  { constraintPossibleEdges_.addEdge(edge); }

◆ addPossibleEdge() [2/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 286 of file genericBNLearner_inl.h.

  { addPossibleEdge(Edge(tail, head)); }

◆ addPossibleEdge() [3/3]

INLINE void gum::learning::genericBNLearner::addPossibleEdge ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 296 of file genericBNLearner_inl.h.

  { addPossibleEdge(Edge(idFromName(tail), idFromName(head))); }

◆ checkScoreAprioriCompatibility()

std::string gum::learning::genericBNLearner::checkScoreAprioriCompatibility ( ) const
inherited

checks whether the current score and apriori are compatible

Returns
an empty string if the score and the apriori are compatible; otherwise, a message describing the incompatibility.

Definition at line 828 of file genericBNLearner.cpp.

  {
    const std::string& apriori = getAprioriType_();

    switch (scoreType_) {
      case ScoreType::AIC:
        return ScoreAIC<>::isAprioriCompatible(apriori, aprioriWeight_);

      case ScoreType::BD:
        return ScoreBD<>::isAprioriCompatible(apriori, aprioriWeight_);

      case ScoreType::BDeu:
        return ScoreBDeu<>::isAprioriCompatible(apriori, aprioriWeight_);

      case ScoreType::BIC:
        return ScoreBIC<>::isAprioriCompatible(apriori, aprioriWeight_);

      case ScoreType::K2:
        return ScoreK2<>::isAprioriCompatible(apriori, aprioriWeight_);

      case ScoreType::LOG2LIKELIHOOD:
        return ScoreLog2Likelihood<>::isAprioriCompatible(apriori, aprioriWeight_);

      default:
        return "genericBNLearner does not support yet this score";
    }
  }

◆ chi2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for chi2 test in the database.

Parameters
id1first variable
id2second variable
knowinglist of observed variables
Returns
a std::pair<double,double>

Definition at line 897 of file genericBNLearner.cpp.

  {
    createApriori_();
    IndepTestChi2<> chi2score(scoreDatabase_.parser(), *apriori_, databaseRanges());

    return chi2score.statistics(id1, id2, knowing);
  }

◆ chi2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::chi2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the chi2 test in the database, using variable names.

Parameters
name1: first variable
name2: second variable
knowing: list of observed variables
Returns
a std::pair<double,double>
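For instance, a conditional independence test by names on a BNLearner<double> named learner could be sketched as (variable names and the significance level are hypothetical):

```cpp
std::pair< double, double > res = learner.chi2("smoking", "cancer", {"age"});
if (res.second > 0.05) {
  // independence of smoking and cancer given age cannot be rejected
}
```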

Definition at line 908 of file genericBNLearner.cpp.

  {
    std::vector< NodeId > knowingIds;
    std::transform(knowing.begin(),
                   knowing.end(),
                   std::back_inserter(knowingIds),
                   [this](const std::string& c) -> NodeId { return this->idFromName(c); });
    return chi2(idFromName(name1), idFromName(name2), knowingIds);
  }

◆ clearDatabaseRanges()

INLINE void gum::learning::genericBNLearner::clearDatabaseRanges ( )
inherited

reset the ranges to the one range corresponding to the whole database

Definition at line 494 of file genericBNLearner_inl.h.

  { ranges_.clear(); }

◆ createApriori_()

void gum::learning::genericBNLearner::createApriori_ ( )
protectedinherited

create the apriori used for learning

Definition at line 439 of file genericBNLearner.cpp.

  {
    // first, save the old apriori, to be deleted if everything is ok
    Apriori<>* old_apriori = apriori_;

    // create the new apriori
    switch (aprioriType_) {
      case AprioriType::NO_APRIORI:
        apriori_ = new AprioriNoApriori<>(scoreDatabase_.databaseTable(),
                                          scoreDatabase_.nodeId2Columns());
        break;

      case AprioriType::SMOOTHING:
        apriori_ = new AprioriSmoothing<>(scoreDatabase_.databaseTable(),
                                          scoreDatabase_.nodeId2Columns());
        break;

      case AprioriType::DIRICHLET_FROM_DATABASE:
        if (aprioriDatabase_ != nullptr) {
          delete aprioriDatabase_;
          aprioriDatabase_ = nullptr;
        }

        aprioriDatabase_
           = new Database(aprioriDbname_, scoreDatabase_, scoreDatabase_.missingSymbols());

        apriori_ = new AprioriDirichletFromDatabase<>(scoreDatabase_.databaseTable(),
                                                      aprioriDatabase_->parser(),
                                                      aprioriDatabase_->nodeId2Columns());
        break;

      case AprioriType::BDEU:
        apriori_
           = new AprioriBDeu<>(scoreDatabase_.databaseTable(), scoreDatabase_.nodeId2Columns());
        break;

      default:
        GUM_ERROR(OperationNotAllowed, "The BNLearner does not support yet this apriori")
    }

    // do not forget to assign a weight to the apriori
    apriori_->setWeight(aprioriWeight_);

    // remove the old apriori, if any
    if (old_apriori != nullptr) delete old_apriori;
  }

◆ createCorrectedMutualInformation_()

void gum::learning::genericBNLearner::createCorrectedMutualInformation_ ( )
protectedinherited

create the Corrected Mutual Information instance for Miic/3off2

Definition at line 633 of file genericBNLearner.cpp.

633  {
634    if (mutualInfo_ != nullptr) delete mutualInfo_;
635 
636    mutualInfo_ = new CorrectedMutualInformation<>(scoreDatabase_.parser(),
637                                                   *noApriori_,
638                                                   ranges_,
639                                                   scoreDatabase_.nodeId2Columns());
640    switch (kmode3Off2_) {
641      case CorrectedMutualInformation<>::KModeTypes::MDL:
642        mutualInfo_->useMDL();
643        break;
644 
645      case CorrectedMutualInformation<>::KModeTypes::NML:
646        mutualInfo_->useNML();
647        break;
648 
649      case CorrectedMutualInformation<>::KModeTypes::NoCorr:
650        mutualInfo_->useNoCorr();
651        break;
652 
653      default:
654        GUM_ERROR(NotImplementedYet,
655                  "The BNLearner's corrected mutual information class does "
656                  << "not implement yet this correction : " << int(kmode3Off2_));
657    }
658  }
void useNML()
use the kNML penalty function
std::vector< std::pair< std::size_t, std::size_t > > ranges_
the set of rows' ranges within the database in which learning is done
CorrectedMutualInformation * mutualInfo_
the selected correction for 3off2 and miic
CorrectedMutualInformation ::KModeTypes kmode3Off2_
the penalty used in 3off2
const Bijection< NodeId, std::size_t > & nodeId2Columns() const
returns the mapping between node ids and their columns in the database
DBRowGeneratorParser & parser()
returns the parser for the database
Database scoreDatabase_
the database to be used by the scores and parameter estimators
void useMDL()
use the MDL penalty function
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51
void useNoCorr()
use no correction/penalty function

◆ createParamEstimator_()

ParamEstimator * gum::learning::genericBNLearner::createParamEstimator_ ( DBRowGeneratorParser<> &  parser,
bool  take_into_account_score = true 
)
protectedinherited

create the parameter estimator used for learning

Definition at line 541 of file genericBNLearner.cpp.

542  {
543    ParamEstimator<>* param_estimator = nullptr;
544 
545    // create the new estimator
546    switch (paramEstimatorType_) {
547      case ParamEstimatorType::ML:
548        if (take_into_account_score && (score_ != nullptr)) {
549          param_estimator = new ParamEstimatorML<>(parser,
550                                                   *apriori_,
551                                                   score_->internalApriori(),
552                                                   ranges_,
553                                                   scoreDatabase_.nodeId2Columns());
554        } else {
555          param_estimator = new ParamEstimatorML<>(parser,
556                                                   *apriori_,
557                                                   *noApriori_,
558                                                   ranges_,
559                                                   scoreDatabase_.nodeId2Columns());
560        }
561 
562        break;
563 
564      default:
565        GUM_ERROR(OperationNotAllowed,
566                  "genericBNLearner does not support "
567                  << "yet this parameter estimator");
568    }
569 
570    // assign the set of ranges
571    param_estimator->setRanges(ranges_);
572 
573    return param_estimator;
574  }
std::vector< std::pair< std::size_t, std::size_t > > ranges_
the set of rows' ranges within the database in which learning is done
ParamEstimatorType paramEstimatorType_
the type of the parameter estimator
virtual const Apriori< ALLOC > & internalApriori() const =0
returns the internal apriori of the score
const Bijection< NodeId, std::size_t > & nodeId2Columns() const
returns the mapping between node ids and their columns in the database
Database scoreDatabase_
the database to be used by the scores and parameter estimators
Apriori * apriori_
the apriori used
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51
Score * score_
the score used

◆ createScore_()

void gum::learning::genericBNLearner::createScore_ ( )
protectedinherited

create the score used for learning

Definition at line 485 of file genericBNLearner.cpp.

485  {
486    // first, save the old score, to be deleted if everything is ok
487    Score<>* old_score = score_;
488 
489    // create the new scoring function
490    switch (scoreType_) {
491      case ScoreType::AIC:
492        score_ = new ScoreAIC<>(scoreDatabase_.parser(),
493                                *apriori_,
494                                ranges_,
495                                scoreDatabase_.nodeId2Columns());
496        break;
497 
498      case ScoreType::BD:
499        score_ = new ScoreBD<>(scoreDatabase_.parser(),
500                               *apriori_,
501                               ranges_,
502                               scoreDatabase_.nodeId2Columns());
503        break;
504 
505      case ScoreType::BDeu:
506        score_ = new ScoreBDeu<>(scoreDatabase_.parser(),
507                                 *apriori_,
508                                 ranges_,
509                                 scoreDatabase_.nodeId2Columns());
510        break;
511 
512      case ScoreType::BIC:
513        score_ = new ScoreBIC<>(scoreDatabase_.parser(),
514                                *apriori_,
515                                ranges_,
516                                scoreDatabase_.nodeId2Columns());
517        break;
518 
519      case ScoreType::K2:
520        score_ = new ScoreK2<>(scoreDatabase_.parser(),
521                               *apriori_,
522                               ranges_,
523                               scoreDatabase_.nodeId2Columns());
524        break;
525 
526      case ScoreType::LOG2LIKELIHOOD:
527        score_ = new ScoreLog2Likelihood<>(scoreDatabase_.parser(),
528                                           *apriori_,
529                                           ranges_,
530                                           scoreDatabase_.nodeId2Columns());
531        break;
532 
533      default:
534        GUM_ERROR(OperationNotAllowed, "genericBNLearner does not support yet this score")
535    }
536 
537    // remove the old score, if any
538    if (old_score != nullptr) delete old_score;
539  }
std::vector< std::pair< std::size_t, std::size_t > > ranges_
the set of rows&#39; ranges within the database in which learning is done
ScoreType scoreType_
the score selected for learning
const Bijection< NodeId, std::size_t > & nodeId2Columns() const
returns the mapping between node ids and their columns in the database
DBRowGeneratorParser & parser()
returns the parser for the database
Database scoreDatabase_
the database to be used by the scores and parameter estimators
Apriori * apriori_
the apriori used
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51
Score * score_
the score used

◆ currentTime()

double gum::learning::genericBNLearner::currentTime ( ) const
inlinevirtualinherited

get the current running time in seconds (double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1061 of file genericBNLearner.h.

1061  {
1062  if (currentAlgorithm_ != nullptr)
1063  return currentAlgorithm_->currentTime();
1064  else
1065  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1066  }
const ApproximationScheme * currentAlgorithm_
double currentTime() const
Returns the current running time in second.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ database()

INLINE const DatabaseTable & gum::learning::genericBNLearner::database ( ) const
inherited

returns the database used by the BNLearner

Definition at line 497 of file genericBNLearner_inl.h.

497  {
498  return scoreDatabase_.databaseTable();
499  }
const DatabaseTable & databaseTable() const
returns the internal database table
Database scoreDatabase_
the database to be used by the scores and parameter estimators

◆ databaseRanges()

INLINE const std::vector< std::pair< std::size_t, std::size_t > > & gum::learning::genericBNLearner::databaseRanges ( ) const
inherited

returns the current database rows' ranges used for learning

Returns
The method returns a vector of pairs [Xi,Yi) of row indices in the database. Learning is performed on this set of rows.
Warning
an empty set of ranges means the whole database.

Definition at line 489 of file genericBNLearner_inl.h.

489  {
490  return ranges_;
491  }
std::vector< std::pair< std::size_t, std::size_t > > ranges_
the set of rows' ranges within the database in which learning is done

◆ databaseWeight()

INLINE double gum::learning::genericBNLearner::databaseWeight ( ) const
inherited

returns the weight of the whole database

Definition at line 153 of file genericBNLearner_inl.h.

153 { return scoreDatabase_.weight(); }
double weight(const std::size_t i) const
returns the weight of the ith record
Database scoreDatabase_
the database to be used by the scores and parameter estimators

◆ disableEpsilon()

void gum::learning::genericBNLearner::disableEpsilon ( )
inlinevirtualinherited

Disable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 926 of file genericBNLearner.h.

926  {
927    algoK2_.approximationScheme().disableEpsilon();
928    greedyHillClimbing_.disableEpsilon();
929    localSearchWithTabuList_.disableEpsilon();
930    Dag2BN_.disableEpsilon();
931  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
void disableEpsilon()
Disable stopping criterion on epsilon.
GreedyHillClimbing greedyHillClimbing_
the greedy hill climbing algorithm
DAG2BNLearner Dag2BN_
the parametric EM
LocalSearchWithTabuList localSearchWithTabuList_
the local search with tabu list algorithm

◆ disableMaxIter()

void gum::learning::genericBNLearner::disableMaxIter ( )
inlinevirtualinherited

Disable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1016 of file genericBNLearner.h.

1016  {
1017    algoK2_.approximationScheme().disableMaxIter();
1018    greedyHillClimbing_.disableMaxIter();
1019    localSearchWithTabuList_.disableMaxIter();
1020    Dag2BN_.disableMaxIter();
1021  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
GreedyHillClimbing greedyHillClimbing_
the greedy hill climbing algorithm
void disableMaxIter()
Disable stopping criterion on max iterations.
DAG2BNLearner Dag2BN_
the parametric EM
LocalSearchWithTabuList localSearchWithTabuList_
the local search with tabu list algorithm

◆ disableMaxTime()

void gum::learning::genericBNLearner::disableMaxTime ( )
inlinevirtualinherited

Disable stopping criterion on timeout.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1069 of file genericBNLearner.h.

1069  {
1070    algoK2_.approximationScheme().disableMaxTime();
1071    greedyHillClimbing_.disableMaxTime();
1072    localSearchWithTabuList_.disableMaxTime();
1073    Dag2BN_.disableMaxTime();
1074  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
GreedyHillClimbing greedyHillClimbing_
the greedy hill climbing algorithm
void disableMaxTime()
Disable stopping criterion on timeout.
DAG2BNLearner Dag2BN_
the parametric EM
LocalSearchWithTabuList localSearchWithTabuList_
the local search with tabu list algorithm

◆ disableMinEpsilonRate()

void gum::learning::genericBNLearner::disableMinEpsilonRate ( )
inlinevirtualinherited

Disable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 972 of file genericBNLearner.h.

972  {
973    algoK2_.approximationScheme().disableMinEpsilonRate();
974    greedyHillClimbing_.disableMinEpsilonRate();
975    localSearchWithTabuList_.disableMinEpsilonRate();
976    Dag2BN_.disableMinEpsilonRate();
977  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
void disableMinEpsilonRate()
Disable stopping criterion on epsilon rate.
GreedyHillClimbing greedyHillClimbing_
the greedy hill climbing algorithm
DAG2BNLearner Dag2BN_
the parametric EM
LocalSearchWithTabuList localSearchWithTabuList_
the local search with tabu list algorithm

◆ distributeProgress()

INLINE void gum::learning::genericBNLearner::distributeProgress ( const ApproximationScheme approximationScheme,
Size  pourcent,
double  error,
double  time 
)
inlineinherited

distribute signals

Definition at line 888 of file genericBNLearner.h.

891  {
892  setCurrentApproximationScheme(approximationScheme);
893 
894  if (onProgress.hasListener()) GUM_EMIT3(onProgress, pourcent, error, time);
895  };
INLINE void setCurrentApproximationScheme(const ApproximationScheme *approximationScheme)
distribute signals
Signaler3< Size, double, double > onProgress
Progression, error and time.
#define GUM_EMIT3(signal, arg1, arg2, arg3)
Definition: signaler3.h:41

◆ distributeStop()

INLINE void gum::learning::genericBNLearner::distributeStop ( const ApproximationScheme approximationScheme,
std::string  message 
)
inlineinherited

distribute signals

Definition at line 898 of file genericBNLearner.h.

899  {
900  setCurrentApproximationScheme(approximationScheme);
901 
902  if (onStop.hasListener()) GUM_EMIT1(onStop, message);
903  };
INLINE void setCurrentApproximationScheme(const ApproximationScheme *approximationScheme)
distribute signals
#define GUM_EMIT1(signal, arg1)
Definition: signaler1.h:41
Signaler1< std::string > onStop
Criteria messageApproximationScheme.

◆ domainSize() [1/2]

INLINE Size gum::learning::genericBNLearner::domainSize ( NodeId  var) const
inherited

returns the domain size of the variable with the given node id

Definition at line 479 of file genericBNLearner_inl.h.

479  {
480  return scoreDatabase_.domainSizes()[var];
481  }
const std::vector< std::size_t > & domainSizes() const
returns the domain sizes of the variables
Database scoreDatabase_
the database to be used by the scores and parameter estimators

◆ domainSize() [2/2]

INLINE Size gum::learning::genericBNLearner::domainSize ( const std::string &  var) const
inherited

returns the domain size of the variable with the given name

Definition at line 483 of file genericBNLearner_inl.h.

483  {
484  return scoreDatabase_.domainSizes()[idFromName(var)];
485  }
const std::vector< std::size_t > & domainSizes() const
returns the domain sizes of the variables
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name
Database scoreDatabase_
the database to be used by the scores and parameter estimators

◆ domainSizes()

INLINE const std::vector< std::size_t > & gum::learning::genericBNLearner::domainSizes ( ) const
inherited

returns the domain sizes of the variables in the database

Definition at line 474 of file genericBNLearner_inl.h.

474  {
475  return scoreDatabase_.domainSizes();
476  }
const std::vector< std::size_t > & domainSizes() const
returns the domain sizes of the variables
Database scoreDatabase_
the database to be used by the scores and parameter estimators

◆ enableEpsilon()

void gum::learning::genericBNLearner::enableEpsilon ( )
inlinevirtualinherited

Enable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 934 of file genericBNLearner.h.

934  {
935    algoK2_.approximationScheme().enableEpsilon();
936    greedyHillClimbing_.enableEpsilon();
937    localSearchWithTabuList_.enableEpsilon();
938    Dag2BN_.enableEpsilon();
939  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
GreedyHillClimbing greedyHillClimbing_
the greedy hill climbing algorithm
DAG2BNLearner Dag2BN_
the parametric EM
LocalSearchWithTabuList localSearchWithTabuList_
the local search with tabu list algorithm
void enableEpsilon()
Enable stopping criterion on epsilon.

◆ enableMaxIter()

void gum::learning::genericBNLearner::enableMaxIter ( )
inlinevirtualinherited

Enable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1023 of file genericBNLearner.h.

1023  {
1024    algoK2_.approximationScheme().enableMaxIter();
1025    greedyHillClimbing_.enableMaxIter();
1026    localSearchWithTabuList_.enableMaxIter();
1027    Dag2BN_.enableMaxIter();
1028  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
GreedyHillClimbing greedyHillClimbing_
the greedy hill climbing algorithm
void enableMaxIter()
Enable stopping criterion on max iterations.
DAG2BNLearner Dag2BN_
the parametric EM
LocalSearchWithTabuList localSearchWithTabuList_
the local search with tabu list algorithm

◆ enableMaxTime()

void gum::learning::genericBNLearner::enableMaxTime ( )
inlinevirtualinherited

Enable stopping criterion on timeout. If the criterion was disabled it will be enabled.

Exceptions
OutOfBounds if timeout <= 0.0; timeout is a time in seconds (double).

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1075 of file genericBNLearner.h.

1075  {
1076    algoK2_.approximationScheme().enableMaxTime();
1077    greedyHillClimbing_.enableMaxTime();
1078    localSearchWithTabuList_.enableMaxTime();
1079    Dag2BN_.enableMaxTime();
1080  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
GreedyHillClimbing greedyHillClimbing_
the greedy hill climbing algorithm
DAG2BNLearner Dag2BN_
the parametric EM
LocalSearchWithTabuList localSearchWithTabuList_
the local search with tabu list algorithm
void enableMaxTime()
Enable stopping criterion on timeout.

◆ enableMinEpsilonRate()

void gum::learning::genericBNLearner::enableMinEpsilonRate ( )
inlinevirtualinherited

Enable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 979 of file genericBNLearner.h.

979  {
980    algoK2_.approximationScheme().enableMinEpsilonRate();
981    greedyHillClimbing_.enableMinEpsilonRate();
982    localSearchWithTabuList_.enableMinEpsilonRate();
983    Dag2BN_.enableMinEpsilonRate();
984  };
ApproximationScheme & approximationScheme()
returns the approximation policy of the learning algorithm
void enableMinEpsilonRate()
Enable stopping criterion on epsilon rate.
GreedyHillClimbing greedyHillClimbing_
the greedy hill climbing algorithm
DAG2BNLearner Dag2BN_
the parametric EM
LocalSearchWithTabuList localSearchWithTabuList_
the local search with tabu list algorithm

◆ epsilon()

double gum::learning::genericBNLearner::epsilon ( ) const
inlinevirtualinherited

Get the value of epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 918 of file genericBNLearner.h.

918  {
919  if (currentAlgorithm_ != nullptr)
920  return currentAlgorithm_->epsilon();
921  else
922  GUM_ERROR(FatalError, "No chosen algorithm for learning")
923  }
const ApproximationScheme * currentAlgorithm_
double epsilon() const
Returns the value of epsilon.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ eraseForbiddenArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const Arc arc)
inherited

Definition at line 318 of file genericBNLearner_inl.h.

318  {
319    constraintForbiddenArcs_.eraseArc(arc);
320  }
void eraseArc(const Arc &arc)
remove a forbidden arc
StructuralConstraintForbiddenArcs constraintForbiddenArcs_
the constraint on forbidden arcs

◆ eraseForbiddenArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 328 of file genericBNLearner_inl.h.

328  {
329  eraseForbiddenArc(Arc(tail, head));
330  }

◆ eraseForbiddenArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 339 of file genericBNLearner_inl.h.

340  {
341  eraseForbiddenArc(Arc(idFromName(tail), idFromName(head)));
342  }
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name

◆ eraseMandatoryArc() [1/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const Arc arc)
inherited

Definition at line 355 of file genericBNLearner_inl.h.

355  {
356    constraintMandatoryArcs_.eraseArc(arc);
357  }
StructuralConstraintMandatoryArcs constraintMandatoryArcs_
the constraint on mandatory arcs
void eraseArc(const Arc &arc)
remove a forbidden arc

◆ eraseMandatoryArc() [2/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 377 of file genericBNLearner_inl.h.

377  {
378  eraseMandatoryArc(Arc(tail, head));
379  }

◆ eraseMandatoryArc() [3/3]

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 366 of file genericBNLearner_inl.h.

367  {
368  eraseMandatoryArc(Arc(idFromName(tail), idFromName(head)));
369  }
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name

◆ erasePossibleEdge() [1/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const Edge edge)
inherited

Definition at line 281 of file genericBNLearner_inl.h.

281  {
282    constraintPossibleEdges_.eraseEdge(edge);
283  }
StructuralConstraintPossibleEdges constraintPossibleEdges_
the constraint on possible Edges
void eraseEdge(const Edge &edge)
remove a forbidden arc

◆ erasePossibleEdge() [2/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 291 of file genericBNLearner_inl.h.

291  {
292  erasePossibleEdge(Edge(tail, head));
293  }
void erasePossibleEdge(const Edge &edge)

◆ erasePossibleEdge() [3/3]

INLINE void gum::learning::genericBNLearner::erasePossibleEdge ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 302 of file genericBNLearner_inl.h.

303  {
304  erasePossibleEdge(Edge(idFromName(tail), idFromName(head)));
305  }
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name
void erasePossibleEdge(const Edge &edge)

◆ G2() [1/2]

std::pair< double, double > gum::learning::genericBNLearner::G2 ( const NodeId  id1,
const NodeId  id2,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the G2 test in the database.

Parameters
id1	first variable
id2	second variable
knowing	list of observed variables
Returns
a std::pair<double,double>

Definition at line 919 of file genericBNLearner.cpp.

921  {
922    createApriori_();
923    IndepTestG2<> g2score(scoreDatabase_.parser(), *apriori_, databaseRanges());
924    return g2score.statistics(id1, id2, knowing);
925  }
the class for computing G2 independence test scores
Definition: indepTestG2.h:47
std::pair< double, double > statistics(NodeId var1, NodeId var2, const std::vector< NodeId, ALLOC< NodeId > > &rhs_ids={})
get the pair <G2statistic,pvalue> for a test var1 indep var2 given rhs_ids
const std::vector< std::pair< std::size_t, std::size_t > > & databaseRanges() const
returns the current database rows' ranges used for learning
void createApriori_()
create the apriori used for learning
DBRowGeneratorParser & parser()
returns the parser for the database
Database scoreDatabase_
the database to be used by the scores and parameter estimators
Apriori * apriori_
the apriori used

◆ G2() [2/2]

std::pair< double, double > gum::learning::genericBNLearner::G2 ( const std::string &  name1,
const std::string &  name2,
const std::vector< std::string > &  knowing = {} 
)
inherited

Return the <statistic,pvalue> pair for the G2 test in the database.

Parameters
name1	first variable
name2	second variable
knowing	list of observed variables
Returns
a std::pair<double,double>

Definition at line 927 of file genericBNLearner.cpp.

929  {
930  std::vector< NodeId > knowingIds;
931  std::transform(knowing.begin(),
932  knowing.end(),
933  std::back_inserter(knowingIds),
934  [this](const std::string& c) -> NodeId { return this->idFromName(c); });
935  return G2(idFromName(name1), idFromName(name2), knowingIds);
936  }
std::pair< double, double > G2(const NodeId id1, const NodeId id2, const std::vector< NodeId > &knowing={})
Return the <statistic,pvalue> pair for for G2 test in the database.
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name
Size NodeId
Type for node ids.
Definition: graphElements.h:97

◆ getAprioriType_()

INLINE const std::string & gum::learning::genericBNLearner::getAprioriType_ ( ) const
protectedinherited

returns the type (as a string) of a given apriori

Definition at line 447 of file genericBNLearner_inl.h.

447  {
448    switch (aprioriType_) {
449      case AprioriType::NO_APRIORI:
450        return AprioriNoAprioriType::type;
451 
452      case AprioriType::SMOOTHING:
453        return AprioriSmoothingType::type;
454 
455      case AprioriType::DIRICHLET_FROM_DATABASE:
456        return AprioriDirichletType::type;
457 
458      case AprioriType::BDEU:
459        return AprioriBDeuType::type;
460 
461      default:
462        GUM_ERROR(OperationNotAllowed,
463                  "genericBNLearner getAprioriType does "
464                  "not support yet this apriori");
465    }
466  }
static const std::string type
Definition: aprioriTypes.h:42
static const std::string type
Definition: aprioriTypes.h:47
static const std::string type
Definition: aprioriTypes.h:52
AprioriType aprioriType_
the a priori selected for the score and parameters
static const std::string type
Definition: aprioriTypes.h:37
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ hasMissingValues()

INLINE bool gum::learning::genericBNLearner::hasMissingValues ( ) const
inherited

returns true if the learner's database has missing values

Definition at line 262 of file genericBNLearner_inl.h.

262  {
263    return scoreDatabase_.databaseTable().hasMissingValues();
264  }
bool hasMissingValues() const
indicates whether the database contains some missing values
const DatabaseTable & databaseTable() const
returns the internal database table
Database scoreDatabase_
the database to be used by the scores and parameter estimators

◆ history()

const std::vector< double >& gum::learning::genericBNLearner::history ( ) const
inlinevirtualinherited
Exceptions
OperationNotAllowed if the scheme was not performed or verbosity=false

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1145 of file genericBNLearner.h.

1145  {
1146  if (currentAlgorithm_ != nullptr)
1147  return currentAlgorithm_->history();
1148  else
1149  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1150  }
const ApproximationScheme * currentAlgorithm_
const std::vector< double > & history() const
Returns the scheme history.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ idFromName()

INLINE NodeId gum::learning::genericBNLearner::idFromName ( const std::string &  var_name) const
inherited

returns the node id corresponding to a variable name

Exceptions
MissingVariableInDatabase if a variable of the BN is not found in the database.

Definition at line 128 of file genericBNLearner_inl.h.

128  {
129  return scoreDatabase_.idFromName(var_name);
130  }
Database scoreDatabase_
the database to be used by the scores and parameter estimators
NodeId idFromName(const std::string &var_name) const
returns the node id corresponding to a variable name

◆ initialDAG()

INLINE DAG gum::learning::genericBNLearner::initialDAG ( )
inherited

returns the initial DAG structure

Definition at line 158 of file genericBNLearner_inl.h.

158 { return initialDag_; }
DAG initialDag_
an initial DAG given to learners

◆ isCSVFileName_()

void gum::learning::genericBNLearner::isCSVFileName_ ( const std::string &  filename)
staticprotectedinherited

checks whether the extension of a CSV filename is correct

Definition at line 393 of file genericBNLearner.cpp.

393  {
394  // get the extension of the file
395  Size filename_size = Size(filename.size());
396 
397  if (filename_size < 4) {
398  GUM_ERROR(FormatNotFound,
399  "genericBNLearner could not determine the "
400  "file type of the database");
401  }
402 
403  std::string extension = filename.substr(filename.size() - 4);
404  std::transform(extension.begin(), extension.end(), extension.begin(), ::tolower);
405 
406  if (extension != ".csv") {
407  GUM_ERROR(OperationNotAllowed,
408  "genericBNLearner does not support yet this type of database file");
409  }
410  }
std::size_t Size
In aGrUM, hashed values are unsigned long int.
Definition: types.h:47
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ isEnabledEpsilon()

bool gum::learning::genericBNLearner::isEnabledEpsilon ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on epsilon is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 943 of file genericBNLearner.h.

943  {
944    if (currentAlgorithm_ != nullptr)
945      return currentAlgorithm_->isEnabledEpsilon();
946    else
947      GUM_ERROR(FatalError, "No chosen algorithm for learning")
948  }
const ApproximationScheme * currentAlgorithm_
bool isEnabledEpsilon() const
Returns true if stopping criterion on epsilon is enabled, false otherwise.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ isEnabledMaxIter()

bool gum::learning::genericBNLearner::isEnabledMaxIter ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on max iterations is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1031 of file genericBNLearner.h.

1031  {
1032    if (currentAlgorithm_ != nullptr)
1033      return currentAlgorithm_->isEnabledMaxIter();
1034    else
1035      GUM_ERROR(FatalError, "No chosen algorithm for learning")
1036  }
const ApproximationScheme * currentAlgorithm_
bool isEnabledMaxIter() const
Returns true if stopping criterion on max iterations is enabled, false otherwise. ...
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ isEnabledMaxTime()

bool gum::learning::genericBNLearner::isEnabledMaxTime ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on timeout is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1083 of file genericBNLearner.h.

1083  {
1084    if (currentAlgorithm_ != nullptr)
1085      return currentAlgorithm_->isEnabledMaxTime();
1086    else
1087      GUM_ERROR(FatalError, "No chosen algorithm for learning")
1088  }
const ApproximationScheme * currentAlgorithm_
bool isEnabledMaxTime() const
Returns true if stopping criterion on timeout is enabled, false otherwise.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ isEnabledMinEpsilonRate()

bool gum::learning::genericBNLearner::isEnabledMinEpsilonRate ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on epsilon rate is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 987 of file genericBNLearner.h.

987  {
988    if (currentAlgorithm_ != nullptr)
989      return currentAlgorithm_->isEnabledMinEpsilonRate();
990    else
991      GUM_ERROR(FatalError, "No chosen algorithm for learning")
992  }
const ApproximationScheme * currentAlgorithm_
bool isEnabledMinEpsilonRate() const
Returns true if stopping criterion on epsilon rate is enabled, false otherwise.
#define GUM_ERROR(type, msg)
Definition: exceptions.h:51

◆ latentVariables()

INLINE const std::vector< Arc > gum::learning::genericBNLearner::latentVariables ( ) const
inherited

get the list of arcs hiding latent variables

Exceptions
OperationNotAllowed when 3off2 or MIIC is not the selected algorithm

Definition at line 229 of file genericBNLearner_inl.h.

229  {
230    return algoMiic3off2_.latentVariables();
231  }
const std::vector< Arc > latentVariables() const
get the list of arcs hiding latent variables
Definition: Miic.cpp:929
Miic algoMiic3off2_
the MIIC or 3off2 algorithm

◆ learnBN()

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnBN ( )

learn a Bayes Net from a file (must have read the db before)

◆ learnDAG()

DAG gum::learning::genericBNLearner::learnDAG ( )
inherited

learn a structure from a file (must have read the db before)

Definition at line 625 of file genericBNLearner.cpp.

625  {
626  // create the score and the apriori
627  createApriori_();
628  createScore_();
629 
630  return learnDag_();
631  }
void createScore_()
create the score used for learning
DAG learnDag_()
returns the DAG learnt
void createApriori_()
create the apriori used for learning

◆ learnDag_()

DAG gum::learning::genericBNLearner::learnDag_ ( )
protectedinherited

returns the DAG learnt

Definition at line 660 of file genericBNLearner.cpp.

660  {
661    // check that the database does not contain any missing value
662    if (scoreDatabase_.databaseTable().hasMissingValues()
663        || ((aprioriDatabase_ != nullptr)
664            && aprioriDatabase_->databaseTable().hasMissingValues())) {
665 
666      GUM_ERROR(MissingValueInDatabase,
667                "For the moment, the BNLearner is unable to cope "
668                "with missing values in databases");
669    }
670  // add the mandatory arcs to the initial dag and remove the forbidden ones
671  // from the initial graph
672  DAG init_graph = initialDag_;
673 
674  const ArcSet& mandatory_arcs = constraintMandatoryArcs_.arcs();
675 
676  for (const auto& arc: mandatory_arcs) {
677  if (!init_graph.exists(arc.tail())) init_graph.addNodeWithId(arc.tail());
678 
679  if (!init_graph.exists(arc.head())) init_graph.addNodeWithId(arc.head());
680 
681  init_graph.addArc(arc.tail(), arc.head());
682  }
683 
684  const ArcSet& forbidden_arcs = constraintForbiddenArcs_.arcs();
685 
686  for (const auto& arc: forbidden_arcs) {
687  init_graph.eraseArc(arc);
688  }
689 
690  switch (selectedAlgo_) {
691  // ========================================================================
692  case AlgoType::MIIC:
693  case AlgoType::THREE_OFF_TWO: {
694  BNLearnerListener listener(this, algoMiic3off2_);
695  // create the mixedGraph and the corrected mutual information
696  MixedGraph mgraph = this->prepareMiic3Off2_();
697 
698  return algoMiic3off2_.learnStructure(*mutualInfo_, mgraph);
699  }
700 
701  // ========================================================================
702  case AlgoType::GREEDY_HILL_CLIMBING: {
703  BNLearnerListener listener(this, greedyHillClimbing_);
704  StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
705  StructuralConstraintForbiddenArcs,
706  StructuralConstraintPossibleEdges,
707  StructuralConstraintSliceOrder >
708  gen_constraint;
709  static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
710     = constraintMandatoryArcs_;
711  static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
712     = constraintForbiddenArcs_;
713  static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
714     = constraintPossibleEdges_;
715  static_cast< StructuralConstraintSliceOrder& >(gen_constraint) = constraintSliceOrder_;
716 
717  GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(gen_constraint);
718 
719  StructuralConstraintSetStatic< StructuralConstraintIndegree, StructuralConstraintDAG >
720  sel_constraint;
721  static_cast< StructuralConstraintIndegree& >(sel_constraint) = constraintIndegree_;
722 
723  GraphChangesSelector4DiGraph< decltype(sel_constraint), decltype(op_set) > selector(
724  *score_,
725  sel_constraint,
726  op_set);
727 
728  return greedyHillClimbing_.learnStructure(selector, init_graph);
729  }
730 
731  // ========================================================================
732  case AlgoType::LOCAL_SEARCH_WITH_TABU_LIST: {
733  BNLearnerListener listener(this, localSearchWithTabuList_);
734  StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
735  StructuralConstraintForbiddenArcs,
736  StructuralConstraintPossibleEdges,
737  StructuralConstraintSliceOrder >
738  gen_constraint;
739  static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
740     = constraintMandatoryArcs_;
741  static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
742     = constraintForbiddenArcs_;
743  static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
744     = constraintPossibleEdges_;
745  static_cast< StructuralConstraintSliceOrder& >(gen_constraint) = constraintSliceOrder_;
746 
747  GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(gen_constraint);
748 
749  StructuralConstraintSetStatic< StructuralConstraintTabuList,
750  StructuralConstraintIndegree,
751  StructuralConstraintDAG >
752  sel_constraint;
753  static_cast< StructuralConstraintTabuList& >(sel_constraint) = constraintTabuList_;
754  static_cast< StructuralConstraintIndegree& >(sel_constraint) = constraintIndegree_;
755 
756  GraphChangesSelector4DiGraph< decltype(sel_constraint), decltype(op_set) > selector(
757  *score_,
758  sel_constraint,
759  op_set);
760 
761  return localSearchWithTabuList_.learnStructure(selector, init_graph);
762  }
763 
764  // ========================================================================
765  case AlgoType::K2: {
766  BNLearnerListener listener(this, algoK2_.approximationScheme());
767  StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
768  StructuralConstraintForbiddenArcs,
769  StructuralConstraintPossibleEdges >
770  gen_constraint;
771  static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
772     = constraintMandatoryArcs_;
773  static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint)
774     = constraintForbiddenArcs_;
775  static_cast< StructuralConstraintPossibleEdges& >(gen_constraint)
776     = constraintPossibleEdges_;
777 
778  GraphChangesGenerator4K2< decltype(gen_constraint) > op_set(gen_constraint);
779 
780  // if some mandatory arcs are incompatible with the order, use a DAG
781  // constraint instead of a DiGraph constraint to avoid cycles
782  const ArcSet& mandatory_arcs
783  = static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint).arcs();
784  const Sequence< NodeId >& order = algoK2_.order();
785  bool order_compatible = true;
786 
787  for (const auto& arc: mandatory_arcs) {
788  if (order.pos(arc.tail()) >= order.pos(arc.head())) {
789  order_compatible = false;
790  break;
791  }
792  }
793 
794  if (order_compatible) {
795  StructuralConstraintSetStatic< StructuralConstraintIndegree,
796  StructuralConstraintDiGraph >
797  sel_constraint;
798  static_cast< StructuralConstraintIndegree& >(sel_constraint) = constraintIndegree_;
799 
800  GraphChangesSelector4DiGraph< decltype(sel_constraint), decltype(op_set) > selector(
801  *score_,
802  sel_constraint,
803  op_set);
804 
805  return algoK2_.learnStructure(selector, init_graph);
806  } else {
807  StructuralConstraintSetStatic< StructuralConstraintIndegree, StructuralConstraintDAG >
808  sel_constraint;
809  static_cast< StructuralConstraintIndegree& >(sel_constraint) = constraintIndegree_;
810 
811  GraphChangesSelector4DiGraph< decltype(sel_constraint), decltype(op_set) > selector(
812  *score_,
813  sel_constraint,
814  op_set);
815 
816  return algoK2_.learnStructure(selector, init_graph);
817  }
818  }
819 
820  // ========================================================================
821  default:
822  GUM_ERROR(OperationNotAllowed,
823  "the learnDAG method has not been implemented for this "
824  "learning algorithm");
825  }
826  }

◆ learnMixedStructure()

MixedGraph gum::learning::genericBNLearner::learnMixedStructure ( )
inherited

learn a partial structure from a file (must have read the db before and must have selected miic or 3off2)

Definition at line 607 of file genericBNLearner.cpp.

607  {
608  if ((selectedAlgo_ != AlgoType::MIIC) && (selectedAlgo_ != AlgoType::THREE_OFF_TWO)) {
609  GUM_ERROR(OperationNotAllowed, "Must be using the miic/3off2 algorithm")
610  }
611  // check that the database does not contain any missing value
612  if (scoreDatabase_.databaseTable().hasMissingValues()) {
613  GUM_ERROR(MissingValueInDatabase,
614  "For the moment, the BNLearner is unable to learn "
615  << "structures with missing values in databases");
616  }
617  BNLearnerListener listener(this, algoMiic3off2_);
618 
619  // create the mixedGraph
620  MixedGraph mgraph = this->prepareMiic3Off2_();
621 
622  return algoMiic3off2_.learnMixedStructure(*mutualInfo_, mgraph);
623  }

◆ learnParameters() [1/2]

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnParameters ( const DAG dag,
bool  takeIntoAccountScore = true 
)

learns a BN (its parameters) when its structure is known

Parameters
dag the structure of the Bayesian network
takeIntoAccountScore The DAG passed in argument may have been learnt by structure learning. In this case, if the score used to learn the structure has an implicit apriori (like K2, which has a 1-smoothing apriori), it is important to also take this implicit apriori into account for parameter learning. By default, if a score exists, the parameters are learnt by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is used.
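For illustration, a sketch that fixes the structure first and then re-estimates the CPTs (the file name is a hypothetical placeholder):

```cpp
#include <agrum/BN/learning/BNLearner.h>

int main() {
  gum::learning::BNLearner< double > learner("data.csv");  // hypothetical file
  gum::DAG dag = learner.learnDAG();   // or any user-supplied structure

  // estimate the CPTs for this fixed DAG; by default the score's
  // implicit apriori is also taken into account
  gum::BayesNet< double > bn = learner.learnParameters(dag);
  return 0;
}
```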

◆ learnParameters() [2/2]

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnParameters ( bool  take_into_account_score = true)
Parameters
take_into_account_score The DAG of the BN which was passed in argument to the BNLearner may have been learnt by structure learning. In this case, if the score used to learn the structure has an implicit apriori (like K2, which has a 1-smoothing apriori), it is important to also take this implicit apriori into account for parameter learning. By default, if a score exists, the parameters are learnt by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is used.
Exceptions
MissingVariableInDatabase if a variable of the BN is not found in the database.
UnknownLabelInDatabase if a label found in the database does not correspond to the variable.

◆ logLikelihood() [1/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< NodeId > &  vars,
const std::vector< NodeId > &  knowing = {} 
)
inherited

Return the log-likelihood of vars in the database, conditioned by knowing, for the BNLearner.

Parameters
vars a vector of NodeIds
knowing an optional vector of conditioning NodeIds
Returns
the log-likelihood as a double

Definition at line 938 of file genericBNLearner.cpp.

939  {
940  createApriori_();
941  ScoreLog2Likelihood<> ll2score(scoreDatabase_.parser(),
942  *apriori_,
943  databaseRanges());
944 
945  std::vector< NodeId > total(vars);
946  total.insert(total.end(), knowing.begin(), knowing.end());
947  double LLtotal = ll2score.score(IdCondSet<>(total, false, true));
948  if (knowing.size() == (Size)0) {
949  return LLtotal;
950  } else {
951  double LLknw = ll2score.score(IdCondSet<>(knowing, false, true));
952  return LLtotal - LLknw;
953  }
954  }

◆ logLikelihood() [2/2]

double gum::learning::genericBNLearner::logLikelihood ( const std::vector< std::string > &  vars,
const std::vector< std::string > &  knowing = {} 
)
inherited

Return the log-likelihood of vars in the database, conditioned by knowing, for the BNLearner.

Parameters
vars a vector of variable names
knowing an optional vector of conditioning variable names
Returns
the log-likelihood as a double

Definition at line 956 of file genericBNLearner.cpp.

957  {
958  std::vector< NodeId > ids;
959  std::vector< NodeId > knowingIds;
960 
961  auto mapper = [this](const std::string& c) -> NodeId {
962  return this->idFromName(c);
963  };
964 
965  std::transform(vars.begin(), vars.end(), std::back_inserter(ids), mapper);
966  std::transform(knowing.begin(), knowing.end(), std::back_inserter(knowingIds), mapper);
967 
968  return logLikelihood(ids, knowingIds);
969  }
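The conditional form follows the chain rule LL(X,Y) = LL(X|Y) + LL(Y); a sketch with hypothetical variable names, assuming a BNLearner named learner built on a database containing variables "X" and "Y":

```cpp
// joint, marginal and conditional log-likelihoods
double llXY      = learner.logLikelihood({"X", "Y"});
double llY       = learner.logLikelihood({"Y"});
double llXgivenY = learner.logLikelihood({"X"}, {"Y"});
// up to floating-point rounding, llXY == llY + llXgivenY
```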

◆ maxIter()

Size gum::learning::genericBNLearner::maxIter ( ) const
inlinevirtualinherited
Returns
the criterion on number of iterations

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1008 of file genericBNLearner.h.

1008  {
1009  if (currentAlgorithm_ != nullptr)
1010  return currentAlgorithm_->maxIter();
1011  else
1012  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1013  }

◆ maxTime()

double gum::learning::genericBNLearner::maxTime ( ) const
inlinevirtualinherited

returns the timeout (in seconds)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1053 of file genericBNLearner.h.

1053  {
1054  if (currentAlgorithm_ != nullptr)
1055  return currentAlgorithm_->maxTime();
1056  else
1057  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1058  }

◆ messageApproximationScheme()

INLINE std::string gum::IApproximationSchemeConfiguration::messageApproximationScheme ( ) const
inherited

Returns the approximation scheme message.

Returns
Returns the approximation scheme message.

Definition at line 38 of file IApproximationSchemeConfiguration_inl.h.


38  {
39  std::stringstream s;
40 
41  switch (stateApproximationScheme()) {
42  case ApproximationSchemeSTATE::Continue:
43  s << "in progress";
44  break;
45 
46  case ApproximationSchemeSTATE::Epsilon:
47  s << "stopped with epsilon=" << epsilon();
48  break;
49 
50  case ApproximationSchemeSTATE::Rate:
51  s << "stopped with rate=" << minEpsilonRate();
52  break;
53 
54  case ApproximationSchemeSTATE::Limit:
55  s << "stopped with max iteration=" << maxIter();
56  break;
57 
58  case ApproximationSchemeSTATE::TimeLimit:
59  s << "stopped with timeout=" << maxTime();
60  break;
61 
62  case ApproximationSchemeSTATE::Stopped:
63  s << "stopped on request";
64  break;
65 
66  case ApproximationSchemeSTATE::Undefined:
67  s << "undefined state";
68  break;
69  };
70 
71  return s.str();
72  }

◆ minEpsilonRate()

double gum::learning::genericBNLearner::minEpsilonRate ( ) const
inlinevirtualinherited

Get the value of the minimal epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 964 of file genericBNLearner.h.

964  {
965  if (currentAlgorithm_ != nullptr)
966  return currentAlgorithm_->minEpsilonRate();
967  else
968  GUM_ERROR(FatalError, "No chosen algorithm for learning")
969  }

◆ nameFromId()

INLINE const std::string & gum::learning::genericBNLearner::nameFromId ( NodeId  id) const
inherited

returns the variable name corresponding to a given node id

Definition at line 133 of file genericBNLearner_inl.h.


133  {
134  return scoreDatabase_.nameFromId(id);
135  }

◆ names()

INLINE const std::vector< std::string > & gum::learning::genericBNLearner::names ( ) const
inherited

returns the names of the variables in the database

Definition at line 469 of file genericBNLearner_inl.h.


469  {
470  return scoreDatabase_.names();
471  }

◆ nbCols()

INLINE Size gum::learning::genericBNLearner::nbCols ( ) const
inherited
Returns
the number of cols in the database

Definition at line 501 of file genericBNLearner_inl.h.


501 { return scoreDatabase_.domainSizes().size(); }

◆ nbrIterations()

Size gum::learning::genericBNLearner::nbrIterations ( ) const
inlinevirtualinherited
Exceptions
OperationNotAllowed if the scheme has not been performed

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1137 of file genericBNLearner.h.

1137  {
1138  if (currentAlgorithm_ != nullptr)
1139  return currentAlgorithm_->nbrIterations();
1140  else
1141  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1142  }

◆ nbRows()

INLINE Size gum::learning::genericBNLearner::nbRows ( ) const
inherited
Returns
the number of rows in the database

Definition at line 503 of file genericBNLearner_inl.h.


503 { return scoreDatabase_.databaseTable().size(); }

◆ operator=() [1/2]

template<typename GUM_SCALAR >
BNLearner& gum::learning::BNLearner< GUM_SCALAR >::operator= ( const BNLearner< GUM_SCALAR > &  )

copy operator

◆ operator=() [2/2]

template<typename GUM_SCALAR >
BNLearner& gum::learning::BNLearner< GUM_SCALAR >::operator= ( BNLearner< GUM_SCALAR > &&  )

move operator

◆ periodSize()

Size gum::learning::genericBNLearner::periodSize ( ) const
inlinevirtualinherited

returns the number of samples between two tests of the stopping criteria

Exceptions
OutOfBounds if p<1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1101 of file genericBNLearner.h.

1101  {
1102  if (currentAlgorithm_ != nullptr)
1103  return currentAlgorithm_->periodSize();
1104  else
1105  GUM_ERROR(FatalError, "No chosen algorithm for learning")
1106  }

◆ prepareMiic3Off2_()

MixedGraph gum::learning::genericBNLearner::prepareMiic3Off2_ ( )
protectedinherited

prepares the initial graph for 3off2 or miic

Definition at line 577 of file genericBNLearner.cpp.

577  {
578  // Initialize the mixed graph to the fully connected graph
579  MixedGraph mgraph;
580  for (Size i = 0; i < scoreDatabase_.databaseTable().nbVariables(); ++i) {
581  mgraph.addNodeWithId(i);
582  for (Size j = 0; j < i; ++j) {
583  mgraph.addEdge(j, i);
584  }
585  }
586 
587  // translating the constraints for 3off2 or miic
588  HashTable< std::pair< NodeId, NodeId >, char > initial_marks;
589  const ArcSet& mandatory_arcs = constraintMandatoryArcs_.arcs();
590  for (const auto& arc: mandatory_arcs) {
591  initial_marks.insert({arc.tail(), arc.head()}, '>');
592  }
593 
594  const ArcSet& forbidden_arcs = constraintForbiddenArcs_.arcs();
595  for (const auto& arc: forbidden_arcs) {
596  initial_marks.insert({arc.tail(), arc.head()}, '-');
597  }
598  algoMiic3off2_.addConstraints(initial_marks);
599 
600  // create the mutual entropy object
601  // if ( _mutual_info_ == nullptr) { this->useNMLCorrection(); }
602  createCorrectedMutualInformation_();
603 
604  return mgraph;
605  }

◆ rawPseudoCount() [1/2]

std::vector< double > gum::learning::genericBNLearner::rawPseudoCount ( const std::vector< NodeId > &  vars)
inherited

Return the pseudo-counts of the NodeIds vars in the database in a raw array.

Parameters
vars a vector of NodeIds
Returns
a std::vector<double> containing the contingency table

Definition at line 971 of file genericBNLearner.cpp.

971  {
972  Potential< double > res;
973 
974  createApriori_();
975  PseudoCount<> count(scoreDatabase_.parser(), *apriori_, databaseRanges());
976  return count.get(vars);
977  }
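A sketch with hypothetical variable names, assuming a BNLearner named learner built on a database containing variables "smoking" and "cancer"; the returned vector is the flattened contingency table of the requested variables, with counts augmented by the current apriori:

```cpp
// raw pseudo-counts over the joint domain of the two variables
std::vector< double > counts
   = learner.rawPseudoCount(std::vector< std::string >{"smoking", "cancer"});
// counts.size() equals the product of the two variables' domain sizes
```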

◆ rawPseudoCount() [2/2]

std::vector< double > gum::learning::genericBNLearner::rawPseudoCount ( const std::vector< std::string > &  vars)
inherited

Return the pseudo-counts of vars in the database in a raw array.

Parameters
vars a vector of variable names
Returns
a std::vector<double> containing the contingency table

Definition at line 980 of file genericBNLearner.cpp.

980  {
981  std::vector< NodeId > ids;
982 
983  auto mapper = [this](const std::string& c) -> NodeId {
984  return this->idFromName(c);
985  };
986 
987  std::transform(vars.begin(), vars.end(), std::back_inserter(ids), mapper);
988 
989  return rawPseudoCount(ids);
990  }

◆ readFile_()

DatabaseTable gum::learning::genericBNLearner::readFile_ ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols 
)
staticprotectedinherited

reads a CSV file and returns the corresponding in-memory DatabaseTable

Definition at line 413 of file genericBNLearner.cpp.

414  {
415  // get the extension of the file
416  isCSVFileName_(filename);
417 
418  DBInitializerFromCSV<> initializer(filename);
419 
420  const auto& var_names = initializer.variableNames();
421  const std::size_t nb_vars = var_names.size();
422 
423  DBTranslatorSet<> translator_set;
424  DBTranslator4LabelizedVariable<> translator(missing_symbols);
425  for (std::size_t i = 0; i < nb_vars; ++i) {
426  translator_set.insertTranslator(translator, i);
427  }
428 
429  DatabaseTable<> database(missing_symbols, translator_set);
430  database.setVariableNames(initializer.variableNames());
431  initializer.fillDatabase(database);
432 
433  database.reorder();
434 
435  return database;
436  }

◆ recordWeight()

INLINE double gum::learning::genericBNLearner::recordWeight ( const std::size_t  i) const
inherited

returns the weight of the ith record

Exceptions
OutOfBounds if i is outside the set of indices of the records

Definition at line 148 of file genericBNLearner_inl.h.


148  {
149  return scoreDatabase_.weight(i);
150  }

◆ setCurrentApproximationScheme()

INLINE void gum::learning::genericBNLearner::setCurrentApproximationScheme ( const ApproximationScheme approximationScheme)
inlineinherited

sets the approximation scheme used to distribute the progression signals

Definition at line 884 of file genericBNLearner.h.

884  {
885  currentAlgorithm_ = approximationScheme;
886  }

◆ setDatabaseWeight()

INLINE void gum::learning::genericBNLearner::setDatabaseWeight ( const double  new_weight)
inherited

assign a weight to all the rows of the learning database so that the sum of their weights is equal to new_weight

assign new weight to the rows of the learning database

Definition at line 138 of file genericBNLearner_inl.h.


138  {
139  scoreDatabase_.setDatabaseWeight(new_weight);
140  }
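For instance, to make the whole database weigh as much as a hypothetical sample of 100 records (useful when combining databases of different sizes; the file name is a placeholder):

```cpp
gum::learning::BNLearner< double > learner("data.csv");  // hypothetical file
// every row then has weight 100.0 divided by the number of rows
learner.setDatabaseWeight(100.0);
```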

◆ setEpsilon()

void gum::learning::genericBNLearner::setEpsilon ( double  eps)
inlinevirtualinherited

Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled it will be enabled.

Exceptions
OutOfBounds if eps<0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 910 of file genericBNLearner.h.

910  {
911  algoK2_.approximationScheme().setEpsilon(eps);
912  greedyHillClimbing_.setEpsilon(eps);
913  localSearchWithTabuList_.setEpsilon(eps);
914  Dag2BN_.setEpsilon(eps);
915  };

◆ setForbiddenArcs()

INLINE void gum::learning::genericBNLearner::setForbiddenArcs ( const ArcSet set)
inherited

assign a set of forbidden arcs

Definition at line 308 of file genericBNLearner_inl.h.


308  {
309  constraintForbiddenArcs_.setArcs(set);
310  }
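A sketch with hypothetical variable names "A" and "B", assuming a BNLearner named learner already built on a database containing them:

```cpp
// forbid the arc A -> B in any learnt structure
gum::ArcSet forbidden;
forbidden.insert(gum::Arc(learner.idFromName("A"), learner.idFromName("B")));
learner.setForbiddenArcs(forbidden);
```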

◆ setInitialDAG()

INLINE void gum::learning::genericBNLearner::setInitialDAG ( const DAG dag)
inherited

sets an initial DAG structure

Definition at line 156 of file genericBNLearner_inl.h.


156 { initialDag_ = dag; }
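A sketch: seed the search with a small hand-built DAG, assuming a BNLearner named learner whose database has at least three columns (node ids must match the database columns):

```cpp
gum::DAG init;
for (gum::NodeId i = 0; i < 3; ++i)
  init.addNodeWithId(i);   // nodes for the first three columns
init.addArc(0, 1);         // start the search from the graph 0 -> 1
learner.setInitialDAG(init);
```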

◆ setMandatoryArcs()

INLINE void gum::learning::genericBNLearner::setMandatoryArcs ( const ArcSet set)
inherited

assign a set of mandatory arcs

Definition at line 345 of file genericBNLearner_inl.h.


345  {
346  constraintMandatoryArcs_.setArcs(set);
347  }

◆ setMaxIndegree()

INLINE void gum::learning::genericBNLearner::setMaxIndegree ( Size  max_indegree)
inherited

sets the max indegree

Definition at line 197 of file genericBNLearner_inl.h.


197  {
198  constraintIndegree_.setMaxIndegree(max_indegree);
199  }

◆ setMaxIter()

void gum::learning::genericBNLearner::setMaxIter ( Size  max)
inlinevirtualinherited

stopping criterion on the number of iterations. If the criterion was disabled it will be enabled.

Parameters
maxThe maximum number of iterations
Exceptions
OutOfBounds if max<=1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1000 of file genericBNLearner.h.

{
  greedyHillClimbing_.setMaxIter(max);
  localSearchWithTabuList_.setMaxIter(max);
  Dag2BN_.setMaxIter(max);
};

◆ setMaxTime()

void gum::learning::genericBNLearner::setMaxTime ( double  timeout)
inlinevirtualinherited

stopping criterion on timeout. If the criterion was disabled, it will be enabled.

Exceptions
OutOfBounds: if timeout <= 0.0 (timeout is a time in seconds, given as a double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1045 of file genericBNLearner.h.

{
  greedyHillClimbing_.setMaxTime(timeout);
  localSearchWithTabuList_.setMaxTime(timeout);
  Dag2BN_.setMaxTime(timeout);
}

◆ setMinEpsilonRate()

void gum::learning::genericBNLearner::setMinEpsilonRate ( double  rate)
inlinevirtualinherited

Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled, it will be enabled.

Exceptions
OutOfBounds: if rate < 0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 956 of file genericBNLearner.h.

{
  greedyHillClimbing_.setMinEpsilonRate(rate);
  localSearchWithTabuList_.setMinEpsilonRate(rate);
  Dag2BN_.setMinEpsilonRate(rate);
};

◆ setPeriodSize()

void gum::learning::genericBNLearner::setPeriodSize ( Size  p)
inlinevirtualinherited

sets how many samples occur between two tests of the stopping criteria

Exceptions
OutOfBounds: if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1094 of file genericBNLearner.h.

{
  greedyHillClimbing_.setPeriodSize(p);
  localSearchWithTabuList_.setPeriodSize(p);
  Dag2BN_.setPeriodSize(p);
};
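Taken together, setMaxIter, setMaxTime, setMinEpsilonRate and setPeriodSize configure when an approximation scheme stops. The control logic can be sketched in plain C++ as follows; this is an illustrative stand-in, not aGrUM's actual ApproximationScheme code, and every name in it is made up:

```cpp
#include <cmath>
#include <cstddef>

// Illustrative stand-in for an approximation scheme's stopping test.
// The criteria are only examined every period_size iterations.
struct StoppingCriteria {
  std::size_t max_iter;     // see setMaxIter
  double      max_time;     // seconds, see setMaxTime
  double      min_rate;     // see setMinEpsilonRate
  std::size_t period_size;  // see setPeriodSize

  bool shouldStop(std::size_t iter, double elapsed,
                  double prev_error, double error) const {
    if (iter % period_size != 0) return false;  // not a test point
    if (iter >= max_iter) return true;
    if (elapsed >= max_time) return true;
    // rate of change of |f(t+1) - f(t)|
    if (std::fabs(error - prev_error) < min_rate) return true;
    return false;
  }
};
```

For instance, with period_size = 10 the criteria are only examined every tenth sample, which is exactly the trade-off setPeriodSize controls.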

◆ setPossibleEdges()

INLINE void gum::learning::genericBNLearner::setPossibleEdges ( const EdgeSet &  set)
inherited

assign a set of possible edges

Warning
Once at least one possible edge is defined, all other edges are not possible anymore

Definition at line 267 of file genericBNLearner_inl.h.

{ constraintPossibleEdges_.setEdges(set); }
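The warning above describes a switch in semantics: an empty set of possible edges means "everything allowed", while a non-empty one means "only these". A minimal illustrative model of that rule in plain C++ (not aGrUM's StructuralConstraintPossibleEdges API):

```cpp
#include <algorithm>
#include <set>
#include <utility>

// Toy model of the "possible edges" rule: with an empty set every edge is
// allowed; once one possible edge is declared, all others become forbidden.
struct PossibleEdges {
  std::set<std::pair<int, int>> allowed;

  void addPossibleEdge(int x, int y) {
    allowed.insert({std::min(x, y), std::max(x, y)});  // store undirected
  }
  bool isAllowed(int x, int y) const {
    if (allowed.empty()) return true;  // no constraint declared yet
    return allowed.count({std::min(x, y), std::max(x, y)}) > 0;
  }
};
```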

◆ setPossibleSkeleton()

INLINE void gum::learning::genericBNLearner::setPossibleSkeleton ( const UndiGraph &  skeleton)
inherited

assign a possible skeleton, i.e., the set of possible edges given as an undirected graph

Warning
Once at least one possible edge is defined, all other edges are not possible anymore

Definition at line 271 of file genericBNLearner_inl.h.

{ setPossibleEdges(skeleton.edges()); }

◆ setRecordWeight()

INLINE void gum::learning::genericBNLearner::setRecordWeight ( const std::size_t  i,
const double  weight 
)
inherited

sets the weight of the ith record of the database

assign new weight to the ith row of the learning database

Exceptions
OutOfBounds: if i is outside the set of indices of the records or if the weight is negative

Definition at line 143 of file genericBNLearner_inl.h.

{ scoreDatabase_.setWeight(i, weight); }

◆ setSliceOrder() [1/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const NodeProperty< NodeId > &  slice_order)
inherited

sets a partial order on the nodes

Parameters
slice_order: a NodeProperty giving the rank (priority) of each node in the partial order

Definition at line 382 of file genericBNLearner_inl.h.

{ constraintSliceOrder_ = StructuralConstraintSliceOrder(slice_order); }

◆ setSliceOrder() [2/2]

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const std::vector< std::vector< std::string > > &  slices)
inherited

sets a partial order on the nodes

Parameters
slices: the list of lists of variable names

Definition at line 387 of file genericBNLearner_inl.h.

{
  NodeProperty< NodeId > slice_order;
  NodeId rank = 0;
  for (const auto& slice: slices) {
    for (const auto& name: slice) {
      slice_order.insert(idFromName(name), rank);
    }
    rank++;
  }
  setSliceOrder(slice_order);
}
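Outside aGrUM, the same slice-to-rank conversion can be expressed with a plain map. The sketch below is purely illustrative and only mirrors the loop shown in this overload:

```cpp
#include <map>
#include <string>
#include <vector>

// Assign each variable the index of the slice it appears in, mirroring how
// this overload builds its NodeProperty of ranks.
std::map<std::string, int>
ranksFromSlices(const std::vector<std::vector<std::string>>& slices) {
  std::map<std::string, int> rank;
  int r = 0;
  for (const auto& slice : slices) {
    for (const auto& name : slice) rank[name] = r;
    ++r;
  }
  return rank;
}
```

For example, slices {{"A", "B"}, {"C"}} give A and B rank 0 and C rank 1, so arcs from rank 1 back to rank 0 can be ruled out.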

◆ setVerbosity()

void gum::learning::genericBNLearner::setVerbosity ( bool  v)
inlinevirtualinherited

sets the verbosity on (true) or off (false)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1111 of file genericBNLearner.h.

{
  greedyHillClimbing_.setVerbosity(v);
  localSearchWithTabuList_.setVerbosity(v);
  Dag2BN_.setVerbosity(v);
};

◆ state()

template<typename GUM_SCALAR >
std::vector< std::tuple< std::string, std::string, std::string > > gum::learning::BNLearner< GUM_SCALAR >::state ( ) const
Returns
a representation of the state of the learner in the form vector<key,value,comment>

◆ stateApproximationScheme()

ApproximationSchemeSTATE gum::learning::genericBNLearner::stateApproximationScheme ( ) const
inlinevirtualinherited

returns the approximation scheme state

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1129 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->stateApproximationScheme();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}

◆ toString()

template<typename GUM_SCALAR >
std::string gum::learning::BNLearner< GUM_SCALAR >::toString ( ) const
Returns
a string representation of this BNLearner's current features.

◆ use3off2()

INLINE void gum::learning::genericBNLearner::use3off2 ( )
inherited

indicate that we wish to use 3off2

Definition at line 202 of file genericBNLearner_inl.h.

{
  selectedAlgo_ = AlgoType::MIIC_THREE_OFF_TWO;
  algoMiic3off2_.set3of2Behaviour();
}

◆ useAprioriBDeu()

INLINE void gum::learning::genericBNLearner::useAprioriBDeu ( double  weight = 1)
inherited

use the BDeu apriori

The BDeu apriori adds weight to all the cells of the counting tables. In other words, it adds weight rows to the database, each with equally probable values.

Definition at line 436 of file genericBNLearner_inl.h.

{
  if (weight < 0) { GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive") }

  aprioriType_ = AprioriType::BDEU;
  _setAprioriWeight_(weight);

  checkScoreAprioriCompatibility();
}
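Numerically, adding weight equally-probable virtual rows amounts to spreading the weight uniformly over the cells of each counting table. The following sketch only illustrates that arithmetic; it is not how aGrUM implements the apriori:

```cpp
#include <vector>

// Spread a total pseudo-count `weight` uniformly over the cells of a
// counting table, as a BDeu-style correction does.
std::vector<double> addBDeuWeight(std::vector<double> counts, double weight) {
  const double per_cell = weight / static_cast<double>(counts.size());
  for (auto& c : counts) c += per_cell;
  return counts;
}
```

With counts {3, 1} and weight 1, each of the two cells receives an extra 0.5, which regularizes the estimated conditional probabilities without overwhelming the data.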

◆ useAprioriDirichlet()

INLINE void gum::learning::genericBNLearner::useAprioriDirichlet ( const std::string &  filename,
double  weight = 1 
)
inherited

use the Dirichlet apriori

Definition at line 424 of file genericBNLearner_inl.h.

{
  if (weight < 0) { GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive") }

  aprioriDbname_ = filename;
  aprioriType_   = AprioriType::DIRICHLET_FROM_DATABASE;
  _setAprioriWeight_(weight);

  checkScoreAprioriCompatibility();
}

◆ useAprioriSmoothing()

INLINE void gum::learning::genericBNLearner::useAprioriSmoothing ( double  weight = 1)
inherited

use the apriori smoothing

Parameters
weight: the weight to assign to the smoothing; if omitted, the current weight of the genericBNLearner is used

Definition at line 414 of file genericBNLearner_inl.h.

{
  if (weight < 0) { GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive") }

  aprioriType_ = AprioriType::SMOOTHING;
  _setAprioriWeight_(weight);

  checkScoreAprioriCompatibility();
}

◆ useCrossValidationFold()

std::pair< std::size_t, std::size_t > gum::learning::genericBNLearner::useCrossValidationFold ( const std::size_t  learning_fold,
const std::size_t  k_fold 
)
inherited

sets the ranges of rows to be used for cross-validation learning

When applied on (x,k), the method indicates to the subsequent learnings that they should be performed on the xth fold in a k-fold cross-validation context. For instance, if a database has 1000 rows, and if we perform a 10-fold cross-validation, then, the first learning fold (learning_fold=0) corresponds to rows interval [100,1000) and the test dataset corresponds to [0,100). The second learning fold (learning_fold=1) is [0,100) U [200,1000) and the corresponding test dataset is [100,200).

Parameters
learning_fold: a number indicating the set of rows used for learning. If N denotes the size of the database and k_fold the number of folds in the cross-validation, then the set of rows used for testing is [learning_fold * N / k_fold, (learning_fold+1) * N / k_fold) and the learning database is its complement in the database
k_fold: the value of "k" in k-fold cross-validation
Returns
a pair [x,y) of rows' indices that corresponds to the indices of rows in the original database that constitute the test dataset
Exceptions
OutOfBounds: raised if k_fold equals 0, if learning_fold is greater than or equal to k_fold, or if k_fold is greater than or equal to the size of the database

Definition at line 858 of file genericBNLearner.cpp.

{
  if (k_fold == 0) { GUM_ERROR(OutOfBounds, "K-fold cross validation with k=0 is forbidden") }

  if (learning_fold >= k_fold) {
    GUM_ERROR(OutOfBounds,
              "In " << k_fold << "-fold cross validation, the learning "
                    << "fold should be strictly lower than " << k_fold
                    << " but, here, it is equal to " << learning_fold);
  }

  const std::size_t db_size = scoreDatabase_.databaseTable().nbRows();
  if (k_fold >= db_size) {
    GUM_ERROR(OutOfBounds,
              "In " << k_fold << "-fold cross validation, the database's "
                    << "size should be strictly greater than " << k_fold
                    << " but, here, the database has only " << db_size << " rows");
  }

  // create the ranges of rows of the test database
  const std::size_t foldSize   = db_size / k_fold;
  const std::size_t unfold_deb = learning_fold * foldSize;
  const std::size_t unfold_end = unfold_deb + foldSize;

  ranges_.clear();
  if (learning_fold == std::size_t(0)) {
    ranges_.push_back(std::pair< std::size_t, std::size_t >(unfold_end, db_size));
  } else {
    ranges_.push_back(std::pair< std::size_t, std::size_t >(std::size_t(0), unfold_deb));

    if (learning_fold != k_fold - 1) {
      ranges_.push_back(std::pair< std::size_t, std::size_t >(unfold_end, db_size));
    }
  }

  return std::pair< std::size_t, std::size_t >(unfold_deb, unfold_end);
}

◆ useDatabaseRanges()

template<template< typename > class XALLOC>
void gum::learning::genericBNLearner::useDatabaseRanges ( const std::vector< std::pair< std::size_t, std::size_t >, XALLOC< std::pair< std::size_t, std::size_t > > > &  new_ranges)
inherited

use a new set of database rows' ranges to perform learning

Parameters
new_ranges: a set of pairs {(X1,Y1),...,(Xn,Yn)} of database row indices. The subsequent learnings are then performed only on the union of the rows [Xi,Yi), i in {1,...,n}. This is useful, e.g., when performing cross-validation tasks, in which part of the database should be ignored. An empty set of ranges is equivalent to an interval [X,Y) ranging over the whole database

Definition at line 98 of file genericBNLearner_tpl.h.

{
  // use a score to detect whether the ranges are ok
  ScoreLog2Likelihood<> score(scoreDatabase_.parser(), *noApriori_);
  score.setRanges(new_ranges);
  ranges_ = score.ranges();
}
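The union-of-ranges semantics can be checked row by row. Here is a small illustrative predicate, not aGrUM code:

```cpp
#include <cstddef>
#include <utility>
#include <vector>

// A row is used for learning iff it falls into one of the half-open
// ranges; an empty set of ranges selects the whole database.
bool rowIsUsed(std::size_t row,
               const std::vector<std::pair<std::size_t, std::size_t>>& ranges) {
  if (ranges.empty()) return true;
  for (const auto& r : ranges)
    if (row >= r.first && row < r.second) return true;
  return false;
}
```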

◆ useEM()

INLINE void gum::learning::genericBNLearner::useEM ( const double  epsilon)
inherited

use the EM algorithm to learn parameters

if epsilon=0, EM is not used

Definition at line 259 of file genericBNLearner_inl.h.

{ epsilonEM_ = epsilon; }

◆ useGreedyHillClimbing()

INLINE void gum::learning::genericBNLearner::useGreedyHillClimbing ( )
inherited

indicate that we wish to use a greedy hill climbing algorithm

Definition at line 246 of file genericBNLearner_inl.h.


◆ useK2() [1/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const Sequence< NodeId > &  order)
inherited

indicate that we wish to use K2

Definition at line 234 of file genericBNLearner_inl.h.

{
  selectedAlgo_ = AlgoType::K2;
  algoK2_.setOrder(order);
}

◆ useK2() [2/2]

INLINE void gum::learning::genericBNLearner::useK2 ( const std::vector< NodeId > &  order)
inherited

indicate that we wish to use K2

Definition at line 240 of file genericBNLearner_inl.h.

{
  selectedAlgo_ = AlgoType::K2;
  algoK2_.setOrder(order);
}

◆ useLocalSearchWithTabuList()

INLINE void gum::learning::genericBNLearner::useLocalSearchWithTabuList ( Size  tabu_size = 100,
Size  nb_decrease = 2 
)
inherited

indicate that we wish to use a local search with tabu list

Parameters
tabu_size: the size of the tabu list
nb_decrease: the maximum number of consecutive score-decreasing changes that may be applied

Definition at line 251 of file genericBNLearner_inl.h.

{
  selectedAlgo_ = AlgoType::LOCAL_SEARCH_WITH_TABU_LIST;
  constraintTabuList_.setTabuListSize(tabu_size);
  nbDecreasingChanges_ = nb_decrease;
  localSearchWithTabuList_.setMaxNbDecreasingChanges(nb_decrease);
}
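The nb_decrease parameter bounds how many consecutive score-decreasing changes the search tolerates before stopping. A toy counter makes the rule concrete; this is illustrative, not the LocalSearchWithTabuList implementation:

```cpp
#include <cstddef>

// Stop the local search once more than `nb_decrease` consecutive
// score-decreasing changes have been applied.
struct DecreasingChangeGuard {
  std::size_t nb_decrease;  // maximum consecutive decreases allowed
  std::size_t current = 0;  // consecutive decreases seen so far

  // Returns true if the search may continue after a score change of `delta`.
  bool onChange(double delta) {
    if (delta >= 0) {  // an improvement resets the counter
      current = 0;
      return true;
    }
    return ++current <= nb_decrease;
  }
};
```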

◆ useMDLCorrection()

INLINE void gum::learning::genericBNLearner::useMDLCorrection ( )
inherited

indicate that we wish to use the MDL correction for 3off2 and MIIC

Exceptions
OperationNotAllowed: when 3off2 is not the selected algorithm

Definition at line 219 of file genericBNLearner_inl.h.


◆ useMIIC()

INLINE void gum::learning::genericBNLearner::useMIIC ( )
inherited

indicate that we wish to use MIIC

Definition at line 208 of file genericBNLearner_inl.h.

{
  selectedAlgo_ = AlgoType::MIIC_THREE_OFF_TWO;
  algoMiic3off2_.setMiicBehaviour();
}

◆ useNMLCorrection()

INLINE void gum::learning::genericBNLearner::useNMLCorrection ( )
inherited

indicate that we wish to use the NML correction for 3off2 and MIIC

Exceptions
OperationNotAllowed: when 3off2 is not the selected algorithm

Definition at line 214 of file genericBNLearner_inl.h.


◆ useNoApriori()

INLINE void gum::learning::genericBNLearner::useNoApriori ( )
inherited

use no apriori

Definition at line 408 of file genericBNLearner_inl.h.

{
  aprioriType_ = AprioriType::NO_APRIORI;
  checkScoreAprioriCompatibility();
}

◆ useNoCorrection()

INLINE void gum::learning::genericBNLearner::useNoCorrection ( )
inherited

indicate that we wish to use the NoCorr correction for 3off2 and MIIC

Exceptions
OperationNotAllowed: when 3off2 is not the selected algorithm

Definition at line 224 of file genericBNLearner_inl.h.


◆ useScoreAIC()

INLINE void gum::learning::genericBNLearner::useScoreAIC ( )
inherited

indicate that we wish to use an AIC score

Definition at line 161 of file genericBNLearner_inl.h.

{
  scoreType_ = ScoreType::AIC;
  checkScoreAprioriCompatibility();
}

◆ useScoreBD()

INLINE void gum::learning::genericBNLearner::useScoreBD ( )
inherited

indicate that we wish to use a BD score

Definition at line 167 of file genericBNLearner_inl.h.

{
  scoreType_ = ScoreType::BD;
  checkScoreAprioriCompatibility();
}

◆ useScoreBDeu()

INLINE void gum::learning::genericBNLearner::useScoreBDeu ( )
inherited

indicate that we wish to use a BDeu score

Definition at line 173 of file genericBNLearner_inl.h.

{
  scoreType_ = ScoreType::BDeu;
  checkScoreAprioriCompatibility();
}

◆ useScoreBIC()

INLINE void gum::learning::genericBNLearner::useScoreBIC ( )
inherited

indicate that we wish to use a BIC score

Definition at line 179 of file genericBNLearner_inl.h.

{
  scoreType_ = ScoreType::BIC;
  checkScoreAprioriCompatibility();
}

◆ useScoreK2()

INLINE void gum::learning::genericBNLearner::useScoreK2 ( )
inherited

indicate that we wish to use a K2 score

Definition at line 185 of file genericBNLearner_inl.h.

{
  scoreType_ = ScoreType::K2;
  checkScoreAprioriCompatibility();
}

◆ useScoreLog2Likelihood()

INLINE void gum::learning::genericBNLearner::useScoreLog2Likelihood ( )
inherited

indicate that we wish to use a Log2Likelihood score

Definition at line 191 of file genericBNLearner_inl.h.

{
  scoreType_ = ScoreType::LOG2LIKELIHOOD;
  checkScoreAprioriCompatibility();
}

◆ verbosity()

bool gum::learning::genericBNLearner::verbosity ( ) const
inlinevirtualinherited

returns the current verbosity

Implements gum::IApproximationSchemeConfiguration.

Definition at line 1118 of file genericBNLearner.h.

{
  if (currentAlgorithm_ != nullptr)
    return currentAlgorithm_->verbosity();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning")
}

Member Data Documentation

◆ algoK2_

K2 gum::learning::genericBNLearner::algoK2_
protectedinherited

the K2 algorithm

Definition at line 801 of file genericBNLearner.h.

◆ algoMiic3off2_

Miic gum::learning::genericBNLearner::algoMiic3off2_
protectedinherited

the MIIC or 3off2 algorithm

Definition at line 804 of file genericBNLearner.h.

◆ apriori_

Apriori* gum::learning::genericBNLearner::apriori_ {nullptr}
protectedinherited

the apriori used

Definition at line 772 of file genericBNLearner.h.

◆ aprioriDatabase_

Database* gum::learning::genericBNLearner::aprioriDatabase_ {nullptr}
protectedinherited

the database used by the Dirichlet a priori

Definition at line 826 of file genericBNLearner.h.

◆ aprioriDbname_

std::string gum::learning::genericBNLearner::aprioriDbname_
protectedinherited

the filename for the Dirichlet a priori, if any

Definition at line 829 of file genericBNLearner.h.

◆ aprioriType_

AprioriType gum::learning::genericBNLearner::aprioriType_ {AprioriType::NO_APRIORI}
protectedinherited

the a priori selected for the score and parameters

Definition at line 769 of file genericBNLearner.h.

◆ aprioriWeight_

double gum::learning::genericBNLearner::aprioriWeight_ {1.0f}
protectedinherited

the weight of the apriori

Definition at line 777 of file genericBNLearner.h.

◆ constraintForbiddenArcs_

StructuralConstraintForbiddenArcs gum::learning::genericBNLearner::constraintForbiddenArcs_
protectedinherited

the constraint on forbidden arcs

Definition at line 789 of file genericBNLearner.h.

◆ constraintIndegree_

StructuralConstraintIndegree gum::learning::genericBNLearner::constraintIndegree_
protectedinherited

the constraint for indegrees

Definition at line 783 of file genericBNLearner.h.

◆ constraintMandatoryArcs_

StructuralConstraintMandatoryArcs gum::learning::genericBNLearner::constraintMandatoryArcs_
protectedinherited

the constraint on mandatory arcs

Definition at line 795 of file genericBNLearner.h.

◆ constraintPossibleEdges_

StructuralConstraintPossibleEdges gum::learning::genericBNLearner::constraintPossibleEdges_
protectedinherited

the constraint on possible Edges

Definition at line 792 of file genericBNLearner.h.

◆ constraintSliceOrder_

StructuralConstraintSliceOrder gum::learning::genericBNLearner::constraintSliceOrder_
protectedinherited

the constraint for 2TBNs

Definition at line 780 of file genericBNLearner.h.

◆ constraintTabuList_

StructuralConstraintTabuList gum::learning::genericBNLearner::constraintTabuList_
protectedinherited

the constraint for tabu lists

Definition at line 786 of file genericBNLearner.h.

◆ currentAlgorithm_

const ApproximationScheme* gum::learning::genericBNLearner::currentAlgorithm_ {nullptr}
protectedinherited

Definition at line 842 of file genericBNLearner.h.

◆ Dag2BN_

DAG2BNLearner gum::learning::genericBNLearner::Dag2BN_
protectedinherited

the parametric EM

Definition at line 811 of file genericBNLearner.h.

◆ epsilonEM_

double gum::learning::genericBNLearner::epsilonEM_ {0.0}
protectedinherited

epsilon for EM; if epsilon = 0.0, EM is not used

Definition at line 763 of file genericBNLearner.h.

◆ filename_

std::string gum::learning::genericBNLearner::filename_
protectedinherited

the filename database

Definition at line 836 of file genericBNLearner.h.

◆ greedyHillClimbing_

GreedyHillClimbing gum::learning::genericBNLearner::greedyHillClimbing_
protectedinherited

the greedy hill climbing algorithm

Definition at line 814 of file genericBNLearner.h.

◆ initialDag_

DAG gum::learning::genericBNLearner::initialDag_
protectedinherited

an initial DAG given to learners

Definition at line 833 of file genericBNLearner.h.

◆ kmode3Off2_

CorrectedMutualInformation<>::KModeTypes gum::learning::genericBNLearner::kmode3Off2_
protectedinherited

the penalty used in 3off2

Definition at line 807 of file genericBNLearner.h.

◆ localSearchWithTabuList_

LocalSearchWithTabuList gum::learning::genericBNLearner::localSearchWithTabuList_
protectedinherited

the local search with tabu list algorithm

Definition at line 817 of file genericBNLearner.h.

◆ mutualInfo_

CorrectedMutualInformation* gum::learning::genericBNLearner::mutualInfo_ {nullptr}
protectedinherited

the selected correction for 3off2 and miic

Definition at line 766 of file genericBNLearner.h.

◆ nbDecreasingChanges_

Size gum::learning::genericBNLearner::nbDecreasingChanges_ {2}
protectedinherited

Definition at line 839 of file genericBNLearner.h.

◆ noApriori_

AprioriNoApriori* gum::learning::genericBNLearner::noApriori_ {nullptr}
protectedinherited

Definition at line 774 of file genericBNLearner.h.

◆ onProgress

Signaler3< Size, double, double > gum::IApproximationSchemeConfiguration::onProgress
inherited

Progression, error and time.

Definition at line 58 of file IApproximationSchemeConfiguration.h.

◆ onStop

Signaler1< std::string > gum::IApproximationSchemeConfiguration::onStop
inherited

Criteria messageApproximationScheme.

Definition at line 61 of file IApproximationSchemeConfiguration.h.

◆ paramEstimatorType_

ParamEstimatorType gum::learning::genericBNLearner::paramEstimatorType_ {ParamEstimatorType::ML}
protectedinherited

the type of the parameter estimator

Definition at line 760 of file genericBNLearner.h.

◆ ranges_

std::vector< std::pair< std::size_t, std::size_t > > gum::learning::genericBNLearner::ranges_
protectedinherited

the set of rows' ranges within the database in which learning is done

Definition at line 823 of file genericBNLearner.h.

◆ score_

Score* gum::learning::genericBNLearner::score_ {nullptr}
protectedinherited

the score used

Definition at line 757 of file genericBNLearner.h.

◆ scoreDatabase_

Database gum::learning::genericBNLearner::scoreDatabase_
protectedinherited

the database to be used by the scores and parameter estimators

Definition at line 820 of file genericBNLearner.h.

◆ scoreType_

ScoreType gum::learning::genericBNLearner::scoreType_ {ScoreType::BDeu}
protectedinherited

the score selected for learning

Definition at line 754 of file genericBNLearner.h.

◆ selectedAlgo_

AlgoType gum::learning::genericBNLearner::selectedAlgo_ {AlgoType::GREEDY_HILL_CLIMBING}
protectedinherited

the selected learning algorithm

Definition at line 798 of file genericBNLearner.h.


The documentation for this class was generated from the following file: