aGrUM  0.13.2
gum::learning::BNLearner< GUM_SCALAR > Class Template Reference

A pack of learning algorithms that can easily be used. More...

#include <BNLearner.h>


Public Attributes

Signaler3< Size, double, double > onProgress
 Progression, error and time. More...
 
Signaler1< std::string > onStop
 Criteria messageApproximationScheme. More...
 

Public Member Functions

BayesNet< GUM_SCALAR > learnBN ()
 learn a Bayes Net from a file (must have read the db before) More...
 
BayesNet< GUM_SCALAR > learnParameters (const DAG &dag, bool take_into_account_score=true)
 learns a BN (its parameters) when its structure is known More...
 
BayesNet< GUM_SCALAR > learnParameters (const BayesNet< GUM_SCALAR > &bn, bool take_into_account_score=true)
 learns a BN (its parameters) when its structure is known More...
 
void setMandatoryArcs (const ArcSet &set)
 assign a set of mandatory arcs More...
 
Constructors / Destructors
 BNLearner (const std::string &filename, const std::vector< std::string > &missing_symbols={"?"})
 default constructor More...
 
 BNLearner (const DatabaseTable<> &db)
 default constructor More...
 
 BNLearner (const std::string &filename, const gum::BayesNet< GUM_SCALAR > &src, const std::vector< std::string > &missing_symbols={"?"})
 Read the database file for the score / parameter estimation and var names. More...
 
 BNLearner (const BNLearner &)
 copy constructor More...
 
 BNLearner (BNLearner &&)
 move constructor More...
 
virtual ~BNLearner ()
 destructor More...
 
Operators
BNLearner & operator= (const BNLearner &)
 copy operator More...
 
BNLearner & operator= (BNLearner &&)
 move operator More...
 
Accessors / Modifiers
DAG learnDAG ()
 learn a structure from a file (must have read the db before) More...
 
MixedGraph learnMixedStructure ()
 learn a partial structure from a file (must have read the db before and must have selected miic or 3off2) More...
 
void setInitialDAG (const DAG &)
 sets an initial DAG structure More...
 
const std::vector< std::string > & names () const
 returns the names of the variables in the database More...
 
const std::vector< Size > & modalities () noexcept
 returns the domain sizes (modalities) of the variables in the database More...
 
NodeId idFromName (const std::string &var_name) const
 returns the node id corresponding to a variable name More...
 
const std::string & nameFromId (NodeId id) const
 returns the variable name corresponding to a given node id More...
 
Score selection
void useScoreAIC ()
 indicate that we wish to use an AIC score More...
 
void useScoreBD ()
 indicate that we wish to use a BD score More...
 
void useScoreBDeu ()
 indicate that we wish to use a BDeu score More...
 
void useScoreBIC ()
 indicate that we wish to use a BIC score More...
 
void useScoreK2 ()
 indicate that we wish to use a K2 score More...
 
void useScoreLog2Likelihood ()
 indicate that we wish to use a Log2Likelihood score More...
 
A priori selection / parameterization
void setAprioriWeight (double weight)
 sets the apriori weight More...
 
void useNoApriori ()
 use no apriori More...
 
void useAprioriSmoothing (double weight=-1)
 use the apriori smoothing More...
 
void useAprioriDirichlet (const std::string &filename)
 use the Dirichlet apriori More...
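Scores and aprioris are not freely combinable (see __checkScoreAprioriCompatibility below). A sketch of two common combinations, assuming a learner already constructed from a database:

```cpp
// BDeu already embeds an implicit apriori, so no explicit one is added
learner.useScoreBDeu();
learner.useNoApriori();

// or: BIC with a smoothing apriori of explicit weight
// learner.useScoreBIC();
// learner.useAprioriSmoothing(0.5);
```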
 
Learning algorithm selection
void useGreedyHillClimbing () noexcept
 indicate that we wish to use a greedy hill climbing algorithm More...
 
void useLocalSearchWithTabuList (Size tabu_size=100, Size nb_decrease=2) noexcept
 indicate that we wish to use a local search with tabu list More...
 
void useK2 (const Sequence< NodeId > &order) noexcept
 indicate that we wish to use K2 More...
 
void useK2 (const std::vector< NodeId > &order) noexcept
 indicate that we wish to use K2 More...
 
void use3off2 () noexcept
 indicate that we wish to use 3off2 More...
 
void useMIIC () noexcept
 indicate that we wish to use MIIC More...
 
3off2/MIIC parameterization and specific results
void useNML ()
 indicate that we wish to use the NML correction for 3off2 More...
 
void useMDL ()
 indicate that we wish to use the MDL correction for 3off2 More...
 
void useNoCorr ()
 indicate that we wish to use the NoCorr correction for 3off2 More...
 
const std::vector< Arc > latentVariables () const
 get the list of arcs hiding latent variables More...
 
Accessors / Modifiers for adding constraints on learning
void setMaxIndegree (Size max_indegree)
 sets the max indegree More...
 
void setSliceOrder (const NodeProperty< NodeId > &slice_order)
 sets a partial order on the nodes More...
 
void setForbiddenArcs (const ArcSet &set)
 assign a set of forbidden arcs More...
 
assign a new forbidden or mandatory arc
void addForbiddenArc (const Arc &arc)
 
void addForbiddenArc (const NodeId tail, const NodeId head)
 
void addForbiddenArc (const std::string &tail, const std::string &head)
 
void addMandatoryArc (const Arc &arc)
 
void addMandatoryArc (const NodeId tail, const NodeId head)
 
void addMandatoryArc (const std::string &tail, const std::string &head)
 
remove a forbidden or mandatory arc
void eraseForbiddenArc (const Arc &arc)
 
void eraseForbiddenArc (const NodeId tail, const NodeId head)
 
void eraseForbiddenArc (const std::string &tail, const std::string &head)
 
void eraseMandatoryArc (const Arc &arc)
 
void eraseMandatoryArc (const NodeId tail, const NodeId head)
 
void eraseMandatoryArc (const std::string &tail, const std::string &head)
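The constraint setters above can be sketched as follows (the variable names smoking, cancer and coffee are placeholders for columns of the database):

```cpp
learner.addMandatoryArc("smoking", "cancer");  // force the arc smoking -> cancer
learner.addForbiddenArc("coffee", "cancer");   // never propose coffee -> cancer

// constraints can be withdrawn later
learner.eraseForbiddenArc("coffee", "cancer");
```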
 
redistribute signals AND implementation of interface
INLINE void setCurrentApproximationScheme (const ApproximationScheme *approximationScheme)
 distribute signals More...
 
INLINE void distributeProgress (const ApproximationScheme *approximationScheme, Size pourcent, double error, double time)
 distribute signals More...
 
INLINE void distributeStop (const ApproximationScheme *approximationScheme, std::string message)
 distribute signals More...
 
void setEpsilon (double eps)
 Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled, it will be enabled. More...
 
double epsilon () const
 Get the value of epsilon. More...
 
void disableEpsilon ()
 Disable stopping criterion on epsilon. More...
 
void enableEpsilon ()
 Enable stopping criterion on epsilon. More...
 
bool isEnabledEpsilon () const
 
void setMinEpsilonRate (double rate)
 Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled, it will be enabled. More...
 
double minEpsilonRate () const
 Get the value of the minimal epsilon rate. More...
 
void disableMinEpsilonRate ()
 Disable stopping criterion on epsilon rate. More...
 
void enableMinEpsilonRate ()
 Enable stopping criterion on epsilon rate. More...
 
bool isEnabledMinEpsilonRate () const
 
void setMaxIter (Size max)
 stopping criterion on the number of iterations. If the criterion was disabled, it will be enabled. More...
 
Size maxIter () const
 
void disableMaxIter ()
 Disable stopping criterion on max iterations. More...
 
void enableMaxIter ()
 Enable stopping criterion on max iterations. More...
 
bool isEnabledMaxIter () const
 
void setMaxTime (double timeout)
 stopping criterion on timeout. If the criterion was disabled, it will be enabled. More...
 
double maxTime () const
 returns the timeout (in seconds) More...
 
double currentTime () const
 get the current running time in seconds (double) More...
 
void disableMaxTime ()
 Disable stopping criterion on timeout. More...
 
void enableMaxTime ()
 Enable stopping criterion on timeout. More...
 
bool isEnabledMaxTime () const
 
void setPeriodSize (Size p)
 number of samples between two checks of the stopping criteria More...
 
Size periodSize () const
 number of samples between two checks of the stopping criteria More...
 
void setVerbosity (bool v)
 verbosity More...
 
bool verbosity () const
 verbosity More...
 
ApproximationSchemeSTATE stateApproximationScheme () const
 returns the current state of the approximation scheme More...
 
Size nbrIterations () const
 
const std::vector< double > & history () const
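A sketch of configuring the stopping criteria of the approximation scheme, using the setters above (the numeric values are illustrative only):

```cpp
learner.setMaxIter(100);      // stop after at most 100 iterations
learner.setMaxTime(10.0);     // or after 10 seconds
learner.setEpsilon(1e-4);     // or when |f(t+1)-f(t)| < 1e-4
learner.disableMinEpsilonRate();
learner.setVerbosity(true);   // record the history of the criterion
```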
 
Getters and setters
std::string messageApproximationScheme () const
 Returns the approximation scheme message. More...
 

Public Types

enum  ScoreType {
  ScoreType::AIC, ScoreType::BD, ScoreType::BDeu, ScoreType::BIC,
  ScoreType::K2, ScoreType::LOG2LIKELIHOOD
}
 an enumeration making it easy to select the score we wish to use More...
 
enum  ParamEstimatorType { ParamEstimatorType::ML }
 an enumeration to select the type of parameter estimation we shall apply More...
 
enum  AprioriType { AprioriType::NO_APRIORI, AprioriType::SMOOTHING, AprioriType::DIRICHLET_FROM_DATABASE }
 an enumeration to select the apriori More...
 
enum  AlgoType { AlgoType::K2, AlgoType::GREEDY_HILL_CLIMBING, AlgoType::LOCAL_SEARCH_WITH_TABU_LIST, AlgoType::MIIC_THREE_OFF_TWO }
 an enumeration to easily select the learning algorithm to use More...
 
enum  ApproximationSchemeSTATE : char {
  ApproximationSchemeSTATE::Undefined, ApproximationSchemeSTATE::Continue, ApproximationSchemeSTATE::Epsilon, ApproximationSchemeSTATE::Rate,
  ApproximationSchemeSTATE::Limit, ApproximationSchemeSTATE::TimeLimit, ApproximationSchemeSTATE::Stopped
}
 The different state of an approximation scheme. More...
 

Protected Attributes

ScoreType __score_type {ScoreType::BDeu}
 the score selected for learning More...
 
Score * __score {nullptr}
 the score used More...
 
ParamEstimatorType __param_estimator_type {ParamEstimatorType::ML}
 the type of the parameter estimator More...
 
ParamEstimator * __param_estimator {nullptr}
 the parameter estimator to use More...
 
CorrectedMutualInformation * __mutual_info {nullptr}
 the selected correction for 3off2 and miic More...
 
AprioriType __apriori_type {AprioriType::NO_APRIORI}
 the a priori selected for the score and parameters More...
 
Apriori * __apriori {nullptr}
 the apriori used More...
 
double __apriori_weight {1.0f}
 the weight of the apriori More...
 
StructuralConstraintSliceOrder __constraint_SliceOrder
 the constraint for 2TBNs More...
 
StructuralConstraintIndegree __constraint_Indegree
 the constraint for indegrees More...
 
StructuralConstraintTabuList __constraint_TabuList
 the constraint for tabu lists More...
 
StructuralConstraintForbiddenArcs __constraint_ForbiddenArcs
 the constraint on forbidden arcs More...
 
StructuralConstraintMandatoryArcs __constraint_MandatoryArcs
 the constraint on mandatory arcs More...
 
AlgoType __selected_algo {AlgoType::GREEDY_HILL_CLIMBING}
 the selected learning algorithm More...
 
K2 __K2
 the K2 algorithm More...
 
Miic __miic_3off2
 the 3off2 algorithm More...
 
GreedyHillClimbing __greedy_hill_climbing
 the greedy hill climbing algorithm More...
 
LocalSearchWithTabuList __local_search_with_tabu_list
 the local search with tabu list algorithm More...
 
Database __score_database
 the database to be used by the scores and parameter estimators More...
 
NodeProperty< Sequence< std::string > > __user_modalities
 indicates the values the user specified for the translators More...
 
bool __modalities_parse_db {false}
 indicates whether we shall parse the database to update __user_modalities More...
 
Database * __apriori_database {nullptr}
 the database used by the Dirichlet a priori More...
 
std::string __apriori_dbname
 the filename for the Dirichlet a priori, if any More...
 
DAG __initial_dag
 an initial DAG given to learners More...
 
const ApproximationScheme * __current_algorithm {nullptr}
 

Protected Member Functions

void __createApriori ()
 create the apriori used for learning More...
 
void __createScore ()
 create the score used for learning More...
 
void __createParamEstimator (bool take_into_account_score=true)
 create the parameter estimator used for learning More...
 
DAG __learnDAG ()
 returns the DAG learnt More...
 
MixedGraph __prepare_miic_3off2 ()
 prepares the initial graph for 3off2 or miic More...
 
bool __checkScoreAprioriCompatibility ()
 checks whether the current score and apriori are compatible More...
 
const std::string & __getAprioriType () const
 returns the type (as a string) of a given apriori More...
 

Static Protected Member Functions

static DatabaseTable<> __readFile (const std::string &filename, const std::vector< std::string > &missing_symbols)
 reads a file and returns a databaseVectInRam More...
 
static void __checkFileName (const std::string &filename)
 checks whether the extension of a CSV filename is correct More...
 

Detailed Description

template<typename GUM_SCALAR>
class gum::learning::BNLearner< GUM_SCALAR >

A pack of learning algorithms that can easily be used.

The pack currently contains K2, GreedyHillClimbing, LocalSearchWithTabuList and MIIC/3off2.

Definition at line 55 of file BNLearner.h.
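The members documented above combine into a typical workflow. A minimal sketch (the CSV filename data.csv is a placeholder, and the header location may differ across aGrUM versions):

```cpp
#include <agrum/learning/BNLearner.h>  // header path may vary between aGrUM versions

int main() {
  // read the database; missing values are encoded as "?" by default
  gum::learning::BNLearner< double > learner("data.csv");

  // select a score, an apriori and a structure-learning algorithm
  learner.useScoreBIC();
  learner.useAprioriSmoothing();   // default weight
  learner.useGreedyHillClimbing();

  // learn the structure and the parameters in one call
  gum::BayesNet< double > bn = learner.learnBN();
}
```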

Member Enumeration Documentation

an enumeration to easily select the learning algorithm to use

Enumerator
K2 
GREEDY_HILL_CLIMBING 
LOCAL_SEARCH_WITH_TABU_LIST 
MIIC_THREE_OFF_TWO 

Definition at line 112 of file genericBNLearner.h.

112  {
113  K2,
114  GREEDY_HILL_CLIMBING,
115  LOCAL_SEARCH_WITH_TABU_LIST,
116  MIIC_THREE_OFF_TWO
117  };

The different state of an approximation scheme.

Enumerator
Undefined 
Continue 
Epsilon 
Rate 
Limit 
TimeLimit 
Stopped 

Definition at line 64 of file IApproximationSchemeConfiguration.h.

64  : char {
65  Undefined,
66  Continue,
67  Epsilon,
68  Rate,
69  Limit,
70  TimeLimit,
71  Stopped
72  };

an enumeration to select the apriori

Enumerator
NO_APRIORI 
SMOOTHING 
DIRICHLET_FROM_DATABASE 

Definition at line 109 of file genericBNLearner.h.

109 { NO_APRIORI, SMOOTHING, DIRICHLET_FROM_DATABASE };

an enumeration to select the type of parameter estimation we shall apply

Enumerator
ML 

Definition at line 106 of file genericBNLearner.h.

106 { ML };

an enumeration making it easy to select the score we wish to use

Enumerator
AIC 
BD 
BDeu 
BIC 
K2 
LOG2LIKELIHOOD 

Definition at line 102 of file genericBNLearner.h.

102 { AIC, BD, BDeu, BIC, K2, LOG2LIKELIHOOD };

Constructor & Destructor Documentation

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols = {"?"} 
)

default constructor

read the database file for the score / parameter estimation and var names

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const DatabaseTable<> &  db)

default constructor

read the database file for the score / parameter estimation and var names

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const std::string &  filename,
const gum::BayesNet< GUM_SCALAR > &  src,
const std::vector< std::string > &  missing_symbols = {"?"} 
)

Read the database file for the score / parameter estimation and var names.

If modalities = { 1 -> {True, False, Big} }, then the node of id 1 in the BN will have 3 modalities, the first one being True, the second being False, and the third being Big.

Parsing the database determines which modalities are really necessary and keeps them in the order specified by the user (NodeProperty modalities). If parse_database is set to false (the default), the modalities specified by the user are taken to be exactly those of the variables of the BN (consequently, if other values are found in the database, an exception is raised during learning).

Parameters
filename  the file to learn from.
modalities  indicates, for some nodes (not necessarily all the nodes of the BN), which modalities they should have and in which order these modalities should be stored in the nodes.
parse_database  if true, the modalities specified by the user are considered a superset of the modalities of the variables.

This constructor is a wrapper for BNLearner(filename, modalities, parse_database), using a BN to find those modalities and node ids.
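When the structure is already known, only the parameters need to be estimated. A sketch, assuming an existing gum::BayesNet< double > src and a database file bn.csv (both placeholders):

```cpp
// keep the variables and modalities of src, re-estimate its CPTs from bn.csv
gum::learning::BNLearner< double > learner("bn.csv", src);
gum::BayesNet< double > bn = learner.learnParameters(src.dag());
```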
template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( const BNLearner< GUM_SCALAR > &  )

copy constructor

template<typename GUM_SCALAR >
gum::learning::BNLearner< GUM_SCALAR >::BNLearner ( BNLearner< GUM_SCALAR > &&  )

move constructor

template<typename GUM_SCALAR >
virtual gum::learning::BNLearner< GUM_SCALAR >::~BNLearner ( )
virtual

destructor

Member Function Documentation

void gum::learning::genericBNLearner::__checkFileName ( const std::string &  filename)
static, protected, inherited

checks whether the extension of a CSV filename is correct

Definition at line 431 of file genericBNLearner.cpp.

References GUM_ERROR.

Referenced by gum::learning::genericBNLearner::Database::Database().

431  {
432  // get the extension of the file
433  Size filename_size = Size(filename.size());
434 
435  if (filename_size < 4) {
436  GUM_ERROR(FormatNotFound,
437  "genericBNLearner could not determine the "
438  "file type of the database");
439  }
440 
441  std::string extension = filename.substr(filename.size() - 4);
442  std::transform(
443  extension.begin(), extension.end(), extension.begin(), ::tolower);
444 
445  if (extension != ".csv") {
446  GUM_ERROR(
447  OperationNotAllowed,
448  "genericBNLearner does not support yet this type of database file");
449  }
450  }

bool gum::learning::genericBNLearner::__checkScoreAprioriCompatibility ( )
protected, inherited

checks whether the current score and apriori are compatible

Returns
true if the apriori is compatible with the score.
Exceptions
IncompatibleScoreApriori  is raised if the apriori is known to be incompatible with the score. Such a case usually arises because the score already implicitly contains an apriori that should not be combined with the apriori passed in argument. aGrUM will nevertheless allow you to use this apriori with the score, but you should be warned that the result of learning will most probably be meaningless.
PossiblyIncompatibleScoreApriori  is raised if, in general, the apriori is incompatible with the score but, with its current weight, it becomes compatible (e.g., a Dirichlet apriori with a 0 weight is the same as a NoApriori). In such a case, you should not modify the weight. aGrUM will allow you to do so, but the result of learning will most probably be meaningless.
InvalidArgument  is raised if the apriori is not yet handled by the method isAprioriCompatible (the method needs to be updated to take it into account).

Definition at line 835 of file genericBNLearner.cpp.

References gum::learning::genericBNLearner::__apriori_weight, gum::learning::genericBNLearner::__getAprioriType(), gum::learning::genericBNLearner::__score_type, gum::learning::genericBNLearner::AIC, gum::learning::genericBNLearner::BD, gum::learning::genericBNLearner::BDeu, gum::learning::genericBNLearner::BIC, GUM_ERROR, gum::learning::ScoreBIC< IdSetAlloc, CountAlloc >::isAprioriCompatible(), gum::learning::ScoreLog2Likelihood< IdSetAlloc, CountAlloc >::isAprioriCompatible(), gum::learning::ScoreAIC< IdSetAlloc, CountAlloc >::isAprioriCompatible(), gum::learning::ScoreBD< IdSetAlloc, CountAlloc >::isAprioriCompatible(), gum::learning::ScoreK2< IdSetAlloc, CountAlloc >::isAprioriCompatible(), gum::learning::ScoreBDeu< IdSetAlloc, CountAlloc >::isAprioriCompatible(), gum::learning::genericBNLearner::K2, and gum::learning::genericBNLearner::LOG2LIKELIHOOD.

Referenced by gum::learning::genericBNLearner::setAprioriWeight(), gum::learning::genericBNLearner::useAprioriDirichlet(), gum::learning::genericBNLearner::useAprioriSmoothing(), gum::learning::genericBNLearner::useNoApriori(), gum::learning::genericBNLearner::useScoreAIC(), gum::learning::genericBNLearner::useScoreBD(), gum::learning::genericBNLearner::useScoreBDeu(), gum::learning::genericBNLearner::useScoreBIC(), gum::learning::genericBNLearner::useScoreK2(), and gum::learning::genericBNLearner::useScoreLog2Likelihood().

835  {
836  const std::string& apriori = __getAprioriType();
837 
838  switch (__score_type) {
839  case ScoreType::AIC:
840  return ScoreAIC<>::isAprioriCompatible(apriori, __apriori_weight);
841 
842  case ScoreType::BD:
843  return ScoreBD<>::isAprioriCompatible(apriori, __apriori_weight);
844 
845  case ScoreType::BDeu:
846  return ScoreBDeu<>::isAprioriCompatible(apriori, __apriori_weight);
847 
848  case ScoreType::BIC:
849  return ScoreBIC<>::isAprioriCompatible(apriori, __apriori_weight);
850 
851  case ScoreType::K2:
852  return ScoreK2<>::isAprioriCompatible(apriori, __apriori_weight);
853 
854  case ScoreType::LOG2LIKELIHOOD:
855  return ScoreLog2Likelihood<>::isAprioriCompatible(apriori,
856  __apriori_weight);
857 
858  default:
859  GUM_ERROR(OperationNotAllowed,
860  "genericBNLearner does not support yet this score");
861  }
862  }

void gum::learning::genericBNLearner::__createApriori ( )
protected, inherited

create the apriori used for learning

Definition at line 503 of file genericBNLearner.cpp.

References gum::learning::genericBNLearner::__apriori, gum::learning::genericBNLearner::__apriori_database, gum::learning::genericBNLearner::__apriori_dbname, gum::learning::genericBNLearner::__apriori_type, gum::learning::genericBNLearner::__apriori_weight, gum::learning::genericBNLearner::__score_database, gum::learning::genericBNLearner::__user_modalities, gum::learning::genericBNLearner::DIRICHLET_FROM_DATABASE, GUM_ERROR, gum::learning::genericBNLearner::Database::missingSymbols(), gum::learning::genericBNLearner::Database::modalities(), gum::learning::genericBNLearner::NO_APRIORI, gum::learning::genericBNLearner::Database::parser(), gum::learning::Apriori< IdSetAlloc, CountAlloc >::setWeight(), and gum::learning::genericBNLearner::SMOOTHING.

Referenced by gum::learning::genericBNLearner::learnDAG().

503  {
504  // first, save the old apriori, to be delete if everything is ok
505  Apriori<>* old_apriori = __apriori;
506 
507  // create the new apriori
508  switch (__apriori_type) {
509  case AprioriType::NO_APRIORI: __apriori = new AprioriNoApriori<>; break;
510 
511  case AprioriType::SMOOTHING: __apriori = new AprioriSmoothing<>; break;
512 
513  case AprioriType::DIRICHLET_FROM_DATABASE:
514  if (__apriori_database != nullptr) {
515  delete __apriori_database;
516  __apriori_database = nullptr;
517  }
518 
519  if (__user_modalities.empty()) {
520  __apriori_database = new Database(__apriori_dbname,
521  __score_database,
522  __score_database.missingSymbols());
523  } else {
524  GUM_ERROR(OperationNotAllowed, "not implemented");
525  //__apriori_database =
526  // new Database(__apriori_dbname, __score_database,
527  // __user_modalities);
528  }
529 
530  __apriori = new AprioriDirichletFromDatabase<>(
531  __apriori_database->parser(), __apriori_database->modalities());
532  break;
533 
534  default:
535  GUM_ERROR(OperationNotAllowed,
536  "genericBNLearner does not support yet this apriori");
537  }
538 
539  // do not forget to assign a weight to the apriori
540  __apriori->setWeight(__apriori_weight);
541 
542  // remove the old apriori, if any
543  if (old_apriori != nullptr) delete old_apriori;
544  }

void gum::learning::genericBNLearner::__createParamEstimator ( bool  take_into_account_score = true)
protected, inherited

create the parameter estimator used for learning

Definition at line 591 of file genericBNLearner.cpp.

References gum::learning::genericBNLearner::__apriori, gum::learning::genericBNLearner::__param_estimator, gum::learning::genericBNLearner::__param_estimator_type, gum::learning::genericBNLearner::__score, gum::learning::genericBNLearner::__score_database, GUM_ERROR, gum::learning::Score< IdSetAlloc, CountAlloc >::internalApriori(), gum::learning::genericBNLearner::ML, gum::learning::genericBNLearner::Database::modalities(), and gum::learning::genericBNLearner::Database::parser().

591  {
592  // first, save the old estimator, to be delete if everything is ok
593  ParamEstimator<>* old_estimator = __param_estimator;
594 
595  // create the new estimator
596  switch (__param_estimator_type) {
597  case ParamEstimatorType::ML:
598  if (take_into_account_score && (__score != nullptr)) {
599  __param_estimator =
600  new ParamEstimatorML<>(__score_database.parser(),
601  __score_database.modalities(),
602  *__apriori,
603  __score->internalApriori());
604  } else {
605  __param_estimator =
606  new ParamEstimatorML<>(__score_database.parser(),
607  __score_database.modalities(),
608  *__apriori);
609  }
610 
611  break;
612 
613  default:
614  GUM_ERROR(OperationNotAllowed,
615  "genericBNLearner does not support "
616  "yet this parameter estimator");
617  }
618 
619  // remove the old estimator, if any
620  if (old_estimator != nullptr) delete old_estimator;
621  }

void gum::learning::genericBNLearner::__createScore ( )
protected, inherited

create the score used for learning

Definition at line 546 of file genericBNLearner.cpp.

References gum::learning::genericBNLearner::__apriori, gum::learning::genericBNLearner::__score, gum::learning::genericBNLearner::__score_database, gum::learning::genericBNLearner::__score_type, gum::learning::genericBNLearner::AIC, gum::learning::genericBNLearner::BD, gum::learning::genericBNLearner::BDeu, gum::learning::genericBNLearner::BIC, GUM_ERROR, gum::learning::genericBNLearner::K2, gum::learning::genericBNLearner::LOG2LIKELIHOOD, gum::learning::genericBNLearner::Database::modalities(), and gum::learning::genericBNLearner::Database::parser().

Referenced by gum::learning::genericBNLearner::learnDAG().

546  {
547  // first, save the old score, to be delete if everything is ok
548  Score<>* old_score = __score;
549 
550  // create the new scoring function
551  switch (__score_type) {
552  case ScoreType::AIC:
553  __score = new ScoreAIC<>(
554  __score_database.parser(), __score_database.modalities(), *__apriori);
555  break;
556 
557  case ScoreType::BD:
558  __score = new ScoreBD<>(
559  __score_database.parser(), __score_database.modalities(), *__apriori);
560  break;
561 
562  case ScoreType::BDeu:
563  __score = new ScoreBDeu<>(
564  __score_database.parser(), __score_database.modalities(), *__apriori);
565  break;
566 
567  case ScoreType::BIC:
568  __score = new ScoreBIC<>(
569  __score_database.parser(), __score_database.modalities(), *__apriori);
570  break;
571 
572  case ScoreType::K2:
573  __score = new ScoreK2<>(
574  __score_database.parser(), __score_database.modalities(), *__apriori);
575  break;
576 
577  case ScoreType::LOG2LIKELIHOOD:
578  __score = new ScoreLog2Likelihood<>(
579  __score_database.parser(), __score_database.modalities(), *__apriori);
580  break;
581 
582  default:
583  GUM_ERROR(OperationNotAllowed,
584  "genericBNLearner does not support yet this score");
585  }
586 
587  // remove the old score, if any
588  if (old_score != nullptr) delete old_score;
589  }

INLINE const std::string & gum::learning::genericBNLearner::__getAprioriType ( ) const
protected, inherited

returns the type (as a string) of a given apriori

Definition at line 334 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__apriori_type, gum::learning::genericBNLearner::DIRICHLET_FROM_DATABASE, GUM_ERROR, gum::learning::genericBNLearner::NO_APRIORI, and gum::learning::genericBNLearner::SMOOTHING.

Referenced by gum::learning::genericBNLearner::__checkScoreAprioriCompatibility().

334  {
335  switch (__apriori_type) {
336  case AprioriType::NO_APRIORI: return AprioriNoAprioriType::type;
337 
338  case AprioriType::SMOOTHING: return AprioriSmoothingType::type;
339 
340  case AprioriType::DIRICHLET_FROM_DATABASE:
341  return AprioriDirichletType::type;
342 
343  default:
344  GUM_ERROR(OperationNotAllowed,
345  "genericBNLearner getAprioriType does "
346  "not support yet this apriori");
347  }
348  }

template<typename GUM_SCALAR >
NodeProperty< Sequence< std::string > > gum::learning::BNLearner< GUM_SCALAR >::__labelsFromBN ( const std::string &  filename,
const BayesNet< GUM_SCALAR > &  src 
)
private

read the first line of a file to find column names

DAG gum::learning::genericBNLearner::__learnDAG ( )
protected, inherited

returns the DAG learnt

Definition at line 670 of file genericBNLearner.cpp.

References gum::learning::genericBNLearner::__constraint_ForbiddenArcs, gum::learning::genericBNLearner::__constraint_Indegree, gum::learning::genericBNLearner::__constraint_MandatoryArcs, gum::learning::genericBNLearner::__constraint_SliceOrder, gum::learning::genericBNLearner::__constraint_TabuList, gum::learning::genericBNLearner::__greedy_hill_climbing, gum::learning::genericBNLearner::__initial_dag, gum::learning::genericBNLearner::__K2, gum::learning::genericBNLearner::__local_search_with_tabu_list, gum::learning::genericBNLearner::__miic_3off2, gum::learning::genericBNLearner::__mutual_info, gum::learning::genericBNLearner::__prepare_miic_3off2(), gum::learning::genericBNLearner::__score, gum::learning::genericBNLearner::__score_database, gum::learning::genericBNLearner::__selected_algo, gum::DAG::addArc(), gum::NodeGraphPart::addNodeWithId(), gum::learning::K2::approximationScheme(), gum::learning::StructuralConstraintForbiddenArcs::arcs(), gum::learning::StructuralConstraintMandatoryArcs::arcs(), gum::ArcGraphPart::eraseArc(), gum::NodeGraphPart::exists(), gum::learning::genericBNLearner::GREEDY_HILL_CLIMBING, GUM_ERROR, gum::learning::genericBNLearner::K2, gum::learning::K2::learnStructure(), gum::learning::GreedyHillClimbing::learnStructure(), gum::learning::LocalSearchWithTabuList::learnStructure(), gum::learning::Miic::learnStructure(), gum::learning::genericBNLearner::LOCAL_SEARCH_WITH_TABU_LIST, gum::learning::genericBNLearner::MIIC_THREE_OFF_TWO, gum::learning::genericBNLearner::Database::modalities(), gum::learning::K2::order(), and gum::SequenceImplementation< Key, Alloc, Gen >::pos().

Referenced by gum::learning::genericBNLearner::learnDAG().

670  {
671    // add the mandatory arcs to the initial dag and remove the forbidden ones
672    // from the initial graph
673    DAG init_graph = __initial_dag;
674 
675    const ArcSet& mandatory_arcs = __constraint_MandatoryArcs.arcs();
676 
677    for (const auto& arc : mandatory_arcs) {
678      if (!init_graph.exists(arc.tail())) init_graph.addNodeWithId(arc.tail());
679 
680      if (!init_graph.exists(arc.head())) init_graph.addNodeWithId(arc.head());
681 
682      init_graph.addArc(arc.tail(), arc.head());
683    }
684 
685    const ArcSet& forbidden_arcs = __constraint_ForbiddenArcs.arcs();
686 
687    for (const auto& arc : forbidden_arcs) {
688      init_graph.eraseArc(arc);
689    }
690 
691    switch (__selected_algo) {
692      // ========================================================================
693      case AlgoType::MIIC_THREE_OFF_TWO: {
694        BNLearnerListener listener(this, __miic_3off2);
695        // create the mixedGraph
696        MixedGraph mgraph = this->__prepare_miic_3off2();
697 
698        return __miic_3off2.learnStructure(*__mutual_info, mgraph);
699      }
700      // ========================================================================
701      case AlgoType::GREEDY_HILL_CLIMBING: {
702        BNLearnerListener listener(this, __greedy_hill_climbing);
703        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
704                                       StructuralConstraintForbiddenArcs,
705                                       StructuralConstraintSliceOrder >
706          gen_constraint;
707        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
708          __constraint_MandatoryArcs;
709        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
710          __constraint_ForbiddenArcs;
711        static_cast< StructuralConstraintSliceOrder& >(gen_constraint) =
712          __constraint_SliceOrder;
713 
714        GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(
715          gen_constraint);
716 
717        StructuralConstraintSetStatic< StructuralConstraintIndegree,
718                                       StructuralConstraintDAG >
719          sel_constraint;
720        static_cast< StructuralConstraintIndegree& >(sel_constraint) =
721          __constraint_Indegree;
722 
723        GraphChangesSelector4DiGraph< Score<>,
724                                      decltype(sel_constraint),
725                                      decltype(op_set) >
726          selector(*__score, sel_constraint, op_set);
727 
728        return __greedy_hill_climbing.learnStructure(
729          selector, __score_database.modalities(), init_graph);
730      }
731 
732      // ========================================================================
733      case AlgoType::LOCAL_SEARCH_WITH_TABU_LIST: {
734        BNLearnerListener listener(this, __local_search_with_tabu_list);
735        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
736                                       StructuralConstraintForbiddenArcs,
737                                       StructuralConstraintSliceOrder >
738          gen_constraint;
739        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
740          __constraint_MandatoryArcs;
741        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
742          __constraint_ForbiddenArcs;
743        static_cast< StructuralConstraintSliceOrder& >(gen_constraint) =
744          __constraint_SliceOrder;
745 
746        GraphChangesGenerator4DiGraph< decltype(gen_constraint) > op_set(
747          gen_constraint);
748 
749        StructuralConstraintSetStatic< StructuralConstraintTabuList,
750                                       StructuralConstraintIndegree,
751                                       StructuralConstraintDAG >
752          sel_constraint;
753        static_cast< StructuralConstraintTabuList& >(sel_constraint) =
754          __constraint_TabuList;
755        static_cast< StructuralConstraintIndegree& >(sel_constraint) =
756          __constraint_Indegree;
757 
758        GraphChangesSelector4DiGraph< Score<>,
759                                      decltype(sel_constraint),
760                                      decltype(op_set) >
761          selector(*__score, sel_constraint, op_set);
762 
763        return __local_search_with_tabu_list.learnStructure(
764          selector, __score_database.modalities(), init_graph);
765      }
766 
767      // ========================================================================
768      case AlgoType::K2: {
769        BNLearnerListener listener(this, __K2.approximationScheme());
770        StructuralConstraintSetStatic< StructuralConstraintMandatoryArcs,
771                                       StructuralConstraintForbiddenArcs >
772          gen_constraint;
773        static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint) =
774          __constraint_MandatoryArcs;
775        static_cast< StructuralConstraintForbiddenArcs& >(gen_constraint) =
776          __constraint_ForbiddenArcs;
777 
778        GraphChangesGenerator4K2< decltype(gen_constraint) > op_set(
779          gen_constraint);
780 
781        // if some mandatory arcs are incompatible with the order, use a DAG
782        // constraint instead of a DiGraph constraint to avoid cycles
783        const ArcSet& mandatory_arcs =
784          static_cast< StructuralConstraintMandatoryArcs& >(gen_constraint)
785            .arcs();
786        const Sequence< NodeId >& order = __K2.order();
787        bool order_compatible = true;
788 
789        for (const auto& arc : mandatory_arcs) {
790          if (order.pos(arc.tail()) >= order.pos(arc.head())) {
791            order_compatible = false;
792            break;
793          }
794        }
795 
796        if (order_compatible) {
797          StructuralConstraintSetStatic< StructuralConstraintIndegree,
798                                         StructuralConstraintDiGraph >
799            sel_constraint;
800          static_cast< StructuralConstraintIndegree& >(sel_constraint) =
801            __constraint_Indegree;
802 
803          GraphChangesSelector4DiGraph< Score<>,
804                                        decltype(sel_constraint),
805                                        decltype(op_set) >
806            selector(*__score, sel_constraint, op_set);
807 
808          return __K2.learnStructure(
809            selector, __score_database.modalities(), init_graph);
810        } else {
811          StructuralConstraintSetStatic< StructuralConstraintIndegree,
812                                         StructuralConstraintDAG >
813            sel_constraint;
814          static_cast< StructuralConstraintIndegree& >(sel_constraint) =
815            __constraint_Indegree;
816 
817          GraphChangesSelector4DiGraph< Score<>,
818                                        decltype(sel_constraint),
819                                        decltype(op_set) >
820            selector(*__score, sel_constraint, op_set);
821 
822          return __K2.learnStructure(
823            selector, __score_database.modalities(), init_graph);
824        }
825      }
826 
827      // ========================================================================
828      default:
829        GUM_ERROR(OperationNotAllowed,
830                  "the learnDAG method has not been implemented for this "
831                  "learning algorithm");
832    }
833  }
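The K2 branch above decides between a DiGraph and a DAG selection constraint by checking whether every mandatory arc goes from an earlier node to a later node in the K2 order. That test can be sketched standalone (simplified stand-in types, not aGrUM's `Sequence` and `ArcSet`):

```cpp
#include <cassert>
#include <cstddef>
#include <unordered_map>
#include <utility>
#include <vector>

using NodeId = std::size_t;
using Arc    = std::pair< NodeId, NodeId >;  // (tail, head)

// An order is compatible with the mandatory arcs iff each arc's tail
// precedes its head in the order (same test as
// `order.pos(arc.tail()) >= order.pos(arc.head())` in the snippet above).
bool orderCompatible(const std::vector< NodeId >& order,
                     const std::vector< Arc >&    mandatory_arcs) {
  std::unordered_map< NodeId, std::size_t > pos;  // node -> position in order
  for (std::size_t i = 0; i < order.size(); ++i) pos[order[i]] = i;

  for (const auto& arc : mandatory_arcs) {
    if (pos.at(arc.first) >= pos.at(arc.second)) return false;
  }
  return true;
}
```

When the check fails, using a DAG constraint instead of a plain DiGraph constraint is what prevents the learned graph from acquiring cycles.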

MixedGraph gum::learning::genericBNLearner::__prepare_miic_3off2 ( )
protected inherited

prepares the initial graph for 3off2 or miic

Definition at line 624 of file genericBNLearner.cpp.

References gum::learning::genericBNLearner::__constraint_ForbiddenArcs, gum::learning::genericBNLearner::__constraint_MandatoryArcs, gum::learning::genericBNLearner::__miic_3off2, gum::learning::genericBNLearner::__mutual_info, gum::learning::genericBNLearner::__score_database, gum::learning::Miic::addConstraints(), gum::UndiGraph::addEdge(), gum::NodeGraphPart::addNodeWithId(), gum::learning::StructuralConstraintMandatoryArcs::arcs(), gum::learning::StructuralConstraintForbiddenArcs::arcs(), gum::HashTable< Key, Val, Alloc >::insert(), gum::learning::genericBNLearner::Database::modalities(), and gum::learning::genericBNLearner::useNML().

Referenced by gum::learning::genericBNLearner::__learnDAG(), and gum::learning::genericBNLearner::learnMixedStructure().

624  {
625  // Initialize the mixed graph to the fully connected graph
626  MixedGraph mgraph;
627  for (Size i = 0; i < __score_database.modalities().size(); ++i) {
628  mgraph.addNodeWithId(i);
629  for (Size j = 0; j < i; ++j) {
630  mgraph.addEdge(j, i);
631  }
632  }
633 
634  // translating the constraints for 3off2 or miic
635  HashTable< std::pair< Idx, Idx >, char > initial_marks;
636  const ArcSet& mandatory_arcs = __constraint_MandatoryArcs.arcs();
637  for (const auto& arc : mandatory_arcs) {
638  initial_marks.insert({arc.tail(), arc.head()}, '>');
639  }
640 
641  const ArcSet& forbidden_arcs = __constraint_ForbiddenArcs.arcs();
642  for (const auto& arc : forbidden_arcs) {
643  initial_marks.insert({arc.tail(), arc.head()}, '-');
644  }
645  __miic_3off2.addConstraints(initial_marks);
646  // create the mutual entropy object
647  if (__mutual_info == nullptr) { this->useNML(); }
648 
649  return mgraph;
650  }

DatabaseTable gum::learning::genericBNLearner::__readFile ( const std::string &  filename,
const std::vector< std::string > &  missing_symbols 
)
static protected inherited

reads a file and returns a DatabaseTable

Definition at line 453 of file genericBNLearner.cpp.

References gum::learning::IDBInitializer< ALLOC >::fillDatabase(), GUM_ERROR, gum::learning::IDatabaseTable< DBTranslatedValue, ALLOC >::hasMissingValues(), gum::learning::DBTranslatorSet< ALLOC >::insertTranslator(), gum::learning::DatabaseTable< ALLOC >::reorder(), gum::learning::DatabaseTable< ALLOC >::setVariableNames(), and gum::learning::IDBInitializer< ALLOC >::variableNames().

455  {
456  // get the extension of the file
457  Size filename_size = Size(filename.size());
458 
459  if (filename_size < 4) {
460  GUM_ERROR(FormatNotFound,
461  "genericBNLearner could not determine the "
462  "file type of the database");
463  }
464 
465  std::string extension = filename.substr(filename.size() - 4);
466  std::transform(
467  extension.begin(), extension.end(), extension.begin(), ::tolower);
468 
469  if (extension != ".csv") {
470  GUM_ERROR(
471  OperationNotAllowed,
472  "genericBNLearner does not support yet this type of database file");
473  }
474 
475 
476  DBInitializerFromCSV<> initializer(filename);
477 
478  const auto& var_names = initializer.variableNames();
479  const std::size_t nb_vars = var_names.size();
480 
481  DBTranslatorSet<> translator_set;
482  DBTranslator4LabelizedVariable<> translator(missing_symbols);
483  for (std::size_t i = 0; i < nb_vars; ++i) {
484  translator_set.insertTranslator(translator, i);
485  }
486 
487  DatabaseTable<> database(missing_symbols, translator_set);
488  database.setVariableNames(initializer.variableNames());
489  initializer.fillDatabase(database);
490 
491  // check that the database does not contain any missing value
492  if (database.hasMissingValues())
493  GUM_ERROR(MissingValueInDatabase,
494  "For the moment, the BNLearner is unable to cope "
495  "with missing values in databases");
496 
497  database.reorder();
498 
499  return database;
500  }
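The file-type check above accepts only CSV files, case-insensitively, and rejects any name shorter than four characters before even looking at the extension. Sketched standalone:

```cpp
#include <algorithm>
#include <cassert>
#include <cctype>
#include <string>

// Mirrors the extension test in __readFile: take the last four characters,
// lowercase them, and compare against ".csv".
bool looksLikeCsv(const std::string& filename) {
  if (filename.size() < 4) return false;
  std::string ext = filename.substr(filename.size() - 4);
  std::transform(ext.begin(), ext.end(), ext.begin(),
                 [](unsigned char c) { return std::tolower(c); });
  return ext == ".csv";
}
```

Note the `unsigned char` cast before `std::tolower`: passing a plain (possibly negative) `char` is undefined behavior, which the original `::tolower` call glosses over.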

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const Arc arc)
inherited

Definition at line 222 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::addArc().

Referenced by gum::learning::genericBNLearner::addForbiddenArc().

222  {
223    __constraint_ForbiddenArcs.addArc(arc);
224  }

INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 232 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::addForbiddenArc().

233  {
234  addForbiddenArc(Arc(tail, head));
235  }


INLINE void gum::learning::genericBNLearner::addForbiddenArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 244 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::addForbiddenArc(), and gum::learning::genericBNLearner::Database::idFromName().

245  {
246  addForbiddenArc(Arc(idFromName(tail), idFromName(head)));
247  }

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const Arc arc)
inherited

Definition at line 261 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::addArc().

Referenced by gum::learning::genericBNLearner::addMandatoryArc().

261  {
262    __constraint_MandatoryArcs.addArc(arc);
263  }

INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 283 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::addMandatoryArc().

284  {
285  addMandatoryArc(Arc(tail, head));
286  }


INLINE void gum::learning::genericBNLearner::addMandatoryArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 271 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::addMandatoryArc(), and gum::learning::genericBNLearner::Database::idFromName().

272  {
273  addMandatoryArc(Arc(idFromName(tail), idFromName(head)));
274  }
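The string overloads above resolve variable names to node ids through the database (via `idFromName`) before storing the arc in the constraint set. A simplified stand-in illustrating the pattern (hypothetical struct, not aGrUM's API):

```cpp
#include <cassert>
#include <set>
#include <stdexcept>
#include <string>
#include <unordered_map>
#include <utility>

// Hypothetical miniature of the name-based constraint interface: a map from
// variable names to node ids plays the role of the score database, and a set
// of (tail, head) pairs plays the role of StructuralConstraintMandatoryArcs.
struct ArcConstraints {
  std::unordered_map< std::string, int > ids;   // variable name -> node id
  std::set< std::pair< int, int > >      arcs;  // mandatory (tail, head) arcs

  int idFromName(const std::string& name) const {
    auto it = ids.find(name);
    if (it == ids.end()) throw std::out_of_range("unknown variable " + name);
    return it->second;
  }

  // Same shape as addMandatoryArc(tail, head) above: resolve, then insert.
  void addMandatoryArc(const std::string& tail, const std::string& head) {
    arcs.insert({idFromName(tail), idFromName(head)});
  }
};
```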

double gum::learning::genericBNLearner::currentTime ( ) const
inline virtual inherited

get the current running time in seconds (double)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 815 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, gum::ApproximationScheme::currentTime(), and GUM_ERROR.

815  {
816    if (__current_algorithm != nullptr)
817      return __current_algorithm->currentTime();
818    else
819      GUM_ERROR(FatalError, "No chosen algorithm for learning");
820  };

void gum::learning::genericBNLearner::disableEpsilon ( )
inline virtual inherited

Disable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 689 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableEpsilon().

689  {
690    __K2.approximationScheme().disableEpsilon();
691    __greedy_hill_climbing.disableEpsilon();
692    __local_search_with_tabu_list.disableEpsilon();
693  };

void gum::learning::genericBNLearner::disableMaxIter ( )
inline virtual inherited

Disable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 773 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMaxIter().

773  {
774    __K2.approximationScheme().disableMaxIter();
775    __greedy_hill_climbing.disableMaxIter();
776    __local_search_with_tabu_list.disableMaxIter();
777  };

void gum::learning::genericBNLearner::disableMaxTime ( )
inline virtual inherited

Disable stopping criterion on timeout.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 823 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMaxTime().

823  {
824    __K2.approximationScheme().disableMaxTime();
825    __greedy_hill_climbing.disableMaxTime();
826    __local_search_with_tabu_list.disableMaxTime();
827  };

void gum::learning::genericBNLearner::disableMinEpsilonRate ( )
inline virtual inherited

Disable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 732 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::disableMinEpsilonRate().

732  {
733    __K2.approximationScheme().disableMinEpsilonRate();
734    __greedy_hill_climbing.disableMinEpsilonRate();
735    __local_search_with_tabu_list.disableMinEpsilonRate();
736  };

INLINE void gum::learning::genericBNLearner::distributeProgress ( const ApproximationScheme approximationScheme,
Size  pourcent,
double  error,
double  time 
)
inline inherited

distribute signals

Definition at line 652 of file genericBNLearner.h.

References GUM_EMIT3, gum::IApproximationSchemeConfiguration::onProgress, and gum::learning::genericBNLearner::setCurrentApproximationScheme().

Referenced by gum::learning::BNLearnerListener::whenProgress().

655  {
656  setCurrentApproximationScheme(approximationScheme);
657 
658  if (onProgress.hasListener()) GUM_EMIT3(onProgress, pourcent, error, time);
659  };
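distributeProgress forwards (percent, error, time) to whoever is attached to onProgress, and only emits when at least one listener is registered. A minimal stand-in for that signal pattern (plain `std::function`, not aGrUM's `Signaler3`):

```cpp
#include <cassert>
#include <cstddef>
#include <functional>
#include <vector>

// Hypothetical miniature of a three-argument signal: listeners register a
// callback, and emit() forwards (percent, error, time) to each of them,
// mirroring the hasListener()/GUM_EMIT3 guard in distributeProgress above.
struct ProgressSignal {
  using Listener = std::function< void(std::size_t, double, double) >;
  std::vector< Listener > listeners;

  bool hasListener() const { return !listeners.empty(); }

  void emit(std::size_t percent, double error, double time) {
    for (auto& l : listeners) l(percent, error, time);
  }
};
```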

INLINE void gum::learning::genericBNLearner::distributeStop ( const ApproximationScheme approximationScheme,
std::string  message 
)
inline inherited

distribute signals

Definition at line 662 of file genericBNLearner.h.

References GUM_EMIT1, gum::IApproximationSchemeConfiguration::onStop, and gum::learning::genericBNLearner::setCurrentApproximationScheme().

Referenced by gum::learning::BNLearnerListener::whenStop().

663  {
664  setCurrentApproximationScheme(approximationScheme);
665 
666  if (onStop.hasListener()) GUM_EMIT1(onStop, message);
667  };

void gum::learning::genericBNLearner::enableEpsilon ( )
inline virtual inherited

Enable stopping criterion on epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 696 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableEpsilon().

696  {
697    __K2.approximationScheme().enableEpsilon();
698    __greedy_hill_climbing.enableEpsilon();
699    __local_search_with_tabu_list.enableEpsilon();
700  };

void gum::learning::genericBNLearner::enableMaxIter ( )
inline virtual inherited

Enable stopping criterion on max iterations.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 779 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMaxIter().

779  {
780    __K2.approximationScheme().enableMaxIter();
781    __greedy_hill_climbing.enableMaxIter();
782    __local_search_with_tabu_list.enableMaxIter();
783  };

void gum::learning::genericBNLearner::enableMaxTime ( )
inline virtual inherited

Enable stopping criterion on timeout. If the criterion was disabled, it will be enabled.

Exceptions
OutOfLowerBound if timeout <= 0.0; timeout is a time in seconds (double).

Implements gum::IApproximationSchemeConfiguration.

Definition at line 828 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMaxTime().

828  {
829    __K2.approximationScheme().enableMaxTime();
830    __greedy_hill_climbing.enableMaxTime();
831    __local_search_with_tabu_list.enableMaxTime();
832  };

void gum::learning::genericBNLearner::enableMinEpsilonRate ( )
inline virtual inherited

Enable stopping criterion on epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 738 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::enableMinEpsilonRate().

738  {
739    __K2.approximationScheme().enableMinEpsilonRate();
740    __greedy_hill_climbing.enableMinEpsilonRate();
741    __local_search_with_tabu_list.enableMinEpsilonRate();
742  };

double gum::learning::genericBNLearner::epsilon ( ) const
inline virtual inherited

Get the value of epsilon.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 681 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, gum::ApproximationScheme::epsilon(), and GUM_ERROR.

681  {
682  if (__current_algorithm != nullptr)
683  return __current_algorithm->epsilon();
684  else
685  GUM_ERROR(FatalError, "No chosen algorithm for learning");
686  };

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const Arc arc)
inherited

Definition at line 227 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::eraseArc().

Referenced by gum::learning::genericBNLearner::eraseForbiddenArc().

227  {
228    __constraint_ForbiddenArcs.eraseArc(arc);
229  }

INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 238 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::eraseForbiddenArc().

239  {
240  eraseForbiddenArc(Arc(tail, head));
241  }


INLINE void gum::learning::genericBNLearner::eraseForbiddenArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 250 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::eraseForbiddenArc(), and gum::learning::genericBNLearner::Database::idFromName().

251  {
252  eraseForbiddenArc(Arc(idFromName(tail), idFromName(head)));
253  }

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const Arc arc)
inherited

Definition at line 266 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::eraseArc().

Referenced by gum::learning::genericBNLearner::eraseMandatoryArc().

266  {
267    __constraint_MandatoryArcs.eraseArc(arc);
268  }

INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const NodeId  tail,
const NodeId  head 
)
inherited

Definition at line 289 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::eraseMandatoryArc().

290  {
291  eraseMandatoryArc(Arc(tail, head));
292  }


INLINE void gum::learning::genericBNLearner::eraseMandatoryArc ( const std::string &  tail,
const std::string &  head 
)
inherited

Definition at line 277 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::eraseMandatoryArc(), and gum::learning::genericBNLearner::Database::idFromName().

278  {
279  eraseMandatoryArc(Arc(idFromName(tail), idFromName(head)));
280  }

const std::vector< double >& gum::learning::genericBNLearner::history ( ) const
inline virtual inherited
Exceptions
OperationNotAllowed if the scheme has not been performed or verbosity=false

Implements gum::IApproximationSchemeConfiguration.

Definition at line 895 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::history().

895  {
896  if (__current_algorithm != nullptr)
897  return __current_algorithm->history();
898  else
899  GUM_ERROR(FatalError, "No chosen algorithm for learning");
900  };

INLINE NodeId gum::learning::genericBNLearner::idFromName ( const std::string &  var_name) const
inherited

returns the node id corresponding to a variable name

Exceptions
MissingVariableInDatabase if a variable of the BN is not found in the database.

Definition at line 86 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__score_database, and gum::learning::genericBNLearner::Database::idFromName().

86  {
87  return __score_database.idFromName(var_name);
88  }

bool gum::learning::genericBNLearner::isEnabledEpsilon ( ) const
inline virtual inherited
Returns
true if stopping criterion on epsilon is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 704 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledEpsilon().

704  {
705    if (__current_algorithm != nullptr)
706      return __current_algorithm->isEnabledEpsilon();
707    else
708      GUM_ERROR(FatalError, "No chosen algorithm for learning");
709  };

bool gum::learning::genericBNLearner::isEnabledMaxIter ( ) const
inline virtual inherited
Returns
true if stopping criterion on max iterations is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 786 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMaxIter().

786  {
787    if (__current_algorithm != nullptr)
788      return __current_algorithm->isEnabledMaxIter();
789    else
790      GUM_ERROR(FatalError, "No chosen algorithm for learning");
791  };

bool gum::learning::genericBNLearner::isEnabledMaxTime ( ) const
inline virtual inherited
Returns
true if stopping criterion on timeout is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 835 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMaxTime().

835  {
836  if (__current_algorithm != nullptr)
838  else
839  GUM_ERROR(FatalError, "No chosen algorithm for learning");
840  };
bool isEnabledMaxTime() const
Returns true if stopping criterion on timeout is enabled, false otherwise.
const ApproximationScheme * __current_algorithm
#define GUM_ERROR(type, msg)
Definition: exceptions.h:66

+ Here is the call graph for this function:

bool gum::learning::genericBNLearner::isEnabledMinEpsilonRate ( ) const
inlinevirtualinherited
Returns
true if stopping criterion on epsilon rate is enabled, false otherwise

Implements gum::IApproximationSchemeConfiguration.

Definition at line 745 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::isEnabledMinEpsilonRate().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->isEnabledMinEpsilonRate();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

INLINE const std::vector< Arc > gum::learning::genericBNLearner::latentVariables ( ) const
inherited

get the list of arcs hiding latent variables

Exceptions
OperationNotAllowed  when 3off2 is not the selected algorithm

Definition at line 182 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__miic_3off2, gum::learning::genericBNLearner::__selected_algo, GUM_ERROR, gum::learning::Miic::latentVariables(), and gum::learning::genericBNLearner::MIIC_THREE_OFF_TWO.

{
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed, "Must be using the 3off2 algorithm");
  }
  return __miic_3off2.latentVariables();
}

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnBN ( )

learn a Bayes Net from a file (must have read the db before)
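A minimal end-to-end sketch of this call. The CSV filename, the chosen algorithm and the apriori are illustrative assumptions, not prescribed by the API, and the header paths are those of aGrUM 0.13:

```cpp
#include <agrum/BN/BayesNet.h>
#include <agrum/learning/BNLearner.h>

int main() {
  // read the database; "data.csv" is a hypothetical discrete-variable CSV file
  gum::learning::BNLearner<double> learner("data.csv");

  // optionally select an algorithm and an apriori before learning
  learner.useGreedyHillClimbing();
  learner.useAprioriSmoothing(1.0);

  // learn structure and parameters in one call
  gum::BayesNet<double> bn = learner.learnBN();
  return 0;
}
```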

DAG gum::learning::genericBNLearner::learnDAG ( )
inherited

learn a structure from a file (must have read the db before)

Definition at line 662 of file genericBNLearner.cpp.

References gum::learning::genericBNLearner::__createApriori(), gum::learning::genericBNLearner::__createScore(), and gum::learning::genericBNLearner::__learnDAG().

{
  // create the score and the apriori
  __createApriori();
  __createScore();

  return __learnDAG();
}

MixedGraph gum::learning::genericBNLearner::learnMixedStructure ( )
inherited

learn a partial structure from a file (must have read the db before and must have selected miic or 3off2)
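A usage sketch, assuming a hypothetical CSV file; miic or 3off2 must be selected first, otherwise an OperationNotAllowed exception is raised:

```cpp
#include <agrum/learning/BNLearner.h>

int main() {
  gum::learning::BNLearner<double> learner("data.csv");  // hypothetical file
  learner.useMIIC();  // mandatory: select miic (or call use3off2()) first
  gum::MixedGraph essential = learner.learnMixedStructure();
  return 0;
}
```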

Definition at line 652 of file genericBNLearner.cpp.

References gum::learning::genericBNLearner::__miic_3off2, gum::learning::genericBNLearner::__mutual_info, gum::learning::genericBNLearner::__prepare_miic_3off2(), gum::learning::genericBNLearner::__selected_algo, GUM_ERROR, gum::learning::Miic::learnMixedStructure(), and gum::learning::genericBNLearner::MIIC_THREE_OFF_TWO.

{
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed, "Must be using the miic/3off2 algorithm");
  }
  BNLearnerListener listener(this, __miic_3off2);
  // create the mixed graph
  MixedGraph mgraph = this->__prepare_miic_3off2();
  return __miic_3off2.learnMixedStructure(*__mutual_info, mgraph);
}

template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnParameters ( const DAG dag,
bool  take_into_account_score = true 
)

learns a BN (its parameters) when its structure is known

Parameters
dag  the structure of the Bayesian network
take_into_account_score  The dag passed in argument may have been learnt by a structure learning algorithm. In this case, if the score used to learn the structure has an implicit apriori (like K2, which has a 1-smoothing apriori), this implicit apriori should also be taken into account for parameter learning. By default, if a score exists, parameters are learnt by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is used.
template<typename GUM_SCALAR >
BayesNet< GUM_SCALAR > gum::learning::BNLearner< GUM_SCALAR >::learnParameters ( const BayesNet< GUM_SCALAR > &  bn,
bool  take_into_account_score = true 
)

learns a BN (its parameters) when its structure is known

Parameters
bn  the structure of the Bayesian network
take_into_account_score  The dag of the bn passed in argument may have been learnt by a structure learning algorithm. In this case, if the score used to learn the structure has an implicit apriori (like K2, which has a 1-smoothing apriori), this implicit apriori should also be taken into account for parameter learning. By default, if a score exists, parameters are learnt by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is used.
Exceptions
MissingVariableInDatabase  if a variable of the BN is not found in the database.
UnknownLabelInDatabase  if a label found in the database does not correspond to the variable.

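A sketch of parameter-only learning for a known structure. The BN `templ` and the CSV file are hypothetical; the variables of `templ` must match the database columns:

```cpp
#include <agrum/BN/BayesNet.h>
#include <agrum/learning/BNLearner.h>

int main() {
  gum::BayesNet<double> templ;
  // ... fill templ with variables and arcs matching the database ...

  gum::learning::BNLearner<double> learner("data.csv");  // hypothetical file
  // learns only the CPTs; throws MissingVariableInDatabase or
  // UnknownLabelInDatabase if templ and the database disagree
  gum::BayesNet<double> bn = learner.learnParameters(templ);
  return 0;
}
```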
Size gum::learning::genericBNLearner::maxIter ( ) const
inlinevirtualinherited
Returns
the criterion on number of iterations

Implements gum::IApproximationSchemeConfiguration.

Definition at line 765 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::maxIter().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->maxIter();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

double gum::learning::genericBNLearner::maxTime ( ) const
inlinevirtualinherited

returns the timeout (in seconds)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 807 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::maxTime().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->maxTime();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

INLINE std::string gum::IApproximationSchemeConfiguration::messageApproximationScheme ( ) const
inherited

Returns the approximation scheme message.

Returns
Returns the approximation scheme message.

Definition at line 38 of file IApproximationSchemeConfiguration_inl.h.

References gum::IApproximationSchemeConfiguration::Continue, gum::IApproximationSchemeConfiguration::Epsilon, gum::IApproximationSchemeConfiguration::epsilon(), gum::IApproximationSchemeConfiguration::Limit, gum::IApproximationSchemeConfiguration::maxIter(), gum::IApproximationSchemeConfiguration::maxTime(), gum::IApproximationSchemeConfiguration::minEpsilonRate(), gum::IApproximationSchemeConfiguration::Rate, gum::IApproximationSchemeConfiguration::stateApproximationScheme(), gum::IApproximationSchemeConfiguration::Stopped, gum::IApproximationSchemeConfiguration::TimeLimit, and gum::IApproximationSchemeConfiguration::Undefined.

Referenced by gum::ApproximationScheme::_stopScheme(), gum::ApproximationScheme::continueApproximationScheme(), and gum::credal::InferenceEngine< GUM_SCALAR >::getApproximationSchemeMsg().

{
  std::stringstream s;

  switch (stateApproximationScheme()) {
    case ApproximationSchemeSTATE::Continue: s << "in progress"; break;

    case ApproximationSchemeSTATE::Epsilon:
      s << "stopped with epsilon=" << epsilon();
      break;

    case ApproximationSchemeSTATE::Rate:
      s << "stopped with rate=" << minEpsilonRate();
      break;

    case ApproximationSchemeSTATE::Limit:
      s << "stopped with max iteration=" << maxIter();
      break;

    case ApproximationSchemeSTATE::TimeLimit:
      s << "stopped with timeout=" << maxTime();
      break;

    case ApproximationSchemeSTATE::Stopped: s << "stopped on request"; break;

    case ApproximationSchemeSTATE::Undefined: s << "undefined state"; break;
  };

  return s.str();
}

double gum::learning::genericBNLearner::minEpsilonRate ( ) const
inlinevirtualinherited

Get the value of the minimal epsilon rate.

Implements gum::IApproximationSchemeConfiguration.

Definition at line 724 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::minEpsilonRate().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->minEpsilonRate();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

INLINE const std::vector< Size > & gum::learning::genericBNLearner::modalities ( )
noexceptinherited

returns the modalities (domain sizes) of the variables in the database

Definition at line 356 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__score_database, and gum::learning::genericBNLearner::Database::modalities().

{
  return __score_database.modalities();
}

INLINE const std::string & gum::learning::genericBNLearner::nameFromId ( NodeId  id) const
inherited

returns the variable name corresponding to a given node id

Definition at line 91 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__score_database, and gum::learning::genericBNLearner::Database::nameFromId().

{
  return __score_database.nameFromId(id);
}

INLINE const std::vector< std::string > & gum::learning::genericBNLearner::names ( ) const
inherited

returns the names of the variables in the database

Definition at line 351 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__score_database, and gum::learning::genericBNLearner::Database::names().

{
  return __score_database.names();
}

Size gum::learning::genericBNLearner::nbrIterations ( ) const
inlinevirtualinherited
Exceptions
OperationNotAllowed  if the scheme has not been performed

Implements gum::IApproximationSchemeConfiguration.

Definition at line 887 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::nbrIterations().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->nbrIterations();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

template<typename GUM_SCALAR >
BNLearner& gum::learning::BNLearner< GUM_SCALAR >::operator= ( const BNLearner< GUM_SCALAR > &  )

copy operator

template<typename GUM_SCALAR >
BNLearner& gum::learning::BNLearner< GUM_SCALAR >::operator= ( BNLearner< GUM_SCALAR > &&  )

move operator

Size gum::learning::genericBNLearner::periodSize ( ) const
inlinevirtualinherited

returns the number of samples between two tests of the stopping criteria

Exceptions
OutOfLowerBound  if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 852 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::periodSize().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->periodSize();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

INLINE void gum::learning::genericBNLearner::setAprioriWeight ( double  weight)
inherited

sets the apriori weight

Definition at line 301 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__apriori_weight, gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), and GUM_ERROR.

Referenced by gum::learning::genericBNLearner::useAprioriSmoothing().

{
  if (weight < 0) {
    GUM_ERROR(OutOfBounds, "the weight of the apriori must be positive");
  }

  __apriori_weight = weight;
  __checkScoreAprioriCompatibility();
}

INLINE void gum::learning::genericBNLearner::setCurrentApproximationScheme ( const ApproximationScheme approximationScheme)
inlineinherited

distribute signals

Definition at line 646 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm.

Referenced by gum::learning::BNLearnerListener::BNLearnerListener(), gum::learning::genericBNLearner::distributeProgress(), and gum::learning::genericBNLearner::distributeStop().

{
  __current_algorithm = approximationScheme;
}

void gum::learning::genericBNLearner::setEpsilon ( double  eps)
inlinevirtualinherited

Given that we approximate f(t), stopping criterion on |f(t+1)-f(t)|. If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound  if eps < 0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 674 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setEpsilon().

{
  __K2.approximationScheme().setEpsilon(eps);
  __greedy_hill_climbing.setEpsilon(eps);
  __local_search_with_tabu_list.setEpsilon(eps);
};

INLINE void gum::learning::genericBNLearner::setForbiddenArcs ( const ArcSet set)
inherited

assign a set of forbidden arcs

Definition at line 217 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_ForbiddenArcs, and gum::learning::StructuralConstraintForbiddenArcs::setArcs().

{
  __constraint_ForbiddenArcs.setArcs(set);
}

INLINE void gum::learning::genericBNLearner::setInitialDAG ( const DAG dag)
inherited

sets an initial DAG structure

Definition at line 96 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__initial_dag.

{
  __initial_dag = dag;
}

INLINE void gum::learning::genericBNLearner::setMandatoryArcs ( const ArcSet set)
inherited

assign a set of mandatory arcs

Definition at line 256 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_MandatoryArcs, and gum::learning::StructuralConstraintMandatoryArcs::setArcs().

{
  __constraint_MandatoryArcs.setArcs(set);
}

INLINE void gum::learning::genericBNLearner::setMaxIndegree ( Size  max_indegree)
inherited

sets the max indegree

Definition at line 137 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_Indegree, and gum::learning::StructuralConstraintIndegree::setMaxIndegree().

{
  __constraint_Indegree.setMaxIndegree(max_indegree);
}

void gum::learning::genericBNLearner::setMaxIter ( Size  max)
inlinevirtualinherited

stopping criterion on the number of iterations. If the criterion was disabled it will be enabled.

Parameters
max  The maximum number of iterations
Exceptions
OutOfLowerBound  if max <= 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 758 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMaxIter().

{
  __K2.approximationScheme().setMaxIter(max);
  __greedy_hill_climbing.setMaxIter(max);
  __local_search_with_tabu_list.setMaxIter(max);
};

void gum::learning::genericBNLearner::setMaxTime ( double  timeout)
inlinevirtualinherited

stopping criterion on timeout. If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound  if timeout <= 0.0; the timeout is expressed in seconds (double)
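The stopping criteria apply to whichever iterative algorithm is currently selected; a configuration sketch (the filename and threshold values are illustrative):

```cpp
gum::learning::BNLearner<double> learner("data.csv");  // hypothetical file
learner.useLocalSearchWithTabuList();

// stop after 10 seconds, or 100 iterations, or when |f(t+1)-f(t)| < 1e-4,
// whichever happens first
learner.setMaxTime(10.0);
learner.setMaxIter(100);
learner.setEpsilon(1e-4);
```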

Implements gum::IApproximationSchemeConfiguration.

Definition at line 800 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMaxTime().

{
  __K2.approximationScheme().setMaxTime(timeout);
  __greedy_hill_climbing.setMaxTime(timeout);
  __local_search_with_tabu_list.setMaxTime(timeout);
}

void gum::learning::genericBNLearner::setMinEpsilonRate ( double  rate)
inlinevirtualinherited

Given that we approximate f(t), stopping criterion on d/dt(|f(t+1)-f(t)|). If the criterion was disabled it will be enabled.

Exceptions
OutOfLowerBound  if rate < 0

Implements gum::IApproximationSchemeConfiguration.

Definition at line 717 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setMinEpsilonRate().

{
  __K2.approximationScheme().setMinEpsilonRate(rate);
  __greedy_hill_climbing.setMinEpsilonRate(rate);
  __local_search_with_tabu_list.setMinEpsilonRate(rate);
};

void gum::learning::genericBNLearner::setPeriodSize ( Size  p)
inlinevirtualinherited

sets the number of samples between two tests of the stopping criteria

Exceptions
OutOfLowerBound  if p < 1

Implements gum::IApproximationSchemeConfiguration.

Definition at line 846 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setPeriodSize().

{
  __K2.approximationScheme().setPeriodSize(p);
  __greedy_hill_climbing.setPeriodSize(p);
  __local_search_with_tabu_list.setPeriodSize(p);
};

INLINE void gum::learning::genericBNLearner::setSliceOrder ( const NodeProperty< NodeId > &  slice_order)
inherited

sets a partial order on the nodes

Definition at line 296 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_SliceOrder.

{
  __constraint_SliceOrder = StructuralConstraintSliceOrder(slice_order);
}

void gum::learning::genericBNLearner::setVerbosity ( bool  v)
inlinevirtualinherited

sets the verbosity on (true) or off (false)

Implements gum::IApproximationSchemeConfiguration.

Definition at line 862 of file genericBNLearner.h.

References gum::learning::K2::approximationScheme(), and gum::ApproximationScheme::setVerbosity().

{
  __K2.approximationScheme().setVerbosity(v);
  __greedy_hill_climbing.setVerbosity(v);
  __local_search_with_tabu_list.setVerbosity(v);
};

ApproximationSchemeSTATE gum::learning::genericBNLearner::stateApproximationScheme ( ) const
inlinevirtualinherited

returns the approximation scheme state

Implements gum::IApproximationSchemeConfiguration.

Definition at line 879 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::stateApproximationScheme().

{
  if (__current_algorithm != nullptr)
    return __current_algorithm->stateApproximationScheme();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
};

INLINE void gum::learning::genericBNLearner::use3off2 ( )
noexceptinherited

indicate that we wish to use 3off2

Definition at line 142 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__miic_3off2, gum::learning::genericBNLearner::__selected_algo, gum::learning::genericBNLearner::MIIC_THREE_OFF_TWO, and gum::learning::Miic::set3off2Behaviour().

{
  __selected_algo = AlgoType::MIIC_THREE_OFF_TWO;
  __miic_3off2.set3off2Behaviour();
}

INLINE void gum::learning::genericBNLearner::useAprioriDirichlet ( const std::string &  filename)
inherited

use the Dirichlet apriori

Definition at line 327 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__apriori_dbname, gum::learning::genericBNLearner::__apriori_type, gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), and gum::learning::genericBNLearner::DIRICHLET_FROM_DATABASE.

{
  __apriori_dbname = filename;
  __apriori_type = AprioriType::DIRICHLET_FROM_DATABASE;
  __checkScoreAprioriCompatibility();
}

INLINE void gum::learning::genericBNLearner::useAprioriSmoothing ( double  weight = -1)
inherited

use the apriori smoothing

Parameters
weight  the weight to assign to the smoothing; if negative (the default), the current weight of the genericBNLearner is used instead.
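The weight behaviour can be sketched as follows (the filename and weights are illustrative):

```cpp
gum::learning::BNLearner<double> learner("data.csv");  // hypothetical file

learner.useAprioriSmoothing(0.5);  // explicit weight: 0.5-smoothing

// or reuse the learner's current apriori weight (the default argument
// is negative, so no new weight is assigned):
learner.setAprioriWeight(1.0);
learner.useAprioriSmoothing();
```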

Definition at line 317 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__apriori_type, gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), gum::learning::genericBNLearner::setAprioriWeight(), and gum::learning::genericBNLearner::SMOOTHING.

{
  __apriori_type = AprioriType::SMOOTHING;

  if (weight >= 0) { setAprioriWeight(weight); }

  __checkScoreAprioriCompatibility();
}

INLINE void gum::learning::genericBNLearner::useGreedyHillClimbing ( )
noexceptinherited

indicate that we wish to use a greedy hill climbing algorithm

Definition at line 203 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__selected_algo, and gum::learning::genericBNLearner::GREEDY_HILL_CLIMBING.

INLINE void gum::learning::genericBNLearner::useK2 ( const Sequence< NodeId > &  order)
noexceptinherited

indicate that we wish to use K2

Definition at line 190 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__K2, gum::learning::genericBNLearner::__selected_algo, gum::learning::genericBNLearner::K2, and gum::learning::K2::setOrder().

{
  __selected_algo = AlgoType::K2;
  __K2.setOrder(order);
}

INLINE void gum::learning::genericBNLearner::useK2 ( const std::vector< NodeId > &  order)
noexceptinherited

indicate that we wish to use K2

Definition at line 197 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__K2, gum::learning::genericBNLearner::__selected_algo, gum::learning::genericBNLearner::K2, and gum::learning::K2::setOrder().

{
  __selected_algo = AlgoType::K2;
  __K2.setOrder(order);
}

INLINE void gum::learning::genericBNLearner::useLocalSearchWithTabuList ( Size  tabu_size = 100,
Size  nb_decrease = 2 
)
noexceptinherited

indicate that we wish to use a local search with tabu list

Parameters
tabu_size  the size of the tabu list
nb_decrease  the maximum number of consecutive changes decreasing the score that we allow to apply
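A usage sketch (the filename and sizes are illustrative):

```cpp
gum::learning::BNLearner<double> learner("data.csv");  // hypothetical file
// tabu list of 50 entries; allow up to 5 consecutive score-decreasing changes
learner.useLocalSearchWithTabuList(50, 5);
gum::BayesNet<double> bn = learner.learnBN();
```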

Definition at line 209 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__constraint_TabuList, gum::learning::genericBNLearner::__local_search_with_tabu_list, gum::learning::genericBNLearner::__selected_algo, gum::learning::genericBNLearner::LOCAL_SEARCH_WITH_TABU_LIST, gum::learning::LocalSearchWithTabuList::setMaxNbDecreasingChanges(), and gum::learning::StructuralConstraintTabuList::setTabuListSize().

{
  __selected_algo = AlgoType::LOCAL_SEARCH_WITH_TABU_LIST;
  __constraint_TabuList.setTabuListSize(tabu_size);
  __local_search_with_tabu_list.setMaxNbDecreasingChanges(nb_decrease);
}

INLINE void gum::learning::genericBNLearner::useMDL ( )
inherited

indicate that we wish to use the MDL correction for 3off2

Exceptions
OperationNotAllowed  when 3off2 is not the selected algorithm

Definition at line 163 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__mutual_info, gum::learning::genericBNLearner::__score_database, gum::learning::genericBNLearner::__selected_algo, GUM_ERROR, gum::learning::genericBNLearner::MIIC_THREE_OFF_TWO, gum::learning::genericBNLearner::Database::modalities(), gum::learning::genericBNLearner::Database::parser(), and gum::learning::CorrectedMutualInformation< IdSetAlloc, CountAlloc >::useMDL().

{
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed, "Must be using the 3off2 algorithm");
  }
  __mutual_info = new CorrectedMutualInformation<>(
     __score_database.parser(), __score_database.modalities());
  __mutual_info->useMDL();
}

INLINE void gum::learning::genericBNLearner::useMIIC ( )
noexceptinherited

indicate that we wish to use MIIC

Definition at line 148 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__miic_3off2, gum::learning::genericBNLearner::__selected_algo, gum::learning::genericBNLearner::MIIC_THREE_OFF_TWO, and gum::learning::Miic::setMiicBehaviour().

{
  __selected_algo = AlgoType::MIIC_THREE_OFF_TWO;
  __miic_3off2.setMiicBehaviour();
}

INLINE void gum::learning::genericBNLearner::useNML ( )
inherited

indicate that we wish to use the NML correction for 3off2

Exceptions
OperationNotAllowed  when 3off2 is not the selected algorithm

Definition at line 154 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__mutual_info, gum::learning::genericBNLearner::__score_database, gum::learning::genericBNLearner::__selected_algo, GUM_ERROR, gum::learning::genericBNLearner::MIIC_THREE_OFF_TWO, gum::learning::genericBNLearner::Database::modalities(), gum::learning::genericBNLearner::Database::parser(), and gum::learning::CorrectedMutualInformation< IdSetAlloc, CountAlloc >::useNML().

Referenced by gum::learning::genericBNLearner::__prepare_miic_3off2().

INLINE void genericBNLearner::useNML() {
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed, "Must be using the 3off2 algorithm");
  }
  __mutual_info = new CorrectedMutualInformation<>(
     __score_database.parser(), __score_database.modalities());
  __mutual_info->useNML();
}


INLINE void gum::learning::genericBNLearner::useNoApriori ( )
inherited

use no apriori

Definition at line 311 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__apriori_type, gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), and gum::learning::genericBNLearner::NO_APRIORI.

INLINE void genericBNLearner::useNoApriori() {
  __apriori_type = AprioriType::NO_APRIORI;
  __checkScoreAprioriCompatibility();
}


INLINE void gum::learning::genericBNLearner::useNoCorr ( )
inherited

indicate that we wish to use the NoCorr correction for 3off2

Exceptions
OperationNotAllowed  when 3off2 is not the selected algorithm

Definition at line 172 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__mutual_info, gum::learning::genericBNLearner::__score_database, gum::learning::genericBNLearner::__selected_algo, GUM_ERROR, gum::learning::genericBNLearner::MIIC_THREE_OFF_TWO, gum::learning::genericBNLearner::Database::modalities(), gum::learning::genericBNLearner::Database::parser(), and gum::learning::CorrectedMutualInformation< IdSetAlloc, CountAlloc >::useNoCorr().

INLINE void genericBNLearner::useNoCorr() {
  if (__selected_algo != AlgoType::MIIC_THREE_OFF_TWO) {
    GUM_ERROR(OperationNotAllowed, "Must be using the 3off2 algorithm");
  }
  __mutual_info = new CorrectedMutualInformation<>(
     __score_database.parser(), __score_database.modalities());
  __mutual_info->useNoCorr();
}


INLINE void gum::learning::genericBNLearner::useScoreAIC ( )
inherited

indicate that we wish to use an AIC score

Definition at line 101 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), gum::learning::genericBNLearner::__score_type, and gum::learning::genericBNLearner::AIC.

INLINE void genericBNLearner::useScoreAIC() {
  __score_type = ScoreType::AIC;
  __checkScoreAprioriCompatibility();
}


INLINE void gum::learning::genericBNLearner::useScoreBD ( )
inherited

indicate that we wish to use a BD score

Definition at line 107 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), gum::learning::genericBNLearner::__score_type, and gum::learning::genericBNLearner::BD.

INLINE void genericBNLearner::useScoreBD() {
  __score_type = ScoreType::BD;
  __checkScoreAprioriCompatibility();
}


INLINE void gum::learning::genericBNLearner::useScoreBDeu ( )
inherited

indicate that we wish to use a BDeu score

Definition at line 113 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), gum::learning::genericBNLearner::__score_type, and gum::learning::genericBNLearner::BDeu.

INLINE void genericBNLearner::useScoreBDeu() {
  __score_type = ScoreType::BDeu;
  __checkScoreAprioriCompatibility();
}


INLINE void gum::learning::genericBNLearner::useScoreBIC ( )
inherited

indicate that we wish to use a BIC score

Definition at line 119 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), gum::learning::genericBNLearner::__score_type, and gum::learning::genericBNLearner::BIC.

INLINE void genericBNLearner::useScoreBIC() {
  __score_type = ScoreType::BIC;
  __checkScoreAprioriCompatibility();
}


INLINE void gum::learning::genericBNLearner::useScoreK2 ( )
inherited

indicate that we wish to use a K2 score

Definition at line 125 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), gum::learning::genericBNLearner::__score_type, and gum::learning::genericBNLearner::K2.

INLINE void genericBNLearner::useScoreK2() {
  __score_type = ScoreType::K2;
  __checkScoreAprioriCompatibility();
}


INLINE void gum::learning::genericBNLearner::useScoreLog2Likelihood ( )
inherited

indicate that we wish to use a Log2Likelihood score

Definition at line 131 of file genericBNLearner_inl.h.

References gum::learning::genericBNLearner::__checkScoreAprioriCompatibility(), gum::learning::genericBNLearner::__score_type, and gum::learning::genericBNLearner::LOG2LIKELIHOOD.

INLINE void genericBNLearner::useScoreLog2Likelihood() {
  __score_type = ScoreType::LOG2LIKELIHOOD;
  __checkScoreAprioriCompatibility();
}


bool gum::learning::genericBNLearner::verbosity ( ) const
inline, virtual, inherited

verbosity

Implements gum::IApproximationSchemeConfiguration.

Definition at line 868 of file genericBNLearner.h.

References gum::learning::genericBNLearner::__current_algorithm, GUM_ERROR, and gum::ApproximationScheme::verbosity().

bool genericBNLearner::verbosity() const {
  if (__current_algorithm != nullptr)
    return __current_algorithm->verbosity();
  else
    GUM_ERROR(FatalError, "No chosen algorithm for learning");
}


Member Data Documentation

Database* gum::learning::genericBNLearner::__apriori_database {nullptr}
protected, inherited
std::string gum::learning::genericBNLearner::__apriori_dbname
protected, inherited
double gum::learning::genericBNLearner::__apriori_weight {1.0f}
protected, inherited
StructuralConstraintIndegree gum::learning::genericBNLearner::__constraint_Indegree
protected, inherited
StructuralConstraintSliceOrder gum::learning::genericBNLearner::__constraint_SliceOrder
protected, inherited
StructuralConstraintTabuList gum::learning::genericBNLearner::__constraint_TabuList
protected, inherited
GreedyHillClimbing gum::learning::genericBNLearner::__greedy_hill_climbing
protected, inherited

the greedy hill climbing algorithm

Definition at line 561 of file genericBNLearner.h.

Referenced by gum::learning::genericBNLearner::__learnDAG(), and gum::learning::genericBNLearner::operator=().

DAG gum::learning::genericBNLearner::__initial_dag
protected, inherited
K2 gum::learning::genericBNLearner::__K2
protected, inherited
LocalSearchWithTabuList gum::learning::genericBNLearner::__local_search_with_tabu_list
protected, inherited
bool gum::learning::genericBNLearner::__modalities_parse_db {false}
protected, inherited

indicates whether we shall parse the database to update __user_modalities

Definition at line 574 of file genericBNLearner.h.

Referenced by gum::learning::genericBNLearner::operator=().

ParamEstimator* gum::learning::genericBNLearner::__param_estimator {nullptr}
protected, inherited
ParamEstimatorType gum::learning::genericBNLearner::__param_estimator_type {ParamEstimatorType::ML}
protected, inherited

the type of the parameter estimator

Definition at line 519 of file genericBNLearner.h.

Referenced by gum::learning::genericBNLearner::__createParamEstimator(), and gum::learning::genericBNLearner::operator=().

NodeProperty< Sequence< std::string > > gum::learning::genericBNLearner::__user_modalities
protected, inherited

indicates the values the user specified for the translators

Definition at line 570 of file genericBNLearner.h.

Referenced by gum::learning::genericBNLearner::__createApriori(), and gum::learning::genericBNLearner::operator=().

Signaler1< std::string > gum::IApproximationSchemeConfiguration::onStop
inherited

Criteria messageApproximationScheme.

Definition at line 61 of file IApproximationSchemeConfiguration.h.

Referenced by gum::ApproximationScheme::_stopScheme(), and gum::learning::genericBNLearner::distributeStop().


The documentation for this class was generated from the following file: