I am a graduate student in the Di2Ag laboratory at Dartmouth College, and I would love to collaborate on this project with anyone who has an interest in graphical models; send me an email at [email protected]. If you're a researcher or student who wants to use this module, I am happy to give an overview of the code and its functionality, or to answer any questions.
This code base is essentially the same as the "neuroBN" package found at www.github.com/ncullen93/neuroBN. I maintain two separate repositories because I expect the two projects to diverge sharply in the near future.
For an up-to-date list of issues, go to the "issues" tab in this repository. Below is an updated list of features, along with information on usage/examples:
- Marginal Inference
  - Exact Marginal Inference
    - Sum-Product Variable Elimination
    - Clique Tree Message Passing
  - Approximate Marginal Inference
    - Forward Sampling
    - Likelihood Weighted Sampling
    - Gibbs (MCMC) Sampling
    - Loopy Belief Propagation
- MAP Inference
  - Exact MAP Inference
    - Max-Product Variable Elimination
    - Integer Linear Programming
  - Approximate MAP Inference
    - LP Relaxation
- Constraint-Based Structure Learning
  - Algorithms
    - PC
    - Grow-Shrink
    - IAMB/Lambda-IAMB/Fast-IAMB
  - Independence Tests
    - Marginal Mutual Information
    - Conditional Mutual Information
    - Pearson Chi-Square
- Score-Based Structure Learning
  - Algorithms
    - Greedy Hill Climbing
    - Tabu Search
    - Random Restarts
  - Scoring Functions
    - BIC/AIC/MDL
    - BDe/BDeu/K2
- Tree-Based Structure Learning
  - Naive Bayes
  - Tree-Augmented Naive Bayes
  - Chow-Liu
- Hybrid Structure Learning
  - MMPC
  - MMHC
- Exact Structure Learning
  - GOBNILP Solver
- Parameter Learning
  - Maximum Likelihood Estimation
  - Dirichlet-Multinomial Estimation
- Classification
  - Naive Bayes
  - Tree-Augmented Naive Bayes
  - General DAG
- Random Network Generation
  - Empty/Tree/Polytree/Forest
  - General DAG
- Comparing Two Bayesian Networks
  - Structure-Based Distance Metrics
    - Missing Edges
    - Extra Edges
    - Incorrect Edge Orientation
    - Hamming Distance
  - Parameter-Based Distance Metrics
    - KL-Divergence and JS-Divergence
    - Manhattan and Euclidean
    - Hellinger
    - Minkowski
  - Determine Class Equivalence
- Utility Functionality
  - Discretize continuous data
  - Orient a PDAG
  - Generate random sample dataset from a BN
  - Markov Blanket operations

This package includes a number of examples to help users get acquainted with the intuitive syntax and functionality of pyBN. For an updated list of examples, check out the collection of ipython notebooks in the "examples" folder located in the master directory. Here is a list of the current examples:
- ReadWrite : an introduction to reading (writing) BayesNet objects from (to) files, along with an overview of the attributes and data structures inherent to BayesNet objects (a short sketch of this workflow follows this list).
- Drawing : an introduction to the drawing/plotting capabilities of pyBN with both small and large Bayesian networks.
- FactorOperations : an introduction to the Factor class, an exploration of the numerous attributes belonging to a Factor in pyBN, an overview of every Factor operation function available to the user, and a short discussion of what makes Factor operations so fast and efficient in pyBN.
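To give a taste of the first and third notebooks, here is a minimal sketch of loading a network from file and creating a Factor from it. The names used here (`read_bn`, `write_bn`, `Factor`, and the `scope`/`cpt` attributes) are assumptions drawn from the examples notebooks, and the file paths are placeholders for any network file in the "data" folder.

```python
from pyBN import *

# Load a network from file; the reader infers the format from the file
# extension (placeholder path -- substitute any file in the "data" folder).
bn = read_bn('data/cancer.bif')

# Create a Factor over one of the network's variables, as in the
# FactorOperations notebook, and inspect it.
phi = Factor(bn, bn.V[0])
print(phi.scope)  # the variables the factor is defined over
print(phi.cpt)    # the factor's underlying table of values

# Write the network back out to a new file.
write_bn(bn, 'data/cancer_copy.bn')
```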
- Click "Download ZIP" button towards the upper right corner of the page.
- Unpack the ZIP file wherever you want on your local machine. You should now have a folder called "pyBN-master"
- In your python terminal, change directories to be IN pyBN-master. Typing "ls" should show you "data", "examples" and "pyBN" folders. Stay in the "pyBN-master" directory for now!
- In your python terminal, simply type "from pyBN import *". This will load all of the module's functions, classes, etc.
- You are now free to use the package! Perhaps you want to start by creating a BayesNet object with "bn = BayesNet()" and going from there; a minimal example session is sketched below.
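Putting the steps together, a first session might look like the following sketch. `BayesNet()` comes straight from the step above; `read_bn` and the attributes printed are assumptions based on the examples notebooks, and the filename is a placeholder.

```python
from pyBN import *   # loads the package's functions and classes

# Create an empty BayesNet object to build on...
bn = BayesNet()

# ...or load one of the networks shipped in the "data" folder
# (placeholder filename -- substitute a real one).
bn = read_bn('data/cmu.bn')

# Quick sanity check that the network loaded as expected.
print(bn.V)  # the network's random variables
print(bn.E)  # the edges of the underlying DAG
```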