
adding BayesNet #3

Open

jac2130 wants to merge 6 commits into master from jac2130:master
Conversation

@jac2130 commented Oct 3, 2013

Dear Professor Downey,

I thought that it would be useful to have a class called "BayesNet", which inherits from both a "DiGraph" class, the directed graph class in NetworkX, and from "Joint", your class for joint distributions. So I quickly cooked up such a class. It currently handles binary variables and is encoded according to Pearl's "Noisy-OR" model (Pearl 1988). The idea is to provide a way to quickly build a model of the causal relationships between k variables in some system of interest. In particular, I am interested in modeling how people might reason differently about such relationships and how differently structured causal models might be compared in the light of data (I'm following the ideas of the cognitive scientists Tom Griffiths and Joshua Tenenbaum).

Right now, every independent variable has the same marginal probability 'p' of taking on the value 1, and every caused variable is linked to each of its causes by the same causal-effect parameter "causal_effect". As this is just a first attempt at coding such a thing up, it isn't very clean yet and doesn't handle a great deal: variables are restricted to be binary, independent variables all have the same probability of being 1, and causal effects are all the same. For example, it would be nice to allow negative causation; what if someone believes that some variables have preventative effects on other variables? It's just a beginning, but I'm continuously working on it. If you have any ideas, I'd be grateful!
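
Schematically, the construction is something like the sketch below. This is a simplified illustration rather than the actual code in this pull request: only the names BayesNet, DiGraph, and Joint correspond to the real classes (assuming thinkbayes.py from this repository is importable), while the method name compute_joint and the example nodes are made up.

    import itertools

    import networkx
    import thinkbayes  # this repository's module; Joint is defined there


    class BayesNet(networkx.DiGraph, thinkbayes.Joint):
        """A directed graph of binary variables with a noisy-OR joint distribution."""

        def __init__(self, p=0.5, causal_effect=0.5):
            networkx.DiGraph.__init__(self)
            thinkbayes.Joint.__init__(self)
            self.p = p                          # marginal P(X=1) for every root node
            self.causal_effect = causal_effect  # shared strength of every edge

        def compute_joint(self):
            """Fills in the joint Pmf over all 0/1 assignments of the nodes."""
            nodes = list(self.nodes())
            for values in itertools.product([0, 1], repeat=len(nodes)):
                state = dict(zip(nodes, values))
                prob = 1.0
                for node in nodes:
                    parents = list(self.predecessors(node))
                    if not parents:
                        # independent (root) variable
                        p_on = self.p
                    else:
                        # noisy-OR: the node is off only if every active cause fails
                        active = sum(state[u] for u in parents)
                        p_on = 1 - (1 - self.causal_effect) ** active
                    prob *= p_on if state[node] else 1 - p_on
                self.Set(tuple(sorted(state.items())), prob)


    # hypothetical usage
    net = BayesNet(p=0.3, causal_effect=0.8)
    net.add_nodes_from(['rain', 'sprinkler', 'wet_grass'])
    net.add_edges_from([('rain', 'wet_grass'), ('sprinkler', 'wet_grass')])
    net.compute_joint()
    net.Print()  # prints every assignment with its probability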

Johannes

@AllenDowney (Owner) commented Oct 3, 2013

Hi Johannes,

This looks very interesting. Thanks for sending it along. I would have liked to get a Bayesian network into the book, but I didn't have a good case study to motivate it. And I had to stop somewhere!

But I will keep an eye on your project and maybe we can talk about incorporating it at some point.

Thanks very much!
Allen


@jac2130 (Author) commented Oct 3, 2013

That would be great! I'm working on a paper right now that uses Bayesian networks (and this code); I'll send it to you when I have a draft. I've really enjoyed your books, and I even used "Think Complexity" in a course on Complexity Science that I co-taught with my friend and colleague James Rising at Columbia University last semester. I learned a lot from your work; thank you very much!

Johannes


@jac2130 (Author) commented Oct 4, 2013

Today I generalized a few things: the class now works with negative causation and with heterogeneous causal effects (in the form of edge weights).
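
To illustrate the idea (again just a sketch, not the exact code in the commits; the function and node names are made up): with per-edge weights, the conditional probability of a child being on can combine a noisy-OR over its active positive causes with an independent blocking chance for each active negative cause.

    import networkx


    def prob_child_on(graph, child, state):
        """P(child = 1 | parent values) with per-edge causal weights.

        A positive weight w acts as a generative noisy-OR cause; a negative
        weight is read as preventative, blocking the child independently with
        probability |w|.  Sketch only; function and node names are made up.
        """
        fail_generative = 1.0   # prob. that every active positive cause fails
        pass_preventive = 1.0   # prob. that every active preventer fails to block
        for parent in graph.predecessors(child):
            if not state[parent]:
                continue  # inactive parents contribute nothing
            w = graph[parent][child]['weight']
            if w >= 0:
                fail_generative *= 1 - w
            else:
                pass_preventive *= 1 - abs(w)
        return (1 - fail_generative) * pass_preventive


    # hypothetical usage
    g = networkx.DiGraph()
    g.add_weighted_edges_from([('smoking', 'cancer', 0.7),
                               ('exercise', 'cancer', -0.4)])
    print(prob_child_on(g, 'cancer', {'smoking': 1, 'exercise': 1}))  # 0.7 * 0.6 = 0.42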

Johannes

jac2130 added 4 commits October 4, 2013 17:02
…uses of other variables. However, right now, those independent variables have to be added, using add_nodes, before edges are added, using add_edges_from. This is not very elegant, but it works for now.
…into the BayesNet object. The default is a beta distribution with alpha=beta=2.
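
The second commit message above mentions a default Beta(alpha=2, beta=2) prior over the causal-effect parameters; that is the symmetric prior centered on 0.5 that excludes the extremes 0 and 1. A minimal illustration using scipy (the variable name is hypothetical):

    from scipy.stats import beta

    # Default prior over a causal-effect parameter: Beta(alpha=2, beta=2)
    causal_effect_prior = beta(2, 2)
    print(causal_effect_prior.mean())    # 0.5
    print(causal_effect_prior.pdf(0.5))  # 1.5, the density at the mode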
@johndavidmiller

I really like the idea of a Think book on Bayes nets and probabilistic graphical models in general. Perhaps the most expedient route would be some sort of collaboration with David Barber, author of "the pink book" on Bayesian machine learning. That's a fantastic book, with a nice companion toolkit, but alas his code is in MATLAB, which was an unfortunate choice, IMHO.

Maybe your book could be titled "Think Graph." And if you need reviewers or contributors, I'd love to help.

-- jdm
John David Miller
Principal Research Data Scientist
Intel Corporation

@AllenDowney (Owner)

@johndavidmiller Thanks for these comments. This is definitely something I would like to get to some day!

@jac2130 I have not forgotten about this issue, but I have been working on other things. I am teaching my Bayesian stats class this fall, which might create the opportunity for me to get back to this. I am looking forward to the possibility of bringing in this capability.
