---
title: Argument mapping
format: markdown
categories: Cause_prioritization Cause_areas
...
TODO: describe the problem (people disagree about important things, rarely change their minds, and are bad at crossing [inferential distances](http://wiki.lesswrong.com/wiki/Inferential_distance); see e.g. [The Hanson-Yudkowsky AI-Foom Debate](http://wiki.lesswrong.com/wiki/The_Hanson-Yudkowsky_AI-Foom_Debate)).
Argument mapping may be a good complement to [Forecasting]() methods, so that forecasters have explicit arguments to update on.
Check out argument mapping software; see the [Wikipedia page on argument mapping](https://en.wikipedia.org/wiki/Argument_map) and the LessWrong wiki's list of [debate tools](http://wiki.lesswrong.com/wiki/Debate_tools).
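
As a rough illustration of the structure such software works with (a minimal sketch with made-up names, not any particular tool's data model): an argument map is essentially a tree of claims connected by "supports" and "opposes" relations.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of an argument map: each claim carries lists of
# supporting premises and opposing objections, forming a tree.

@dataclass
class Claim:
    text: str
    supports: list["Claim"] = field(default_factory=list)
    opposes: list["Claim"] = field(default_factory=list)

def render(claim: Claim, prefix: str = "", depth: int = 0) -> None:
    """Print the map as an indented tree, marking support [+] and opposition [-]."""
    print("  " * depth + prefix + claim.text)
    for premise in claim.supports:
        render(premise, "[+] ", depth + 1)
    for objection in claim.opposes:
        render(objection, "[-] ", depth + 1)

# Illustrative example, not a real map from any tool:
root = Claim("Argument mapping is worth funding")
root.supports.append(Claim("People rarely update in unstructured debate"))
root.opposes.append(Claim("Existing debate tools have seen little adoption"))
render(root)
```

Real tools add more on top of this (evidence weights, shared editing, visualization), but the claim/support/oppose skeleton is the common core.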
[Ought](https://ought.org/) is a new company with a [team](https://ought.org/team) of EA or EA-adjacent people.
"Research on the underlying models and methods he uses to resolve disagreements between individuals with large inferential gaps. He will develop lessons to transfer these to test students, and see what skills are transferred and what skills still lack."[^salvatier] I haven't seen anything public about this so not sure what's going on.
TODO: list people who are interested in this?
# See also
- [Better explanations]()
# External links
- Wei Dai has [some](http://lesswrong.com/lw/1gg/agree_retort_or_ignore_a_post_from_the_future/ "Wei Dai. “Agree, Retort, or Ignore? A Post From the Future”. LessWrong. Retrieved February 17, 2018.") [ideas](http://lesswrong.com/lw/5/issues_bugs_and_requested_features/11mr "“Wei_Dai comments on Issues, Bugs, and Requested Features”. LessWrong. Retrieved February 17, 2018.")
[^salvatier]: [“Effective Altruism Grants donations made to John Salvatier”](https://donations.vipulnaik.com/donorDonee.php?donor=Effective+Altruism+Grants&donee=John+Salvatier). Retrieved February 17, 2018.