Showing 1,017 changed files with 1,020 additions and 1,015 deletions.
@@ -1,5 +1,5 @@
 <!DOCTYPE html><html lang="zh-CN"><head><meta charset="utf-8"><meta name="X-UA-Compatible" content="IE=edge"><meta name="author" content="DSST/NIMH"><title>Abeles et al. (2023) · OpenCogData</title><meta name="description" content="People show vast variability in skill performance and learning. What determines a person’s individual performance and learning ability? In this study "><meta name="keywords"><meta content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=0" name="viewport"><meta content="yes" name="apple-mobile-web-app-capable"><meta content="black" name="apple-mobile-web-app-status-bar-style"><meta content="telephone=no" name="format-detection"><meta name="renderer" content="webkit"><link rel="shortcut icon" type="image/x-icon" href="/images/favicon.webp"><link rel="stylesheet" href="/OpenCogData/css/style.css"><link rel="stylesheet" href="/OpenCogData/css/blog_basic.css"><link rel="stylesheet" href="/OpenCogData/css/font-awesome.min.css"><link rel="stylesheet" href="/OpenCogData/css/insight.css"><link rel="stylesheet" href="/OpenCogData/css/search.css"><link rel="alternate" type="application/atom+xml" title="ATOM 1.0" href="/atom.xml"><script src="/OpenCogData/js/jquery.js"></script><!-- Global site tag (gtag.js) - Google Analytics--><script async src="https://www.googletagmanager.com/gtag/js?id=G-PTJE4Z001J"></script><script>window.dataLayer = window.dataLayer || [];
 function gtag(){dataLayer.push(arguments);}
 gtag('js', new Date());
-gtag('config', 'G-PTJE4Z001J');</script><meta name="generator" content="Hexo 7.1.1"></head><body><div class="page-top animated fadeInDown"><div class="nav"><li><a href="/OpenCogData">Home</a></li><li><a href="/OpenCogData/archives">Archives</a></li><li><a href="/OpenCogData/tags">Tags</a></li><li><a href="/OpenCogData/about">About</a></li><li><a href="/OpenCogData/contribute">Contribute</a></li></div><div class="information"><div class="nav_right_btn"><li><a class="fa fa-chevron-left" onclick="window.history.go(-1)"></a></li><li><a class="fa fa-search" onclick="openWindow();"></a></li></div><div class="avatar"><img src="/OpenCogData/images/logo.webp" alt="favicon"></div></div></div><div class="sidebar animated fadeInDown"><div class="sidebar-top"><div class="logo-title"><div class="title"><img src="/OpenCogData/images/logo.webp" style="width:175px;" alt="favicon"><h3 title=""><a href="/OpenCogData">OpenCogData</a></h3><div class="description"><p>A collection of publicly available<br>cognitve task datasets</p></div></div><ul class="social-links"><li><a target="_blank" rel="noopener" href="https://github.com/nimh-dsst/OpenCogData"><i class="fa fa-github"></i></a></li></ul></div></div><div class="footer"><div class="p"> <span> MIT License </span><i class="fa fa-star"></i><span> DSST/NIMH</span></div><div class="by_farbox"><span>Powered by </span><a href="https://hexo.io/zh-cn/" target="_blank">Hexo </a><span> & </span><span>Anatolo </span></div><div class="beian"></div></div></div><div class="main"><div class="autopagerize_page_element"><div class="content"><div class="post-page"><div class="post animated fadeInDown"><div class="post-title"><h3><a>Abeles et al. (2023)</a></h3></div><div class="post-subtitle"><h4>Initial motor skill performance predicts future performance, but not learning</h4></div><div class="post-links"><a target="_blank" rel="noopener" href="https://doi.org/10.1038/s41598-023-38231-5">[Paper] </a><a target="_blank" rel="noopener" href="https://osf.io/rxmpz"> [Data]</a></div><div class="post-content"><p><p>People show vast variability in skill performance and learning. What determines a person’s individual performance and learning ability? In this study we explored the possibility to predict participants’ future performance and learning, based on their behavior during initial skill acquisition. We recruited a large online multi-session sample of participants performing a sequential tapping skill learning task. We used machine learning to predict future performance and learning from raw data acquired during initial skill acquisition, and from engineered features calculated from the raw data. Strong correlations were observed between initial and final performance, and individual learning was not predicted. While canonical experimental tasks developed and selected to detect average effects may constrain insights regarding individual variability, development of novel tasks may shed light on the underlying mechanism of individual skill learning, relevant for real-life scenarios.</p>
+gtag('config', 'G-PTJE4Z001J');</script><meta name="generator" content="Hexo 6.3.0"></head><body><div class="page-top animated fadeInDown"><div class="nav"><li><a href="/OpenCogData">Home</a></li><li><a href="/OpenCogData/archives">Archives</a></li><li><a href="/OpenCogData/tags">Tags</a></li><li><a href="/OpenCogData/about">About</a></li><li><a href="/OpenCogData/contribute">Contribute</a></li></div><div class="information"><div class="nav_right_btn"><li><a class="fa fa-chevron-left" onclick="window.history.go(-1)"></a></li><li><a class="fa fa-search" onclick="openWindow();"></a></li></div><div class="avatar"><img src="/OpenCogData/images/logo.webp" alt="favicon"></div></div></div><div class="sidebar animated fadeInDown"><div class="sidebar-top"><div class="logo-title"><div class="title"><img src="/OpenCogData/images/logo.webp" style="width:175px;" alt="favicon"><h3 title=""><a href="/OpenCogData">OpenCogData</a></h3><div class="description"><p>A collection of publicly available<br>cognitve task datasets</p></div></div><ul class="social-links"><li><a target="_blank" rel="noopener" href="https://github.com/nimh-dsst/OpenCogData"><i class="fa fa-github"></i></a></li></ul></div></div><div class="footer"><div class="p"> <span> MIT License </span><i class="fa fa-star"></i><span> DSST/NIMH</span></div><div class="by_farbox"><span>Powered by </span><a href="https://hexo.io/zh-cn/" target="_blank">Hexo </a><span> & </span><span>Anatolo </span></div><div class="beian"></div></div></div><div class="main"><div class="autopagerize_page_element"><div class="content"><div class="post-page"><div class="post animated fadeInDown"><div class="post-title"><h3><a>Abeles et al. (2023)</a></h3></div><div class="post-subtitle"><h4>Initial motor skill performance predicts future performance, but not learning</h4></div><div class="post-links"><a target="_blank" rel="noopener" href="https://doi.org/10.1038/s41598-023-38231-5">[Paper] </a><a target="_blank" rel="noopener" href="https://osf.io/rxmpz"> [Data]</a></div><div class="post-content"><p><p>People show vast variability in skill performance and learning. What determines a person’s individual performance and learning ability? In this study we explored the possibility to predict participants’ future performance and learning, based on their behavior during initial skill acquisition. We recruited a large online multi-session sample of participants performing a sequential tapping skill learning task. We used machine learning to predict future performance and learning from raw data acquired during initial skill acquisition, and from engineered features calculated from the raw data. Strong correlations were observed between initial and final performance, and individual learning was not predicted. While canonical experimental tasks developed and selected to detect average effects may constrain insights regarding individual variability, development of novel tasks may shed light on the underlying mechanism of individual skill learning, relevant for real-life scenarios.</p>
 </p></div><div class="post-footer"><div class="meta"><div class="info"><i class="fa fa-tag"></i></div></div></div></div><div class="pagination"><ul class="clearfix"><li class="pre pagbuttons"><a class="btn" role="navigation" href="/OpenCogData/zika-et-al-2023/" title="Zika et al. (2023)">Previous</a></li><li class="next pagbuttons"><a class="btn" role="navigation" href="/OpenCogData/laplante-et-al-2023/" title="Laplante et al. (2023)">Next</a></li></ul></div><script src="/OpenCogData/js/visitors.js"></script></div></div></div></div><script src="/OpenCogData/js/jquery-migrate-1.2.1.min.js"></script><script src="/OpenCogData/js/jquery.appear.js"></script><script src="/OpenCogData/js/add-bookmark.js"></script><script>(function(window){var INSIGHT_CONFIG={TRANSLATION:{POSTS:"Posts",PAGES:"Pages",CATEGORIES:"Categories",TAGS:"Tags",UNTITLED:"(Untitled)",},CONTENT_URL:"/OpenCogData/content.json",};window.INSIGHT_CONFIG=INSIGHT_CONFIG})(window);</script><script src="/OpenCogData/js/insight.js" defer></script><div class="searchbox ins-search"><div class="searchbox-container ins-search-container"><div class="searchbox-input-wrapper"><input class="searchbox-input ins-search-input" type="text" placeholder="Search..."><span class="searchbox-close"><a class="fa fa-times-circle" onclick="closeWindow();"></a></span></div><div class="searchbox-result-wrapper ins-section-wrapper"><div class="ins-section-container"><p>Seraching...</p></div></div></div></div></body></html>
@@ -1,5 +1,5 @@
 <!DOCTYPE html><html lang="zh-CN"><head><meta charset="utf-8"><meta name="X-UA-Compatible" content="IE=edge"><meta name="author" content="DSST/NIMH"><title>Abir et al. (2023) · OpenCogData</title><meta name="description" content="The purpose of exploration is to reduce goal-relevant uncertainty. This can be achieved by choosing to explore the parts of the environment one is mos"><meta name="keywords"><meta content="width=device-width, initial-scale=1.0, maximum-scale=1.0, user-scalable=0" name="viewport"><meta content="yes" name="apple-mobile-web-app-capable"><meta content="black" name="apple-mobile-web-app-status-bar-style"><meta content="telephone=no" name="format-detection"><meta name="renderer" content="webkit"><link rel="shortcut icon" type="image/x-icon" href="/images/favicon.webp"><link rel="stylesheet" href="/OpenCogData/css/style.css"><link rel="stylesheet" href="/OpenCogData/css/blog_basic.css"><link rel="stylesheet" href="/OpenCogData/css/font-awesome.min.css"><link rel="stylesheet" href="/OpenCogData/css/insight.css"><link rel="stylesheet" href="/OpenCogData/css/search.css"><link rel="alternate" type="application/atom+xml" title="ATOM 1.0" href="/atom.xml"><script src="/OpenCogData/js/jquery.js"></script><!-- Global site tag (gtag.js) - Google Analytics--><script async src="https://www.googletagmanager.com/gtag/js?id=G-PTJE4Z001J"></script><script>window.dataLayer = window.dataLayer || [];
 function gtag(){dataLayer.push(arguments);}
 gtag('js', new Date());
-gtag('config', 'G-PTJE4Z001J');</script><meta name="generator" content="Hexo 7.1.1"></head><body><div class="page-top animated fadeInDown"><div class="nav"><li><a href="/OpenCogData">Home</a></li><li><a href="/OpenCogData/archives">Archives</a></li><li><a href="/OpenCogData/tags">Tags</a></li><li><a href="/OpenCogData/about">About</a></li><li><a href="/OpenCogData/contribute">Contribute</a></li></div><div class="information"><div class="nav_right_btn"><li><a class="fa fa-chevron-left" onclick="window.history.go(-1)"></a></li><li><a class="fa fa-search" onclick="openWindow();"></a></li></div><div class="avatar"><img src="/OpenCogData/images/logo.webp" alt="favicon"></div></div></div><div class="sidebar animated fadeInDown"><div class="sidebar-top"><div class="logo-title"><div class="title"><img src="/OpenCogData/images/logo.webp" style="width:175px;" alt="favicon"><h3 title=""><a href="/OpenCogData">OpenCogData</a></h3><div class="description"><p>A collection of publicly available<br>cognitve task datasets</p></div></div><ul class="social-links"><li><a target="_blank" rel="noopener" href="https://github.com/nimh-dsst/OpenCogData"><i class="fa fa-github"></i></a></li></ul></div></div><div class="footer"><div class="p"> <span> MIT License </span><i class="fa fa-star"></i><span> DSST/NIMH</span></div><div class="by_farbox"><span>Powered by </span><a href="https://hexo.io/zh-cn/" target="_blank">Hexo </a><span> & </span><span>Anatolo </span></div><div class="beian"></div></div></div><div class="main"><div class="autopagerize_page_element"><div class="content"><div class="post-page"><div class="post animated fadeInDown"><div class="post-title"><h3><a>Abir et al. (2023)</a></h3></div><div class="post-subtitle"><h4>Human Exploration Strategically Balances Approaching and Avoiding Uncertainty</h4></div><div class="post-links"><a target="_blank" rel="noopener" href="https://doi.org/10.31234/osf.io/gtxam">[Paper] </a><a target="_blank" rel="noopener" href="https://osf.io/6zyev/"> [Data]</a></div><div class="post-content"><p><p>The purpose of exploration is to reduce goal-relevant uncertainty. This can be achieved by choosing to explore the parts of the environment one is most uncertain about. Humans, however, often choose to avoid uncertainty. How do humans balance approaching and avoiding uncertainty during exploration? To answer this question, we developed a task requiring participants to explore a simulated environment towards a clear goal. We compared human choices to the predictions of the optimal exploration policy and a hierarchy of simpler strategies. We found that participants generally explored the object they were more uncertain about. However, when overall uncertainty about choice options was high, participants avoided objects they were more uncertain about, learning instead about better known objects. We examined reaction times and individual differences to understand the costs and benefits of this strategy. We conclude that balancing approaching and avoiding uncertainty ameliorates the costs of exploration in a resource-rational manner.</p>
+gtag('config', 'G-PTJE4Z001J');</script><meta name="generator" content="Hexo 6.3.0"></head><body><div class="page-top animated fadeInDown"><div class="nav"><li><a href="/OpenCogData">Home</a></li><li><a href="/OpenCogData/archives">Archives</a></li><li><a href="/OpenCogData/tags">Tags</a></li><li><a href="/OpenCogData/about">About</a></li><li><a href="/OpenCogData/contribute">Contribute</a></li></div><div class="information"><div class="nav_right_btn"><li><a class="fa fa-chevron-left" onclick="window.history.go(-1)"></a></li><li><a class="fa fa-search" onclick="openWindow();"></a></li></div><div class="avatar"><img src="/OpenCogData/images/logo.webp" alt="favicon"></div></div></div><div class="sidebar animated fadeInDown"><div class="sidebar-top"><div class="logo-title"><div class="title"><img src="/OpenCogData/images/logo.webp" style="width:175px;" alt="favicon"><h3 title=""><a href="/OpenCogData">OpenCogData</a></h3><div class="description"><p>A collection of publicly available<br>cognitve task datasets</p></div></div><ul class="social-links"><li><a target="_blank" rel="noopener" href="https://github.com/nimh-dsst/OpenCogData"><i class="fa fa-github"></i></a></li></ul></div></div><div class="footer"><div class="p"> <span> MIT License </span><i class="fa fa-star"></i><span> DSST/NIMH</span></div><div class="by_farbox"><span>Powered by </span><a href="https://hexo.io/zh-cn/" target="_blank">Hexo </a><span> & </span><span>Anatolo </span></div><div class="beian"></div></div></div><div class="main"><div class="autopagerize_page_element"><div class="content"><div class="post-page"><div class="post animated fadeInDown"><div class="post-title"><h3><a>Abir et al. (2023)</a></h3></div><div class="post-subtitle"><h4>Human Exploration Strategically Balances Approaching and Avoiding Uncertainty</h4></div><div class="post-links"><a target="_blank" rel="noopener" href="https://doi.org/10.31234/osf.io/gtxam">[Paper] </a><a target="_blank" rel="noopener" href="https://osf.io/6zyev/"> [Data]</a></div><div class="post-content"><p><p>The purpose of exploration is to reduce goal-relevant uncertainty. This can be achieved by choosing to explore the parts of the environment one is most uncertain about. Humans, however, often choose to avoid uncertainty. How do humans balance approaching and avoiding uncertainty during exploration? To answer this question, we developed a task requiring participants to explore a simulated environment towards a clear goal. We compared human choices to the predictions of the optimal exploration policy and a hierarchy of simpler strategies. We found that participants generally explored the object they were more uncertain about. However, when overall uncertainty about choice options was high, participants avoided objects they were more uncertain about, learning instead about better known objects. We examined reaction times and individual differences to understand the costs and benefits of this strategy. We conclude that balancing approaching and avoiding uncertainty ameliorates the costs of exploration in a resource-rational manner.</p>
 </p></div><div class="post-footer"><div class="meta"><div class="info"><i class="fa fa-tag"></i><a class="tag" href="/OpenCogData/tags/explore-exploit/" title="explore/exploit">explore/exploit </a><a class="tag" href="/OpenCogData/tags/2-arm-bandit/" title="2-arm bandit">2-arm bandit </a></div></div></div></div><div class="pagination"><ul class="clearfix"><li class="pre pagbuttons"><a class="btn" role="navigation" href="/OpenCogData/filmer-et-al-2023/" title="Filmer et al. (2023)">Previous</a></li><li class="next pagbuttons"><a class="btn" role="navigation" href="/OpenCogData/oguchi-et-al-2023/" title="Oguchi et al. (2023)">Next</a></li></ul></div><script src="/OpenCogData/js/visitors.js"></script></div></div></div></div><script src="/OpenCogData/js/jquery-migrate-1.2.1.min.js"></script><script src="/OpenCogData/js/jquery.appear.js"></script><script src="/OpenCogData/js/add-bookmark.js"></script><script>(function(window){var INSIGHT_CONFIG={TRANSLATION:{POSTS:"Posts",PAGES:"Pages",CATEGORIES:"Categories",TAGS:"Tags",UNTITLED:"(Untitled)",},CONTENT_URL:"/OpenCogData/content.json",};window.INSIGHT_CONFIG=INSIGHT_CONFIG})(window);</script><script src="/OpenCogData/js/insight.js" defer></script><div class="searchbox ins-search"><div class="searchbox-container ins-search-container"><div class="searchbox-input-wrapper"><input class="searchbox-input ins-search-input" type="text" placeholder="Search..."><span class="searchbox-close"><a class="fa fa-times-circle" onclick="closeWindow();"></a></span></div><div class="searchbox-result-wrapper ins-section-wrapper"><div class="ins-section-container"><p>Seraching...</p></div></div></div></div></body></html>
(Diff truncated: the remaining changed files are not shown.)
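
Both visible hunks follow the same pattern, presumably repeated across the rest of the 1,017 regenerated pages: the only changed line in each file is the minified body carrying the <meta name="generator"> tag, where the recorded Hexo version moves from 7.1.1 to 6.3.0. A minimal sketch of one way to verify that a given page is otherwise identical between the two builds; the directory names build-hexo-7.1.1/ and build-hexo-6.3.0/ are hypothetical stand-ins for local copies of the two generated outputs, not paths in this repository:

import re
from pathlib import Path

# Matches the version-stamped generator tag that Hexo writes into every page.
GENERATOR = re.compile(r'<meta name="generator" content="Hexo [0-9.]+">')

def normalized(path: Path) -> str:
    # Read one generated page and mask the Hexo version in its generator tag.
    html = path.read_text(encoding="utf-8")
    return GENERATOR.sub('<meta name="generator" content="Hexo X">', html)

# Hypothetical paths: the same page as emitted by the two builds.
old_page = Path("build-hexo-7.1.1/abeles-et-al-2023/index.html")
new_page = Path("build-hexo-6.3.0/abeles-et-al-2023/index.html")

# Prints True only if the generator version is the sole difference.
print(normalized(old_page) == normalized(new_page))

If this prints False for some page, the rebuild changed real content as well; the stats above (1,020 additions against 1,015 deletions) already suggest a handful of files changed by more than the version stamp.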