
update invited talks schedule
GavinKerrigan authored Apr 22, 2024
1 parent 654e76e commit 0a95b7c
Showing 1 changed file with 6 additions and 7 deletions.
schedule.md — 13 changes: 6 additions & 7 deletions
@@ -41,12 +41,12 @@ Opening remarks
 09:00-10:00
 </td>
 <td>
-<a href="{{ "/invited.html" | relative_url }}">Keynote Talk: Aaditya Ramdas (CMU)</a>
+<a href="{{ "/invited.html" | relative_url }}">Keynote Talk: Matt Hoffman (DeepMind)</a>
 <details>
 <summary>
-Conformal Online Model Aggregation
+TBD
 </summary>
-Conformal prediction equips machine learning models with a reasonable notion of uncertainty quantification without making strong distributional assumptions. It wraps around any black-box prediction model and converts point predictions into set predictions that have a predefined marginal coverage guarantee. However, conformal prediction only works if we fix the underlying machine learning model in advance. A relatively unaddressed issue in conformal prediction is that of model selection and/or aggregation: for a given problem, which of the plethora of prediction methods (random forests, neural nets, regularized linear models, etc.) should we conformalize? This talk presents a new approach towards conformal model aggregation in online settings that is based on combining the prediction sets from several algorithms by voting, where weights on the models are adapted over time based on past performance.
+TBD
 </details>
 </td>
 </tr>
@@ -190,14 +190,13 @@ Mentoring Event | TBD
 09:00-10:00
 </td>
 <td>
-<a href="{{ "/invited.html" | relative_url }}">Keynote Talk: Matt Hoffman (DeepMind)</a>
+<a href="{{ "/invited.html" | relative_url }}">Keynote Talk: Aaditya Ramdas (CMU)</a>
 <details>
 <summary>
-TBD
+Conformal Online Model Aggregation
 </summary>
-TBD
+Conformal prediction equips machine learning models with a reasonable notion of uncertainty quantification without making strong distributional assumptions. It wraps around any black-box prediction model and converts point predictions into set predictions that have a predefined marginal coverage guarantee. However, conformal prediction only works if we fix the underlying machine learning model in advance. A relatively unaddressed issue in conformal prediction is that of model selection and/or aggregation: for a given problem, which of the plethora of prediction methods (random forests, neural nets, regularized linear models, etc.) should we conformalize? This talk presents a new approach towards conformal model aggregation in online settings that is based on combining the prediction sets from several algorithms by voting, where weights on the models are adapted over time based on past performance.
 </details>
-
 </td>
 </tr>

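The abstract added in the second hunk describes an algorithmic idea: wrap each candidate model with conformal prediction, combine the resulting prediction sets by weighted voting, and adapt the model weights online from past performance. The Python sketch below is only a rough illustration of that description, not the method presented in the talk; the split-conformal interval construction, the coverage-threshold voting rule, and the exponential-weights update are all assumptions made for this example.

import numpy as np

def split_conformal_interval(predict, X_cal, y_cal, x_new, alpha=0.1):
    # Wrap a black-box point predictor (any callable X -> predictions) with
    # split conformal prediction: the returned interval has ~(1 - alpha)
    # marginal coverage under exchangeability.
    residuals = np.abs(y_cal - predict(X_cal))
    n = len(residuals)
    level = min(1.0, np.ceil((n + 1) * (1 - alpha)) / n)
    q = np.quantile(residuals, level)
    center = predict(x_new[None, :])[0]
    return center - q, center + q

def vote_on_intervals(intervals, weights, grid, threshold=0.5):
    # Aggregate several prediction intervals by weighted voting: keep every
    # grid point covered by models carrying at least `threshold` of the
    # total weight.
    weights = np.asarray(weights, dtype=float)
    votes = sum(w * ((grid >= lo) & (grid <= hi))
                for (lo, hi), w in zip(intervals, weights))
    covered = grid[votes >= threshold * weights.sum()]
    return (covered.min(), covered.max()) if covered.size else None

def update_weights(weights, intervals, y_obs, eta=0.5):
    # Multiplicative-weights update based on past performance: models whose
    # interval missed the observed label lose weight (an illustrative rule,
    # not necessarily the one used in the talk).
    missed = np.array([0.0 if lo <= y_obs <= hi else 1.0 for lo, hi in intervals])
    w = np.asarray(weights, dtype=float) * np.exp(-eta * missed)
    return w / w.sum()

In an online loop, one would form an interval from each conformalized model, vote to obtain the aggregated set, observe the true label, and update the weights before the next round.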
