A reduced functional that updates a control over space and time in one nonlinear solve. #3592

Draft: wants to merge 38 commits into base branch master.

Commits (38)
f1ca5d4
initial sketch functional
dham May 22, 2024
52bf904
aaorf wip
JHopeCollins Jul 24, 2024
fa7e75d
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Aug 15, 2024
7f0e42e
initial impl for strong constraint 4DVar reduced functional
JHopeCollins Aug 16, 2024
7315a4e
aaorf: separate error calculations and inner products for observation…
JHopeCollins Aug 17, 2024
9dfd674
aaorf: decorator for strong constraint method implementations
JHopeCollins Aug 17, 2024
80fcbe4
aaorf: make strong_reduced_functional a cached_property
JHopeCollins Aug 20, 2024
bc8a868
aaorf: optimize_tape method
JHopeCollins Aug 20, 2024
9a388e8
aaorf docstring
JHopeCollins Aug 20, 2024
29c5f8c
Update firedrake/adjoint/all_at_once_reduced_functional.py
JHopeCollins Aug 20, 2024
8eadf31
aaorf docstrings
JHopeCollins Aug 20, 2024
551ad2c
aaorf - weak constraint __call__ impl
JHopeCollins Aug 21, 2024
9345a82
aaorf derivative initial impl
JHopeCollins Aug 23, 2024
78aff18
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Aug 23, 2024
bbfb40d
fixed mark controls in aaorf
JHopeCollins Sep 19, 2024
30d8389
aaorf - use inbuilt cached_property
JHopeCollins Sep 19, 2024
f00ab63
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Sep 30, 2024
23f9d11
aaorf - use _accumulate_functional for strong constraint, weak constr…
JHopeCollins Oct 10, 2024
eee3cb2
aaorf - hessian by post-processing tapes of all sub-ReducedFunctionals
JHopeCollins Oct 10, 2024
b85fcc6
aaorf - forward __getattr__ to strong constraint ReducedFunctional
JHopeCollins Oct 10, 2024
d6ccf94
aaorf - each sub-ReducedFunctional has its own tape
JHopeCollins Oct 10, 2024
43da7cf
aaorf - separate tapes for error vectors and error reductions
JHopeCollins Oct 10, 2024
b7cee1d
Merge branch 'allatoncereducedfunctional' of https://github.com/fired…
JHopeCollins Oct 10, 2024
c10cd1a
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Oct 10, 2024
3f42d03
aaorf - type hinting in signature not docstring
JHopeCollins Oct 11, 2024
393ebbe
aaorf - split observation error tape - vector and reduction
JHopeCollins Oct 22, 2024
086877a
aaorf - split tapes: background to error vector and error reduction; …
JHopeCollins Oct 22, 2024
b7fecb2
aaorf - tidy up after splitting tapes: remove old code; pass riesz op…
JHopeCollins Oct 22, 2024
e48aba7
aaorf - docstrings
JHopeCollins Oct 22, 2024
b1020d2
Update firedrake/adjoint/all_at_once_reduced_functional.py
JHopeCollins Oct 24, 2024
afa377c
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Nov 4, 2024
948c818
aaorf - swap firedrake specific functions for pyadjoint ones
JHopeCollins Nov 4, 2024
89a10dc
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Nov 20, 2024
119de1a
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Nov 29, 2024
886f831
Merge remote-tracking branch 'origin' into allatoncereducedfunctional
JHopeCollins Nov 29, 2024
b490712
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Dec 17, 2024
4701f55
Merge branch 'master' into allatoncereducedfunctional
JHopeCollins Jan 2, 2025
dfe98bf
`AllAtOnceReducedFunctional` - initial time-parallel implementation (…
JHopeCollins Jan 7, 2025
1 change: 1 addition & 0 deletions .github/workflows/build.yml
@@ -84,6 +84,7 @@ jobs:
--install defcon \
--install gadopt \
--install asQ \
--package-branch pyadjoint JHopeCollins/mark_evaluate_tlm \
|| (cat firedrake-install.log && /bin/false)
- name: Install test dependencies
run: |
1 change: 1 addition & 0 deletions firedrake/__init__.py
@@ -103,6 +103,7 @@
from firedrake.vector import *
from firedrake.version import __version__ as ver, __version_info__, check # noqa: F401
from firedrake.ensemble import *
from firedrake.ensemblefunction import *
from firedrake.randomfunctiongen import *
from firedrake.external_operators import *
from firedrake.progress_bar import ProgressBar # noqa: F401
1 change: 1 addition & 0 deletions firedrake/adjoint/__init__.py
@@ -38,6 +38,7 @@
from firedrake.adjoint.ufl_constraints import UFLInequalityConstraint, \
UFLEqualityConstraint # noqa F401
from firedrake.adjoint.ensemble_reduced_functional import EnsembleReducedFunctional # noqa F401
from firedrake.adjoint.fourdvar_reduced_functional import FourDVarReducedFunctional # noqa F401
import numpy_adjoint # noqa F401
import firedrake.ufl_expr
import types
332 changes: 332 additions & 0 deletions firedrake/adjoint/composite_reduced_functional.py
@@ -0,0 +1,332 @@
from pyadjoint import stop_annotating, get_working_tape, OverloadedType, Control, Tape, ReducedFunctional
from pyadjoint.enlisting import Enlist
from typing import Optional


def intermediate_options(options: dict):
"""
Options set for the intermediate stages of a chain of ReducedFunctionals

Takes all elements of the options except riesz_representation, which
is set to None to prevent returning derivatives to the primal space.

Parameters
----------
options
The dictionary of options provided by the user

Returns
-------
dict
The options for ReducedFunctionals at intermediate stages

"""
return {
**{k: v for k, v in (options or {}).items()
if k != 'riesz_representation'},
'riesz_representation': None
}
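# Illustrative sketch, not part of the file above: intermediate_options
# passes every user option through unchanged while suppressing the Riesz
# map, e.g.
#
#   intermediate_options({'riesz_representation': 'L2'})
#   # returns {'riesz_representation': None}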


def compute_tlm(J: OverloadedType,
m: Control,
m_dot: OverloadedType,
options: Optional[dict] = None,
tape: Optional[Tape] = None):
"""
Compute the tangent linear model of J in a direction m_dot at the current value of m

Parameters
----------

J
The objective functional.
m
The (list of) :class:`pyadjoint.Control` for the functional.
m_dot
The direction in which to compute the tangent linear model.
Must be a (list of) :class:`pyadjoint.OverloadedType`.
options
A dictionary of options. To find a list of available options
have a look at the specific control type.
tape
The tape to use. Default is the current tape.

Returns
-------
pyadjoint.OverloadedType
The tangent linear model with respect to the control in the direction m_dot.
Should be an instance of the same type as the control.

"""
tape = tape or get_working_tape()

# reset tlm values
tape.reset_tlm_values()

m = Enlist(m)
m_dot = Enlist(m_dot)

# set initial tlm values
for mi, mdi in zip(m, m_dot):
mi.tlm_value = mdi

# evaluate tlm
with stop_annotating():
with tape.marked_nodes(m):
tape.evaluate_tlm(markings=True)

# return functional's tlm
return J._ad_convert_type(J.block_variable.tlm_value,
options=options or {})


def compute_hessian(J: OverloadedType,
m: Control,
options: Optional[dict] = None,
tape: Optional[Tape] = None,
hessian_value: Optional[OverloadedType] = 0.):
"""
Compute the Hessian of J at the current value of m with the current tlm values on the tape.

Parameters
----------
J
The objective functional.
m
The (list of) :class:`pyadjoint.Control` for the functional.
options
A dictionary of options. To find a list of available options
have a look at the specific control type.
tape
The tape to use. Default is the current tape.
hessian_value
The initial hessian_value to start accumulating from.

Returns
-------
pyadjoint.OverloadedType
The second derivative with respect to the control in direction m_dot.
Should be an instance of the same type as the control.

"""
tape = tape or get_working_tape()

# reset hessian values
tape.reset_hessian_values()

m = Enlist(m)

# set initial hessian_value
J.block_variable.hessian_value = J._ad_convert_type(
hessian_value, options=intermediate_options(options))

# evaluate hessian
with stop_annotating():
with tape.marked_nodes(m):
tape.evaluate_hessian(markings=True)

# return controls' hessian values
return m.delist([v.get_hessian(options=options or {}) for v in m])
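# Illustrative sketch, not part of the file above: compute_hessian relies on
# the tlm values left on the tape by a preceding compute_tlm call. Assuming
# `Jhat` is a pyadjoint.ReducedFunctional that has already been evaluated and
# `m_dot` is a perturbation of its control, the second-order adjoint action of
# the Hessian on m_dot would be obtained as:
#
#   compute_tlm(Jhat.functional, Jhat.controls, m_dot, tape=Jhat.tape)
#   hess_action = compute_hessian(Jhat.functional, Jhat.controls, tape=Jhat.tape)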


def tlm(rf: ReducedFunctional,
m_dot: OverloadedType,
options: Optional[dict] = None):
"""Returns the action of the tangent linear model of the functional w.r.t. the control on a vector m_dot.

Parameters
----------
rf
The :class:`pyadjoint.ReducedFunctional` to evaluate the tlm of.
m_dot
The direction in which to compute the action of the tangent linear model.
options
A dictionary of options. To find a list of available options
have a look at the specific control type.

Returns
-------
pyadjoint.OverloadedType
The action of the tangent linear model in the direction m_dot.
Should be an instance of the same type as the control.

"""
return compute_tlm(rf.functional, rf.controls, m_dot,
tape=rf.tape, options=options)


def hessian(rf: ReducedFunctional,
options: Optional[dict] = None,
hessian_value: Optional[OverloadedType] = 0.):
"""Returns the action of the Hessian of the functional w.r.t. the control.

Using the second-order adjoint method, the action of the Hessian of the
functional with respect to the control, around the last supplied value
of the control and the last tlm values, is computed and returned.

Parameters
----------
rf
The :class:`pyadjoint.ReducedFunctional` to evaluate the Hessian of.
options
A dictionary of options. To find a list of available options
have a look at the specific control type.
hessian_value
The initial hessian_value to start accumulating from.

Returns
-------
pyadjoint.OverloadedType
The action of the Hessian. Should be an instance of the same type as the control.

"""
return rf.controls.delist(
compute_hessian(rf.functional, rf.controls,
tape=rf.tape, options=options,
hessian_value=hessian_value))


class CompositeReducedFunctional:
"""Class representing the composition of two reduced functionals.

For two reduced functionals J1: X->Y and J2: Y->Z, this is a convenience
class representing the composition J12: X->Z = J2(J1(x)) and providing
methods for the evaluation, derivative, tlm, and hessian action of J12.

Parameters
----------
rf1
The first :class:`pyadjoint.ReducedFunctional` in the composition.
rf2
The second :class:`pyadjoint.ReducedFunctional` in the composition.
The control for rf2 must have the same type as the functional of rf1.

"""
def __init__(self, rf1, rf2):
self.rf1 = rf1
self.rf2 = rf2

def __call__(self, values: OverloadedType):
"""Computes the reduced functional with supplied control value.

Parameters
----------

values
If you have multiple controls this should be a list of new values
for each control in the order you listed the controls to the constructor.
If you have a single control it can either be a list or a single object.
Each new value should have the same type as the corresponding control.

Returns
-------
pyadjoint.OverloadedType
The computed value. Typically an instance of :class:`pyadjoint.AdjFloat`.

"""
return self.rf2(self.rf1(values))

def derivative(self, adj_input: Optional[float] = 1.0, options: Optional[dict] = None):
"""Returns the derivative of the functional w.r.t. the control.
Using the adjoint method, the derivative of the functional with
respect to the control, around the last supplied value of the
control, is computed and returned.

Parameters
----------
adj_input
The adjoint input.

options
Additional options for the derivative computation.

Returns
-------
pyadjoint.OverloadedType
The derivative with respect to the control.
Should be an instance of the same type as the control.

"""
deriv2 = self.rf2.derivative(
adj_input=adj_input, options=intermediate_options(options))
deriv1 = self.rf1.derivative(
adj_input=deriv2, options=options or {})
return deriv1

def tlm(self, m_dot: OverloadedType, options: Optional[dict] = None):
"""Returns the action of the tangent linear model of the functional w.r.t. the control on a vector m_dot.

Parameters
----------

m_dot
The direction in which to compute the action of the tangent linear model.

options
A dictionary of options. To find a list of available options
have a look at the specific control type.

Returns
-------
pyadjoint.OverloadedType
The action of the tangent linear model in the direction m_dot.
Should be an instance of the same type as the control.

"""
tlm1 = self._eval_tlm(
self.rf1, m_dot, intermediate_options(options))
tlm2 = self._eval_tlm(
self.rf2, tlm1, options)
return tlm2

def hessian(self, m_dot: OverloadedType,
options: Optional[dict] = None,
evaluate_tlm: Optional[bool] = True):
"""Returns the action of the Hessian of the functional w.r.t. the control on a vector m_dot.

Using the second-order adjoint method, the action of the Hessian of the
functional with respect to the control, around the last supplied value
of the control, is computed and returned.

Parameters
----------

m_dot
The direction in which to compute the action of the Hessian.

options
A dictionary of options. To find a list of available options
have a look at the specific control type.

evaluate_tlm
If True, the tlm values on the tape will be reset and evaluated before
the Hessian action is evaluated. If False, the existing tlm values on
the tape will be used.

Returns
-------
pyadjoint.OverloadedType
The action of the Hessian in the direction m_dot.
Should be an instance of the same type as the control.

"""
if evaluate_tlm:
self.tlm(m_dot, options=intermediate_options(options))
hess2 = self._eval_hessian(
self.rf2, 0., intermediate_options(options))
hess1 = self._eval_hessian(
self.rf1, hess2, options or {})
return hess1

def _eval_tlm(self, rf, m_dot, options):
if isinstance(rf, CompositeReducedFunctional):
return rf.tlm(m_dot, options=options)
else:
return tlm(rf, m_dot=m_dot, options=options)

def _eval_hessian(self, rf, hessian_value, options):
if isinstance(rf, CompositeReducedFunctional):
return rf.hessian(None, options, evaluate_tlm=False)
else:
return hessian(rf, hessian_value=hessian_value, options=options)
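
A minimal usage sketch of the new class. The names Jhat1, Jhat2, x0 and m_dot are placeholders, assuming two pyadjoint ReducedFunctionals taped on separate tapes, with the control of Jhat2 of the same type as the functional of Jhat1:

from firedrake.adjoint.composite_reduced_functional import CompositeReducedFunctional

# J12(x) = J2(J1(x))
Jhat12 = CompositeReducedFunctional(Jhat1, Jhat2)

J = Jhat12(x0)                        # evaluate the composition at the control value x0
dJ = Jhat12.derivative()              # adjoint of rf2 chained into the adjoint of rf1
dJm = Jhat12.tlm(m_dot)               # tangent linear action in the direction m_dot
Hm = Jhat12.hessian(m_dot)            # second-order adjoint; re-evaluates the tlm first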