Architecture
- pyFF is a SAML metadata processor.
- pyFF can be used to fetch, validate, verify, sign, transform, store, index and search SAML metadata.
- pyFF does its work by executing small programs called pipelines, written in YAML syntax and consisting of steps/primitives called pipes
Deploying pyFF typically means figuring out what SAML metadata to fetch and from where, how to transform it to suit your metadata consumers' expectations and needs, and finally how to publish the result. All of these steps are encoded in a YAML file called a pipeline, which pyFF executes like a program.
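As a rough illustration, a fetch-transform-publish pipeline could look like the sketch below. The URL, key, certificate, and output path are placeholders, not part of any pyFF distribution:

```yaml
# Fetch remote metadata into the store, select everything,
# sign the result and publish it to a file.
- load:
  - https://mds.example.org/upstream-metadata.xml
- select
- sign:
    key: sign.key
    cert: sign.crt
- publish: /var/www/html/metadata/federation-signed.xml
- stats
```

Each list item is a pipe; pyFF runs them top to bottom, passing the resulting object from one pipe to the next.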
The following diagram illustrates the relationships between the elements that make up the pyFF execution model.
- a pipeline transforms an initial state to an object
- the pyFF metadata store is updated using the load pipe and queried using the select step.
Most pyFF pipelines contain at least one select statement and one load statement. The latter is used to fetch metadata (either local or remote) into the store, and the former is used to populate the active tree, which subsequent pipes modify to form the resulting object. The object resulting from a pipeline execution (and indeed from each step in the pipeline) is often called the active tree because it is usually a SAML metadata DOM tree. However, since some pipes transform the DOM into another type of object (e.g. a JSON representation), the term active tree is discouraged.
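A minimal sketch of that load/select interaction is shown below; the file path and the select filter expression are hypothetical examples, assuming the documented "source!xpath" select syntax:

```yaml
# Load a local metadata file into the store, then select only
# the IdP entities from it to become the active tree.
- load:
  - /etc/pyff/upstream-metadata.xml
- select: "/etc/pyff/upstream-metadata.xml!//md:EntityDescriptor[md:IDPSSODescriptor]"
- publish: /var/www/html/metadata/idps-only.xml
```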