Tapir tutorials: basics, docs, json (#3797)
Co-authored-by: Krzysztof Ciesielski <[email protected]>
adamw and kciesielski authored Jun 3, 2024
1 parent 0b8427c commit 2762d7d
Showing 28 changed files with 707 additions and 163 deletions.
2 changes: 1 addition & 1 deletion .readthedocs.yaml
@@ -10,4 +10,4 @@ python:
build:
os: ubuntu-22.04
tools:
python: "3.7"
python: "3.12"
2 changes: 1 addition & 1 deletion doc/.python-version
@@ -1 +1 @@
3.7.2
3.12
19 changes: 5 additions & 14 deletions doc/conf.py
@@ -32,16 +32,15 @@
# ones.
extensions = ['myst_parser', 'sphinx_rtd_theme']

myst_enable_extensions = ['attrs_block']

# Add any paths that contain templates here, relative to this directory.
templates_path = ['_templates']

# The suffix(es) of source filenames.
# You can specify multiple suffix as a list of string:
#
# source_suffix = ['.rst', '.md']
from recommonmark.parser import CommonMarkParser
from recommonmark.transform import AutoStructify

source_suffix = {
'.rst': 'restructuredtext',
'.txt': 'markdown',
@@ -53,7 +53,7 @@

# General information about the project.
project = u'tapir'
copyright = u'2023, SoftwareMill'
copyright = u'2024, SoftwareMill'
author = u'SoftwareMill'

# The version info for the project you're documenting, acts as replacement for
@@ -70,15 +69,15 @@
#
# This is also used if you do content translation via gettext catalogs.
# Usually you set "language" from the command line for these cases.
language = None
language = 'en'

# List of patterns, relative to source directory, that match files and
# directories to ignore when looking for source files.
# This patterns also effect to html_static_path and html_extra_path
exclude_patterns = ['_build', 'Thumbs.db', '.DS_Store']

# The name of the Pygments (syntax highlighting) style to use.
pygments_style = 'sphinx'
pygments_style = 'default'

# If true, `todo` and `todoList` produce output, else they produce nothing.
todo_include_todos = False
@@ -184,11 +183,3 @@
'github_version': 'master', # Version
'conf_py_path': '/doc/', # Path in the checkout to the docs root
}

# app setup hook
def setup(app):
app.add_config_value('recommonmark_config', {
'auto_toc_tree_section': 'Contents',
'enable_auto_doc_ref': False
}, True)
app.add_transform(AutoStructify)
10 changes: 4 additions & 6 deletions doc/docs/openapi.md
@@ -243,12 +243,10 @@ security requirements will be created for them. However, this will not include t
mandatory. If authentication should be optional, an empty security requirement will be added if an `emptyAuth` input
is added (which doesn't map to any values in the request, but only serves as a marker).

```eval_rst
.. note::
Note that even though multiple optional authentication methods might be rendered as alternatives in the documentation,
when running the server, you'll need to additionally check that at least one authentication input is provided. This
can be done in the security logic, server logic, or by mapping the inputs using .mapDecode, as in the below example:
```{note}
Note that even though multiple optional authentication methods might be rendered as alternatives in the documentation,
when running the server, you'll need to additionally check that at least one authentication input is provided. This
can be done in the security logic, server logic, or by mapping the inputs using .mapDecode, as in the below example:
```

```scala mdoc:compile-only
2 changes: 1 addition & 1 deletion doc/endpoint/basics.md
@@ -48,7 +48,7 @@ val userEndpoint: PublicEndpoint[(UUID, Int), String, User, Any] = ???
You can think of an endpoint as a function which takes input parameters of type `A` and `I` and returns a result of type
`Either[E, O]`.
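
For illustration, a minimal sketch of an endpoint with the shape above; the path and query names, the `User` field and the choice of a JSON output are assumptions, not taken from the tutorial:

```scala
import sttp.tapir._
import sttp.tapir.generic.auto._   // tapir Schema derivation for User
import sttp.tapir.json.circe._     // jsonBody
import io.circe.generic.auto._     // circe Encoder/Decoder for User
import java.util.UUID

case class User(name: String)

// input: a UUID path segment plus an Int query parameter; error: a String body; output: a JSON-encoded User,
// i.e. roughly a function (UUID, Int) => Either[String, User]
val userEndpoint: PublicEndpoint[(UUID, Int), String, User, Any] =
  endpoint.get
    .in("users" / path[UUID]("userId"))
    .in(query[Int]("limit"))
    .errorOut(stringBody)
    .out(jsonBody[User])
```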

### Infallible endpoints
## Infallible endpoints

Note that the empty `endpoint` description maps no values to either error and success outputs, however errors
are still represented and allowed to occur. In case of the error output, the single member of the unit type, `(): Unit`,
10 changes: 4 additions & 6 deletions doc/endpoint/customtypes.md
@@ -80,12 +80,10 @@ import sttp.tapir.Codec.PlainCodec
implicit val myIdCodec: PlainCodec[MyId] = Codec.string.mapDecode(decode)(encode)
```

```eval_rst
.. note::
Note that inputs/outputs can also be mapped over. In some cases, it's enough to create an input/output corresponding
to one of the existing types, and then map over them. However, if you have a type that's used multiple times, it's
usually better to define a codec for that type.
```{note}
Note that inputs/outputs can also be mapped over. In some cases, it's enough to create an input/output corresponding
to one of the existing types, and then map over them. However, if you have a type that's used multiple times, it's
usually better to define a codec for that type.
```
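
The `decode`/`encode` helpers referenced by the `mapDecode` call above are not shown in this hunk; a minimal sketch of what they might look like (the shape of `MyId` and the non-empty check are assumptions):

```scala
import sttp.tapir._
import sttp.tapir.Codec.PlainCodec

case class MyId(value: String)

// assumed helpers: accept any non-empty string, reject empty ones
def decode(s: String): DecodeResult[MyId] =
  if (s.nonEmpty) DecodeResult.Value(MyId(s))
  else DecodeResult.Error(s, new IllegalArgumentException("id must not be empty"))
def encode(id: MyId): String = id.value

implicit val myIdCodec: PlainCodec[MyId] = Codec.string.mapDecode(decode)(encode)
```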

Then, you can use the new codec; e.g. to obtain an id from a query parameter, or a path segment:
10 changes: 4 additions & 6 deletions doc/endpoint/enumerations.md
@@ -150,12 +150,10 @@ properly represented in [OpenAPI](../docs/openapi.md) documentation.

You can take a look at a runnable example [here](https://github.com/softwaremill/tapir/tree/master/examples/src/main/scala/sttp/tapir/examples/custom_types).

```eval_rst
.. warning::
``Delimited`` and ``CommaSeparated`` rely on literal types, which are only available in Scala 2.13+.
```{warning}
`Delimited` and `CommaSeparated` rely on literal types, which are only available in Scala 2.13+.
If you're using an older version of Scala, a workaround is creating a comma-separated codec locally.
If you're using an older version of Scala, a workaround is creating a comma-separated codec locally.
```
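
As a sketch of how `CommaSeparated` is typically used, assuming a plain codec for the element type is in scope (as it is for `String`):

```scala
import sttp.tapir._
import sttp.tapir.model.CommaSeparated

// ?tags=a,b,c decodes into a Delimited value; the individual elements are available via .values
val tags: EndpointInput[CommaSeparated[String]] = query[CommaSeparated[String]]("tags")
```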

## Using enumerations as part of bodies
@@ -272,4 +270,4 @@ is that an enumeration [validator](validation.md) has to be added to the schema.

## Next

Read on about [validation](validation.md).
Read on about [validation](validation.md).
10 changes: 4 additions & 6 deletions doc/endpoint/integrations.md
@@ -1,11 +1,9 @@
# Datatypes integrations

```eval_rst
.. note::
Note that the codecs defined by the tapir integrations are used only when the specific types (e.g. enumerations) are
used at the top level. Any nested usages (e.g. as part of a json body), need to be separately configured to work with
the used json library.
```{note}
Note that the codecs defined by the tapir integrations are used only when the specific types (e.g. enumerations) are
used at the top level. Any nested usages (e.g. as part of a json body), need to be separately configured to work with
the used json library.
```

## Cats datatypes integration
12 changes: 5 additions & 7 deletions doc/endpoint/json.md
@@ -11,13 +11,11 @@ an implicit `Schema[T]` instance, which can be automatically derived. For more d
[schema derivation](schemas.md) and on supporting [custom types](customtypes.md) in general. Such a design provides
better error reporting, in case one of the components required to create the json codec is missing.

```eval_rst
.. note::
Note that the process of deriving schemas, and deriving library-specific json encoders and decoders is entirely
separate (unless you're using the pickler module - see below). The first is controlled by tapir, the second - by the
json library. Any customisation, e.g. for field naming or inheritance strategies, must be done separately for both
derivations.
```{note}
Note that the process of deriving schemas, and deriving library-specific json encoders and decoders is entirely
separate (unless you're using the pickler module - see below). The first is controlled by tapir, the second - by the
json library. Any customisation, e.g. for field naming or inheritance strategies, must be done separately for both
derivations.
```
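
A small sketch of the two separate derivations at work, here with circe; the `Book` type is made up for illustration:

```scala
import sttp.tapir._
import sttp.tapir.json.circe._     // jsonBody, backed by circe
import sttp.tapir.generic.auto._   // tapir Schema derivation
import io.circe.generic.auto._     // circe Encoder/Decoder derivation

case class Book(title: String, year: Int)

// jsonBody[Book] needs both an implicit io.circe codec pair and an implicit sttp.tapir.Schema[Book];
// naming or inheritance customisations have to be applied to each derivation separately
val addBook: PublicEndpoint[Book, Unit, Unit, Any] =
  endpoint.post.in("books").in(jsonBody[Book])
```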

## Pickler
30 changes: 12 additions & 18 deletions doc/endpoint/oneof.md
@@ -5,15 +5,13 @@ There are two kind of one-of inputs/outputs:
* `oneOf` outputs where the arbitrary-output variants can represent different content using different outputs, and
* `oneOfBody` input/output where the body-only variants represent the same content, but with different content types

```eval_rst
.. note::
``oneOf`` and ``oneOfBody`` outputs are not related to ``oneOf:`` schemas when
`generating <https://tapir.softwaremill.com/en/latest/docs/openapi.html>`_ OpenAPI documentation.
Such schemas are generated for coproducts - e.g. ``sealed trait`` families - given an appropriate codec. See the
documentation on
`coproducts <https://tapir.softwaremill.com/en/latest/endpoint/schemas.html#sealed-traits-coproducts>`_ for details.
```{note}
`oneOf` and `oneOfBody` outputs are not related to `oneOf:` schemas when
[generating](https://tapir.softwaremill.com/en/latest/docs/openapi.html) OpenAPI documentation.
Such schemas are generated for coproducts - e.g. `sealed trait` families - given an appropriate codec. See the
documentation on [coproducts](https://tapir.softwaremill.com/en/latest/endpoint/schemas.html#sealed-traits-coproducts)
for details.
```
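
For reference, a hedged sketch of a `oneOf` error output with two variants; the `ErrorInfo` hierarchy and the status-code mapping are assumptions:

```scala
import sttp.tapir._
import sttp.tapir.json.circe._
import sttp.tapir.generic.auto._
import io.circe.generic.auto._
import sttp.model.StatusCode

sealed trait ErrorInfo
case class NotFound(what: String)      extends ErrorInfo
case class Unauthorized(realm: String) extends ErrorInfo

// each variant pairs a fixed status code with a body; at runtime the variant is chosen by the class of the value
val errorOutput: EndpointOutput[ErrorInfo] = oneOf[ErrorInfo](
  oneOfVariant(statusCode(StatusCode.NotFound).and(jsonBody[NotFound])),
  oneOfVariant(statusCode(StatusCode.Unauthorized).and(jsonBody[Unauthorized]))
)
```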

## `oneOf` outputs
@@ -222,17 +220,13 @@ on streaming bodies, which "lifts" them to an `EndpointIO` type, forgetting the
type safety, as a run-time error might occur if an incompatible interpreter is used, however allows describing
endpoints which require including streaming bodies in output variants.

```eval_rst
.. note::
If the same streaming body description is used in all branches of a ``oneOf``, this can be refactored into
a regular streaming body output + a varying set of output headers, expressed using ``oneOf``.
```{note}
If the same streaming body description is used in all branches of a `oneOf`, this can be refactored into
a regular streaming body output + a varying set of output headers, expressed using `oneOf`.
```

```eval_rst
.. warning::
Mixed streaming and non-streaming bodies defined as ``oneOf`` variants currently won't work with client interpreters.
```{warning}
Mixed streaming and non-streaming bodies defined as `oneOf` variants currently won't work with client interpreters.
```

## Next
18 changes: 8 additions & 10 deletions doc/endpoint/schemas.md
@@ -149,16 +149,14 @@ implicit val anotherSchemaForMyCustomType: Schema[MyCustomType] = Schema(SchemaT
Schema derivation for coproduct types (sealed hierarchies) is supported as well. By default, such hierarchies
will be represented as a coproduct which contains a list of child schemas, without any discriminators.

```eval_rst
.. note::
Note that whichever approach you choose to define the coproduct schema, it has to match the way the value is
encoded and decoded by the codec. E.g. when the schema is for a json body, the discriminator must be separately
configured in the json library, matching the configuration of the schema.
Alternatively, instead of deriving schemas and json codecs separately, you can use the experimental
`pickler <https://tapir.softwaremill.com/en/latest/endpoint/pickler.html>`_
module, which provides a higher level ``Pickler`` concept, which takes care of consistent derivation.
```{note}
Note that whichever approach you choose to define the coproduct schema, it has to match the way the value is
encoded and decoded by the codec. E.g. when the schema is for a json body, the discriminator must be separately
configured in the json library, matching the configuration of the schema.
Alternatively, instead of deriving schemas and json codecs separately, you can use the experimental
[pickler](https://tapir.softwaremill.com/en/latest/endpoint/pickler.html)
module, which provides a higher level `Pickler` concept, which takes care of consistent derivation.
```
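
A minimal sketch of configuring a discriminator on the tapir side; the `petType` field name is an assumption and must mirror whatever the json library is configured to write and read:

```scala
import sttp.tapir.Schema
import sttp.tapir.generic.Configuration

sealed trait Pet
case class Cat(name: String) extends Pet
case class Dog(name: String) extends Pet

// the discriminator configured here only affects the tapir schema; the json library
// needs its own, matching configuration for Cat/Dog payloads
implicit val schemaConfiguration: Configuration = Configuration.default.withDiscriminator("petType")
implicit val petSchema: Schema[Pet] = Schema.derived[Pet]
```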

### Field discriminators
10 changes: 4 additions & 6 deletions doc/endpoint/security.md
@@ -15,12 +15,10 @@ Inputs which map to authentication credentials can be created using methods avai
inputs in addition to the base input (such as an `Authorization` header or a cookie), contain security-related metadata,
for example the name of the security scheme that should be used for documentation.

```eval_rst
.. note::
Note that security inputs added using ``.securityIn`` can contain both dedicated auth credentials inputs created
using one of the methods from ``auth``, and arbitrary "regular" inputs, such as path components. Similarly, regular
inputs can contain inputs created through ``auth``, though typically this shouldn't be the case.
```{note}
Note that security inputs added using `.securityIn` can contain both dedicated auth credentials inputs created
using one of the methods from `auth`, and arbitrary "regular" inputs, such as path components. Similarly, regular
inputs can contain inputs created through `auth`, though typically this shouldn't be the case.
```
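
A short sketch of mixing the two kinds of security inputs; the path prefix and the bearer scheme are arbitrary choices for illustration:

```scala
import sttp.tapir._

// dedicated auth credentials (a bearer token) combined with a "regular" path input, both added via securityIn
val secureEndpoint: Endpoint[(String, String), Unit, Unit, Unit, Any] =
  endpoint
    .securityIn("api" / path[String]("tenant"))
    .securityIn(auth.bearer[String]())
```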

Currently, the following authentication inputs are available (assuming `import sttp.tapir._`):
7 changes: 3 additions & 4 deletions doc/endpoint/static.md
@@ -3,10 +3,9 @@
Tapir contains predefined endpoints, server logic and server endpoints which allow serving static content, originating
from local files or application resources. These endpoints respect etags, byte ranges as well as if-modified-since headers.

```eval_rst
.. note::
Since Tapir 1.3.0, static content is supported via the new `tapir-files` module. If you're looking for
the API documentation of the old static content API, switch documentation to an older version.
```{note}
Since Tapir 1.3.0, static content is supported via the new `tapir-files` module. If you're looking for
the API documentation of the old static content API, switch documentation to an older version.
```
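
As a usage sketch, assuming the `tapir-files` module mentioned above is already on the classpath; the prefix, directory and effect type are placeholders:

```scala
import sttp.tapir._
import sttp.tapir.files._
import scala.concurrent.Future

// serves files from a local directory under GET /site/..., respecting etags, ranges and if-modified-since
val siteEndpoint = staticFilesGetServerEndpoint[Future]("site")("/var/www/site")
```

The result is a server endpoint, so it is passed to a server interpreter rather than documented or called directly.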

In order to use static content endpoints, add the module to your dependencies:
14 changes: 6 additions & 8 deletions doc/endpoint/streaming.md
@@ -5,13 +5,11 @@ must implement the `Streams[S]` capability, and determines the precise type of t
non-blocking streams implementation. The interpreter must then support the given capability. Refer to the documentation
of server/client interpreters for more information.

```eval_rst
.. note::
Here, streams refer to asynchronous, non-blocking, "reactive" stream implementations, such as `akka-streams <https://doc.akka.io/docs/akka/current/stream/index.html>`_,
`fs2 <https://fs2.io>`_ or `zio-streams <https://zio.dev/docs/datatypes/datatypes_stream>`_. If you'd like to use
blocking streams (such as ``InputStream``), these are available through e.g. ``inputStreamBody`` without any
additional requirements on the interpreter.
```{note}
Here, streams refer to asynchronous, non-blocking, "reactive" stream implementations, such as [akka-streams](https://doc.akka.io/docs/akka/current/stream/index.html),
[fs2](https://fs2.io) or [zio-streams](https://zio.dev/docs/datatypes/datatypes_stream). If you'd like to use
blocking streams (such as `InputStream`), these are available through e.g. `inputStreamBody` without any
additional requirements on the interpreter.
```
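
For illustration, a sketch of a streaming output using the fs2 capability; it assumes the fs2/cats-effect integration is on the classpath, and the path name is arbitrary:

```scala
import cats.effect.IO
import sttp.capabilities.fs2.Fs2Streams
import sttp.tapir._

// the output is an fs2 byte stream; the Fs2Streams[IO] capability shows up as the endpoint's last type parameter
val logsEndpoint: PublicEndpoint[Unit, Unit, fs2.Stream[IO, Byte], Fs2Streams[IO]] =
  endpoint.get
    .in("logs")
    .out(streamTextBody(Fs2Streams[IO])(CodecFormat.TextPlain()))
```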

Adding a stream body input/output influences both the type of the input/output, as well as the 5th type parameter
@@ -45,4 +43,4 @@ See also the [runnable streaming example](../examples.md).

## Next

Read on about [web sockets](websockets.md).
Read on about [web sockets](websockets.md).
14 changes: 6 additions & 8 deletions doc/endpoint/xml.md
@@ -4,14 +4,12 @@ Enabling support for XML is a matter of implementing proper [`XmlCodec[T]`](code
This enables encoding objects to XML strings, and decoding XML strings to objects.
Implementation is fairly easy, and for now, one guide on how to integrate with scalaxb is provided.

```eval_rst
.. note::
Note, that implementing ``XmlCodec[T]`` would require deriving not only XML library encoders/decoders,
but also tapir related ``Schema[T]``. These are completely separate - any customization e.g. for field
naming or inheritance strategies must be done separately for both derivations.
For more details see sections on `schema derivation <https://tapir.softwaremill.com/en/latest/endpoint/schemas.html>`_
and on supporting `custom types <https://tapir.softwaremill.com/en/latest/endpoint/customtypes.html>`_ in general.
```{note}
Note, that implementing `XmlCodec[T]` would require deriving not only XML library encoders/decoders,
but also tapir related `Schema[T]`. These are completely separate - any customization e.g. for field
naming or inheritance strategies must be done separately for both derivations.
For more details see sections on [schema derivation](schemas.md) and on supporting [custom types](customtypes.md) in
general.
```
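
A very rough sketch of the idea, assuming a `Codec.xml` helper analogous to `Codec.json`; a real integration would delegate encoding and decoding to an XML library such as scalaxb rather than hand-rolling it:

```scala
import sttp.tapir._
import sttp.tapir.Codec.XmlCodec

case class Ping(message: String)

// the tapir-side Schema, derived separately from the XML encoding/decoding below
implicit val pingSchema: Schema[Ping] = Schema.derived[Ping]

// assumption: Codec.xml(rawDecode)(encode), mirroring Codec.json; hand-rolled here for illustration only
implicit val pingXmlCodec: XmlCodec[Ping] = Codec.xml { s =>
  val (prefix, suffix) = ("<ping>", "</ping>")
  if (s.startsWith(prefix) && s.endsWith(suffix))
    DecodeResult.Value(Ping(s.stripPrefix(prefix).stripSuffix(suffix)))
  else DecodeResult.Error(s, new RuntimeException("not a <ping> element"))
} { p => s"<ping>${p.message}</ping>" }

val pingEndpoint = endpoint.post.in(xmlBody[Ping]).out(xmlBody[Ping])
```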

## Scalaxb
10 changes: 4 additions & 6 deletions doc/generator/sbt-openapi-codegen.md
@@ -1,9 +1,7 @@
# Generate endpoint definitions from an OpenAPI YAML

```eval_rst
.. note::
This is a really early alpha implementation.
```{note}
This is a really early alpha implementation.
```

## Installation steps
@@ -34,7 +32,7 @@ defined case-classes and endpoint definitions.

The generator currently supports these settings, you can override them in the `build.sbt`;

```eval_rst
```{eval-rst}
===================================== ==================================== ==================================================================================================
setting default value description
===================================== ==================================== ==================================================================================================
@@ -92,7 +90,7 @@ having no tags, would be output to the `TapirGeneratedEndpoints` file, along wit

### Json Support

```eval_rst
```{eval-rst}
===================== ================================================================== ===================================================================
openapiJsonSerdeLib required dependencies Conditional requirements
===================== ================================================================== ===================================================================
12 changes: 10 additions & 2 deletions doc/index.md
@@ -49,7 +49,7 @@ for a more detailed description of how tapir works! ScalaDocs are available at [

## Adopt a tapir

```eval_rst
```{eval-rst}
.. raw:: html
<iframe
@@ -200,7 +200,7 @@ We offer commercial support for sttp and related technologies, as well as develo

## Table of contents

```eval_rst
```{eval-rst}
.. toctree::
:maxdepth: 2
:caption: Getting started
@@ -209,6 +209,14 @@ We offer commercial support for sttp and related technologies, as well as develo
examples
stability
.. toctree::
:maxdepth: 2
:caption: Tutorials
tutorials/01_hello_world
tutorials/02_openapi_docs
tutorials/03_json
.. toctree::
:maxdepth: 2
:caption: Endpoints
9 changes: 4 additions & 5 deletions doc/requirements.txt
@@ -1,5 +1,4 @@
sphinx_rtd_theme==1.0.0
recommonmark==0.7.1
sphinx==4.2.0
sphinx-autobuild==2021.3.14
myst-parser==0.15.2
sphinx_rtd_theme==2.0.0
sphinx==7.3.7
sphinx-autobuild==2024.4.16
myst-parser==2.0.0