
Merge pull request #1 from gpac/fix/lists-formatting
fix list formatting
nlsdvl authored Mar 5, 2024
2 parents da8c898 + 8d96311 commit 5acd190
Showing 41 changed files with 122 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/Build/Upgrading.md
@@ -20,6 +20,7 @@ If you build GPAC directly in the source tree (i.e., running `./configure && mak
# Out of source tree building

To avoid the issue of cleaning dependencies, it is safer to have one dedicated build directory for each branch you test:

- `mkdir bin/master && cd bin/master && ../../configure && make -j`
- `mkdir bin/somebranch && cd bin/somebranch && git checkout somebranch && ../../configure && make -j`

3 changes: 3 additions & 0 deletions docs/Build/build/GPAC-Build-Guide-for-Linux.md
@@ -1,6 +1,7 @@
_Preliminary notes: the following instructions are based on Ubuntu and Debian. They should be easily applicable to other distributions; the only changes should be the names of the packages to be installed and the package manager used._

GPAC is a modular piece of software which depends on third-party libraries. During the build process it will try to detect and leverage the installed third-party libraries on your system. Here are the instructions to:

* build GPAC easily (recommended for most users) from what's available on your system,
* build a minimal 'MP4Box' and 'gpac' (only contains GPAC core features like muxing and streaming),
* build a complete GPAC by rebuilding all the dependencies manually.
@@ -47,6 +48,7 @@ _If you are upgrading from a previous version (especially going from below 1.0.0
## Use

You can either:

- `sudo make install` to install the binaries,
- or use the `MP4Box` or `gpac` binary in `gpac_public/bin/gcc/` directly,
- or move/copy it somewhere manually.
@@ -104,6 +106,7 @@ make
4. Use

You can either:

- `sudo make install` to install the binaries,
- or use the `MP4Box` or `gpac` binary in `gpac_public/bin/gcc/` directly,
- or move/copy it somewhere manually.
4 changes: 4 additions & 0 deletions docs/Filters/Rearchitecture.md
@@ -36,6 +36,7 @@ The following lists the core principles of the re-architecture. Read the [genera

# Filter Design Principles
A filter object obeys the following principles:

- may accept (consume) any number of data streams (each named a `PID` in this architecture)
- may produce any number of PIDs
- can have its input PIDs reconfigured at run time or even removed
@@ -65,6 +66,7 @@ The filter session main features are:
- handle filter capability negotiation, usually inserting a filter chain to match the desired format

The filter session operates in a semi-blocking mode:

- it prevents filters in blocking mode (output PID buffers full) from operating
- it will not prevent a running filter from dispatching a packet; this greatly simplifies writing demultiplexers

@@ -82,6 +84,7 @@ These properties may also be overloaded by the user, e.g. to assign a ServiceID
# Media Streams internal representation

In order to be able to exchange media stream data between filters, a unified data format had to be set, as follows:

- a frame is defined as a single-time block of data (Access Unit in MPEG terminology), but can be transferred in multiple packets
- frames or fragments of frames are always transferred in processing order (e.g. decoding order for MPEG video)
- multiplexed media data is identified as `file` data, where a frame is a complete file.
@@ -155,6 +158,7 @@ gpac -i source.mp4 reframer:xround=closest:splitrange:xs=2:xe=4 -o dest.mp4
All other functionalities of MP4Box are not available through a filter session. Some might make it one day (BIFS encoding for example), but most of them are not good candidates for filter-based processing and will only be available through MP4Box (track add/remove to existing file, image item add/remove to existing file, file hinting, ...).

__Note__ For operations using a filter session in MP4Box, it is possible to view some information about the filter session:

- `-fstat`: this will print the statistics per filter and per PID of the session
- `-fgraph`: this will print the connections between the filters in the session
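
For example, dashing a file while printing both reports could look like this (a sketch; the segment duration is illustrative):

```
MP4Box -dash 1000 -fstat -fgraph source.mp4
```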

13 changes: 13 additions & 0 deletions docs/Howtos/avmix_tuto.md
@@ -105,6 +105,7 @@ _Note_
A sequence not attached to a scene will not be visible or played, even if active.

Now let's add:

- a logo
- a bottom rectangle with a gradient
- some text
@@ -131,6 +132,7 @@ In the following examples, we always use [relative coordinates system](avmix#coo
## Animating a scene

Scenes can be animated through timer objects providing value interpolation instructions. A timer provides:

- a start time, stop time and a loop count
- a duration for the interpolation period
- a set of animation values and their targets
@@ -167,6 +169,7 @@ It can be tedious to apply the same transformation (matrix, active, ...) on a su
The simplest way to do this is to group scenes together, and transform the group.

The following animates:

- the video from 90% to 100%, sticking it to the top-left corner and animating the rounded rectangle effect
- the overlay group position from visible to hidden past the bottom-right corner

@@ -270,6 +273,7 @@ This works with video scenes too:

You will at some point need to chain some videos. AVMix handles this through `sequence` objects describing how sources are to be chained.
Sequences are designed to:

- take care of media prefetching to reduce loading times
- perform transitions between sources, activating / prefetching based on the desired transition duration

@@ -297,6 +301,7 @@ AVMix handles this by allowing scenes to use more than one sequence as input, an
_Note: Currently, defined scenes only support 0, 1 or 2 input sequences_

This is done at scene declaration through:

- a `mix` object, describing a transition
- a `mix_ratio` property, describing the transition ratio

@@ -347,6 +352,7 @@ Specifying an identifier on the sequence avoids that.
## Live mode

Live mode works like offline mode, with the following additions:

- detection and display of signal loss or missing input sequences
- `sequence` and `timer` start and stop time can be expressed as UTC dates (absolute) or current UTC offset

@@ -368,6 +374,7 @@ You should now see "no input" message when playing. Without closing the player,
]
```
And the video sequence will start! For start and stop times, you can use the following values:

- "now": will resolve to current UTC time
- integer: will resolve to current UTC time plus the number of seconds specified by the integer
- date: will use the date as the start/stop time
@@ -448,13 +455,15 @@ This is problematic if you use AVMix to generate a live feed supposed to be up 2
To prevent this, the filter allows launching the sources as dedicated child processes. When the child process exits unexpectedly, or when source data is no longer received, the filter can then kill and relaunch the child process.

There are three supported methods for this:

- running a gpac instance over a pipe
- running a gpac instance over TCP
- running any other process capable of communicating with gpac

The declaration is done at the `sourceURL` level through the `port` option.

For each of these modes, the `keep_alive` option is used to decide if the child process shall be restarted:

- if no more data is received after `rtimeout`,
- if the stream is at end of stream but the child process exited with an error code greater than 2.

@@ -598,6 +607,7 @@ return 0;
```

Your module can also control the playlist through several functions:

- `remove_element(id_or_elem)`: removes a scene, group or sequence from the playlist
- `parse_element(JSON_obj)`: parses a root JSON object and adds it to the playlist
- `parse_scene(JSON_obj, parent_group)`: parses a scene from its JSON object and adds it to parent_group, or to the root if parent_group is null
@@ -729,12 +739,14 @@ In this mode, the texturing parameters used by the offscreen group can be modifi
AVMix can use a global alpha mask (covering the entire output frame) for draw operations, through the [mask](avmix#scene-mask) scene module.

This differs from using an offscreen group as an alpha operand input to [shape](avmix#scene-shape), as discussed above, in the following ways:

- the mask is global and not subject to any transformation
- the mask is always cleared at the beginning of a frame
- the mask is a single alpha channel
- the mask operations can be accumulated between draws

The following example shows using a mask in regular mode:

- enable and clear mask
- draw a circle with alpha 0.4
- use the mask and draw the video, which will be blended only where the circle was drawn, using alpha=0.4
@@ -768,6 +780,7 @@ The following example shows using a mask in regular mode:
The mask can also be updated while drawing using a record mode. In this mode, the mask acts as a binary filter: any pixel drawn to the mask will no longer be drawn to.

The following draws:

- an ellipse with the first video at half opacity, appearing blended on the background
- the entire second video at full opacity, which will only appear where the mask was not set

4 changes: 4 additions & 0 deletions docs/Howtos/dash/DASH-intro.md
@@ -5,11 +5,13 @@ GPAC has extended support for MPEG-DASH and HLS content generation and playback.
Basic concepts and terminology of MPEG-DASH are explained [here](DASH-basics), and the same terms are usually used in GPAC for both DASH and HLS.

For more information on content generation:

- read MP4Box [DASH options](mp4box-dash-opts)
- read the [dasher](dasher) filter help
- check the dash and HLS scripts in the GPAC [test suite](https://github.com/gpac/testsuite/tree/filters/scripts)

For more information on content playback:

- read the [dashin](dashin) filter help, used whenever a DASH or HLS session is read.
- check the dash and HLS scripts in the GPAC [test suite](https://github.com/gpac/testsuite/tree/filters/scripts)
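
As a quick sanity check, a minimal generation/playback round trip can be sketched as follows (file names are placeholders; the dasher and DASH reader are loaded implicitly from the `.mpd` extension):

```
gpac -i source.mp4 -o dash/live.mpd
gpac -i dash/live.mpd vout
```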

@@ -19,6 +21,7 @@ If you generate your content with a third-party application such as ffmpeg, mak
When using GPAC, this is usually ensured by using the `fintra` option.

GPAC can be used to generate both static and live DASH/HLS content. For live cases, GPAC can expose the created files:

- directly through disk
- through its own HTTP server
- by pushing them to a remote HTTP server
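
The push case, for example, can be sketched as follows (the server address is a placeholder; `dmode=dynamic` requests a live manifest):

```
gpac -i source.mp4 reframer:rt=on -o http://SERVER_IP:PORT/live.mpd:dmode=dynamic
```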
@@ -28,6 +31,7 @@ We recommend reading the [HTTP server](httpout) filter help, and looking at the

## Content Playback
GPAC comes with various adaptation algorithms:

- BBA0, BOLA, basic throughput (called `conventional` in the literature)
- Custom throughput-based (`gbuf`) and buffer-based (`grate`) algorithms
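
The algorithm can be selected on the DASH reader; a sketch, assuming the [dashin](dashin) `algo` option:

```
gpac -i http://example.com/live.mpd:algo=bba0 vout
```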

@@ -20,6 +20,7 @@ Segmentation (`-dash`) is the process of creating segments, parts of an original
Last, MP4Box can split (`-split`) a file and create individual playable files from an original one. It does not use segmentation in the above sense; it removes fragmentation and can use interleaving.

Some examples of MP4Box usages:

- Rewrites a file with an interleaving window of 1 sec.

`MP4Box -inter 1000 file.mp4`
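
Two more sketches in the same vein (durations are illustrative):

- Creates 4-second DASH segments and a manifest:

`MP4Box -dash 4000 file.mp4`

- Splits the file into independent 10-second files:

`MP4Box -split 10 file.mp4`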
2 changes: 2 additions & 0 deletions docs/Howtos/dash/HAS-advanced.md
@@ -15,6 +15,7 @@ Record the session in fragmented MP4
gpac -i $HAS_URL -o grab/record.mp4:frag
```
Note that we specify the [frag](mp4mx#store) option for the generated MP4 so that:

- we don't have a long multiplexing process at the end
- if anything goes wrong (crash / battery dead / ...), we still have a file containing all media until the last written fragment.

@@ -80,6 +81,7 @@ gpac -i $HAS_URL dashin:forward=file -o route://225.1.1.0:6000

The [DASH reader](dashin) can be configured through [-forward](dashin#forward) to insert segment boundaries in the media pipeline - see [here](dashin#segment-bound-modes) for more details.
Two variants of this mode exist:

- `segb`: this enables `split_as`, DASH cue insertion (segment start signal) and fragment bounds signalling
- `mani`: same as `segb` and also forward manifests (MPD, M3U8) as packet properties.
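
For example, adapting the file-forwarding command above to rebuild a DASH session while keeping the original segmentation (a sketch; the output path is a placeholder):

```
gpac -i $HAS_URL dashin:forward=segb -o mirror/live.mpd
```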

4 changes: 4 additions & 0 deletions docs/Howtos/dash/HEVC-Tile-based-adaptation-guide.md
@@ -75,10 +75,12 @@ You can now playback your MPD using GPAC, and have fun with the different adapta
## Live setup

If you want to produce a live feed of tiled video, you can either:

- produce short segments, package them and dash them using `-dash-live`, `-dash-ctx` and `-subdur`, see discussion [here](https://github.com/gpac/gpac/issues/1648)
- produce a live session with a [tilesplit](tilesplit) filter.

GPAC does not have a direct wrapper for Kvazaar, but you can either:

- use a FFmpeg build with Kvazaar enabled (`--enable-libkvazaar` in ffmpeg configure) - check GPAC support using `gpac -h ffenc:libkvazaar`
- use an external grab+Kvazaar encoding and pipe its output into GPAC.
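
The FFmpeg route could be sketched as follows (the `ffenc:c=libkvazaar` selection syntax is an assumption; check `gpac -h ffenc:libkvazaar` for the actual options):

```
gpac -i source.mp4 ffenc:c=libkvazaar -o tiled.hvc
```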

@@ -134,6 +136,7 @@ gpac


The resulting filter graph is quite fun (use `-graph` to check it) and shows:

- only one (or zero, depending on your webcam formats) pixel converter filter is used in the chain to feed both Kvazaar instances
- all tile PIDs (and only them) connecting to the dasher filter
- 21 output PIDs of the dasher: one for MPD, 2 x (1+3x3) media PIDs.
@@ -179,6 +182,7 @@ In 2D playback, the tile adaptation logic (for ROI for example) is controlled b

The compositor can use gaze information to automatically decrease the quality of the tiles not under the gaze.
The gaze information can be:

- emulated via mouse using the [--sgaze](compositor#sgaze) option.
- signaled through filter updates on the [gazer_enabled](compositor#gazer_enabled), [gaze_x](compositor#gaze_x) and [gaze_y](compositor#gaze_y) options

@@ -68,6 +68,7 @@ Check the [HEVC Tile-based adaptation guide](HEVC-Tile-multi-resolution-adaptati
# Content Playback

The logic of content playback is as follows:

- the MPD indicates SRD information and a GPAC extension for mergeable bitstream
- when the compositor is used, the [hevcmerge](hevcmerge) filter is automatically created to reassemble the streams
- otherwise (using vout), each PID is declared as an alternative to the other
2 changes: 2 additions & 0 deletions docs/Howtos/dash/LL-DASH.md
@@ -19,6 +19,7 @@ And when using gpac, you can enable real-time reporting of filters activities us


The `gpac` application can be used for dashing whenever `MP4Box` is used, but the opposite is not true. Especially MP4Box cannot:

- use complex custom filter chains while dashing, such as transcoding in several qualities
- produce two DASH sessions at the same time
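
With `gpac`, producing two DASH sessions at once is simply a matter of declaring two sinks; a sketch (paths and durations are illustrative):

```
gpac -i source.mp4 -o fast/live.mpd:segdur=1 -o slow/live.mpd:segdur=4
```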

@@ -169,6 +170,7 @@ gpac -i source1 -i source2 reframer:rt=on -o http://ORIG_SERVER_IP_PORT/live.mpd

We will now use a live source (webcam), encode it in two qualities, DASH the result and push it to a remote server. Please check the [encoding howto](encoding) first.
Compared to what we have seen previously, we only need to modify the input part of the graph:

- take as a live source the default audio and video grabbed by the [libavdevice](ffavin) filter
- rescale the video as 1080p and 720p
- encode the rescaled videos at 6 and 3 mbps
1 change: 1 addition & 0 deletions docs/Howtos/dash/LL-HLS.md
@@ -9,6 +9,7 @@ In this howto, we will study various setups for HLS live streaming in low latenc
Segments and CMAF chunks are configured the same way as in the [DASH low latency](LL-DASH#dash-low-latency-setup) setup.

When you have low-latency production of your HLS media segments, you need to indicate to the client how to access LL-HLS `parts` (CMAF chunks) while they are produced. LL-HLS offers two possibilities to describe these parts in the manifest:

- file mode: advertise the chunks as dedicated files, i.e. each chunk will create its own file. This requires double storage for segments close to the live edge, increases disk IOs and might not be very practical if you set up a PUSH origin (twice the bandwidth is required)
- byte range mode: advertise the chunks as byte ranges of a media file. If that media file is the full segment being produced (usually the case), this does not induce any bandwidth increase or extra disk IOs.

2 changes: 2 additions & 0 deletions docs/Howtos/dash/cmaf.md
@@ -5,6 +5,7 @@ GPAC can be used to generate DASH or HLS following the CMAF specification.

CMAF defines two structural brands `cmfc` and `cmf2` for ISOBMFF-segmented content.
The `cmfc` brand imposes the following constraints:

- some default values in ISOBMFF boxes
- a single media per file
- a single track fragment per movie fragment (`moof`)
@@ -16,6 +17,7 @@ The `cmfc` brand constraints:


The `cmf2` brand further restricts the `cmfc` brand for video tracks:

- no edit list shall be used
- negative composition offset (`trun` version 1) shall be used
- sample default values shall be repeated in each track fragment
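
Selecting a brand at dashing time can be sketched as follows (assuming the dasher's `cmaf` option):

```
gpac -i source.mp4 -o live.mpd:cmaf=cmf2
```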
1 change: 1 addition & 0 deletions docs/Howtos/dash/dash_transcoding.md
@@ -5,6 +5,7 @@ In this howto, we will study various setups for DASH transcoding.
Please make sure you are familiar with [DASH terminology](DASH-basics) before reading.

It is likely that your source media is not properly encoded for DASH or HLS delivery, typically because:

- openGOPs are used
- key-frame positions do not match across your different qualities
- key-frame intervals are not constant
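
A common fix is to re-encode with a forced intra period; a sketch using the encoder's `fintra` option (values are illustrative):

```
gpac -i source.mp4 c=avc:fintra=2:b=3M -o dash/live.mpd:segdur=2
```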
1 change: 1 addition & 0 deletions docs/Howtos/dash/hls.md
@@ -37,6 +37,7 @@ This will generate `live.m3u8`, `video.m3u8` and `audio.m3u8`
# Renditions
## Grouping
When several renditions are possible for a set of inputs, the default behavior is as follows:

- if video is present, it is used as the main content
- otherwise, audio is used as the main content

2 changes: 2 additions & 0 deletions docs/Howtos/dynamic_rc.md
@@ -12,6 +12,7 @@ In this example we will use RTP as delivery mechanism and monitor loss rate of c
## RTP reader

The reader is a regular video playback from RTP (using SDP as input). We will:

- locate the `rtpin` filter in the chain, i.e. the first filter after the `fin` filter used for SDP access
- update every 2 seconds the `loss_rate` option of the `rtpin` filter: this will force the loss ratio in RTCP Receiver Reports, but will not drop any packet at the receiver side

@@ -77,6 +78,7 @@ gpac.close()
## Encoder and RTP sender

The encoder consists of a source (here a single video file playing in a loop), an AVC encoder and an RTP output. We will:

- locate the `rtpout` filter in the chain, i.e. the first filter before the `fout` filter used for SDP output
- monitor every 2 seconds the statistics of the input PID of `rtpout` to get the real-time measurements reported by RTCP
- adjust the encoder max rate based on the percentage of lost packets
1 change: 1 addition & 0 deletions docs/Howtos/encoding.md
@@ -67,6 +67,7 @@ The above command will encode the video track in `source.mp4` into AVC|H264 at
```gpac -i source.mp4 c=avc::x264-params=no-mbtree:sync-lookahead=0::profile=baseline -o test.avc```

The above command will encode the video track in `source.mp4` into AVC|H264 and pass two options to the ffmpeg encoder:

- `x264-params`, with value `no-mbtree:sync-lookahead=0`
- `profile`, with value `baseline`

1 change: 1 addition & 0 deletions docs/Howtos/encryption/encryption-filters.md
@@ -66,6 +66,7 @@ Another possibility is to define the `CryptInfo` PID property rather than using
gpac -i udp://localhost:1234/:#CrypTrack=(audio)drm_audio.xml,(video)drm_video.xml cecrypt -o dest.mpd:profile=live:dmode=dynamic
```
This example assigns:

- a `CryptInfo` property to `drm_audio.xml` for PIDs of type audio
- a `CryptInfo` property to `drm_video.xml` for PIDs of type video
- no `CryptInfo` property for other PIDs
4 changes: 4 additions & 0 deletions docs/Howtos/filters-oneliners.md
@@ -1,10 +1,12 @@
# Foreword

This page contains one-liners illustrating the many possibilities of the GPAC filter architecture. For more detailed information, it is highly recommended that you read:

- the [general concepts](filters_general) page
- the [gpac application](gpac_general) help

To get a better understanding of each command illustrated here, it is recommended to:

- run the same command with `-graph` specified to see the filter graph associated
- read the help of the different filters in this graph using `gpac -h filter_name`
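
For example (a sketch; the dasher is loaded here because of the `.mpd` extension):

```
gpac -i source.mp4 -o dest.mpd -graph
gpac -h dasher
```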

@@ -13,6 +15,7 @@ Whenever an option is specified, e.g. `dest.mp4:foo`, you can get more info and
The filter session is by default quiet, except for warnings and error reporting. To get information on the session while running, use [-r](gpac_general#r) option. To get more runtime information, use the [log system](core_logs).

Given the configurable nature of the filter architecture, most examples given in one context can be reused in another context. For example:

- from the dump examples:
```
gpac -i source reframer:saps=1 -o dump/$num$.png
@@ -34,6 +37,7 @@ _NOTE The command lines given here are usually using a local file for source or
_Reminder_
Most filters are never specified at the prompt; they are dynamically loaded during the graph resolution.
GPAC filters can use either:

- global options, e.g. `--foo`, applying to each instance of any filter defining the `foo` option,
- local options to a given filter and any filters dynamically loaded, e.g. `:foo`. This is called [argument inheriting](filters_general#arguments-inheriting).
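
For example, assuming the dasher's `segdur` option (a sketch):

```
# global: any filter defining segdur inherits the value
gpac -i source.mp4 -o live.mpd --segdur=4
# local: set on the sink and inherited by dynamically loaded filters
gpac -i source.mp4 -o live.mpd:segdur=4
```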

