diff --git a/extern/sugar b/extern/sugar
deleted file mode 160000
index eff7696..0000000
--- a/extern/sugar
+++ /dev/null
@@ -1 +0,0 @@
-Subproject commit eff76967001a3b5d3620a0d2510fb815ee36721e
diff --git a/extern/sugar/.gitignore b/extern/sugar/.gitignore
new file mode 100644
index 0000000..6d908e2
--- /dev/null
+++ b/extern/sugar/.gitignore
@@ -0,0 +1,144 @@
+*.pt
+*.pth
+output*
+*.slurm
+*.pyc
+*.ply
+*.obj
+sugar_tests.ipynb
+sugar_sh_tests.ipynb
+
+# Byte-compiled / optimized / DLL files
+__pycache__/
+*.py[cod]
+*$py.class
+
+# C extensions
+*.so
+
+# Distribution / packaging
+.Python
+build/
+develop-eggs/
+dist/
+downloads/
+eggs/
+.eggs/
+lib/
+lib64/
+parts/
+sdist/
+var/
+wheels/
+pip-wheel-metadata/
+share/python-wheels/
+*.egg-info/
+.installed.cfg
+*.egg
+MANIFEST
+
+# PyInstaller
+# Usually these files are written by a python script from a template
+# before PyInstaller builds the exe, so as to inject date/other infos into it.
+*.manifest
+*.spec
+
+# Installer logs
+pip-log.txt
+pip-delete-this-directory.txt
+
+# Unit test / coverage reports
+htmlcov/
+.tox/
+.nox/
+.coverage
+.coverage.*
+.cache
+nosetests.xml
+coverage.xml
+*.cover
+*.py,cover
+.hypothesis/
+.pytest_cache/
+
+# Translations
+*.mo
+*.pot
+
+# Django stuff:
+*.log
+local_settings.py
+db.sqlite3
+db.sqlite3-journal
+
+# Flask stuff:
+instance/
+.webassets-cache
+
+# Scrapy stuff:
+.scrapy
+
+# Sphinx documentation
+docs/_build/
+
+# PyBuilder
+target/
+
+# Jupyter Notebook
+.ipynb_checkpoints
+
+# IPython
+profile_default/
+ipython_config.py
+
+# pyenv
+.python-version
+
+# pipenv
+# According to pypa/pipenv#598, it is recommended to include Pipfile.lock in version control.
+# However, in case of collaboration, if having platform-specific dependencies or dependencies
+# having no cross-platform support, pipenv may install dependencies that don't work, or not
+# install all needed dependencies.
+#Pipfile.lock
+
+# PEP 582; used by e.g. github.com/David-OConnor/pyflow
+__pypackages__/
+
+# Celery stuff
+celerybeat-schedule
+celerybeat.pid
+
+# SageMath parsed files
+*.sage.py
+
+# Environments
+.env
+.venv
+env/
+venv/
+ENV/
+env.bak/
+venv.bak/
+
+# Spyder project settings
+.spyderproject
+.spyproject
+
+# Rope project settings
+.ropeproject
+
+# mkdocs documentation
+/site
+
+# mypy
+.mypy_cache/
+.dmypy.json
+dmypy.json
+
+# Pyre type checker
+.pyre/
+learnableearthparser/fast_sampler/_sampler.c
+
+.idea
+
+resource/
\ No newline at end of file
diff --git a/extern/sugar/LICENSE.md b/extern/sugar/LICENSE.md
new file mode 100644
index 0000000..c869e69
--- /dev/null
+++ b/extern/sugar/LICENSE.md
@@ -0,0 +1,83 @@
+Gaussian-Splatting License
+===========================
+
+**Inria** and **the Max Planck Institut for Informatik (MPII)** hold all the ownership rights on the *Software* named **gaussian-splatting**.
+The *Software* is in the process of being registered with the Agence pour la Protection des
+Programmes (APP).
+
+The *Software* is still being developed by the *Licensor*.
+
+*Licensor*'s goal is to allow the research community to use, test and evaluate
+the *Software*.
+
+## 1. Definitions
+
+*Licensee* means any person or entity that uses the *Software* and distributes
+its *Work*.
+
+*Licensor* means the owners of the *Software*, i.e Inria and MPII
+
+*Software* means the original work of authorship made available under this
+License ie gaussian-splatting.
+
+*Work* means the *Software* and any additions to or derivative works of the
+*Software* that are made available under this License.
+
+
+## 2. Purpose
+This license is intended to define the rights granted to the *Licensee* by
+Licensors under the *Software*.
+
+## 3. Rights granted
+
+For the above reasons Licensors have decided to distribute the *Software*.
+Licensors grant non-exclusive rights to use the *Software* for research purposes
+to research users (both academic and industrial), free of charge, without right
+to sublicense.. The *Software* may be used "non-commercially", i.e., for research
+and/or evaluation purposes only.
+
+Subject to the terms and conditions of this License, you are granted a
+non-exclusive, royalty-free, license to reproduce, prepare derivative works of,
+publicly display, publicly perform and distribute its *Work* and any resulting
+derivative works in any form.
+
+## 4. Limitations
+
+**4.1 Redistribution.** You may reproduce or distribute the *Work* only if (a) you do
+so under this License, (b) you include a complete copy of this License with
+your distribution, and (c) you retain without modification any copyright,
+patent, trademark, or attribution notices that are present in the *Work*.
+
+**4.2 Derivative Works.** You may specify that additional or different terms apply
+to the use, reproduction, and distribution of your derivative works of the *Work*
+("Your Terms") only if (a) Your Terms provide that the use limitation in
+Section 2 applies to your derivative works, and (b) you identify the specific
+derivative works that are subject to Your Terms. Notwithstanding Your Terms,
+this License (including the redistribution requirements in Section 3.1) will
+continue to apply to the *Work* itself.
+
+**4.3** Any other use without of prior consent of Licensors is prohibited. Research
+users explicitly acknowledge having received from Licensors all information
+allowing to appreciate the adequacy between of the *Software* and their needs and
+to undertake all necessary precautions for its execution and use.
+
+**4.4** The *Software* is provided both as a compiled library file and as source
+code. In case of using the *Software* for a publication or other results obtained
+through the use of the *Software*, users are strongly encouraged to cite the
+corresponding publications as explained in the documentation of the *Software*.
+
+## 5. Disclaimer
+
+THE USER CANNOT USE, EXPLOIT OR DISTRIBUTE THE *SOFTWARE* FOR COMMERCIAL PURPOSES
+WITHOUT PRIOR AND EXPLICIT CONSENT OF LICENSORS. YOU MUST CONTACT INRIA FOR ANY
+UNAUTHORIZED USE: stip-sophia.transfert@inria.fr . ANY SUCH ACTION WILL
+CONSTITUTE A FORGERY. THIS *SOFTWARE* IS PROVIDED "AS IS" WITHOUT ANY WARRANTIES
+OF ANY NATURE AND ANY EXPRESS OR IMPLIED WARRANTIES, WITH REGARDS TO COMMERCIAL
+USE, PROFESSIONNAL USE, LEGAL OR NOT, OR OTHER, OR COMMERCIALISATION OR
+ADAPTATION. UNLESS EXPLICITLY PROVIDED BY LAW, IN NO EVENT, SHALL INRIA OR THE
+AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
+GOODS OR SERVICES, LOSS OF USE, DATA, OR PROFITS OR BUSINESS INTERRUPTION)
+HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING FROM, OUT OF OR
+IN CONNECTION WITH THE *SOFTWARE* OR THE USE OR OTHER DEALINGS IN THE *SOFTWARE*.
diff --git a/extern/sugar/README.md b/extern/sugar/README.md
new file mode 100644
index 0000000..470f4b1
--- /dev/null
+++ b/extern/sugar/README.md
@@ -0,0 +1,465 @@
+
+
+# SuGaR: Surface-Aligned Gaussian Splatting for Efficient 3D Mesh Reconstruction and High-Quality Mesh Rendering
+
+
+Antoine Guédon
+Vincent Lepetit
+
+
+
+
+LIGM, Ecole des Ponts, Univ Gustave Eiffel, CNRS
+
+
+| Webpage | arXiv | Presentation video | Viewer video |
+
+
+Our method extracts meshes from 3D Gaussian Splatting reconstructions and builds hybrid representations that enable easy composition and animation in Gaussian Splatting scenes by manipulating the mesh.
+
+
+## Abstract
+
+_We propose a method to allow precise and extremely fast mesh extraction from 3D Gaussian Splatting (SIGGRAPH 2023).
+Gaussian Splatting has recently become very popular as it yields realistic rendering while being significantly faster to train than NeRFs. It is however challenging to extract a mesh from the millions of tiny 3D Gaussians as these Gaussians tend to be unorganized after optimization and no method has been proposed so far.
+Our first key contribution is a regularization term that encourages the 3D Gaussians to align well with the surface of the scene.
+We then introduce a method that exploits this alignment to sample points on the real surface of the scene and extract a mesh from the Gaussians using Poisson reconstruction, which is fast, scalable, and preserves details, in contrast to the Marching Cubes algorithm usually applied to extract meshes from Neural SDFs.
+Finally, we introduce an optional refinement strategy that binds Gaussians to the surface of the mesh, and jointly optimizes these Gaussians and the mesh through Gaussian splatting rendering. This enables easy editing, sculpting, rigging, animating, or relighting of the Gaussians using traditional software (Blender, Unity, Unreal Engine, etc.) by manipulating the mesh instead of the Gaussians themselves.
+Retrieving such an editable mesh for realistic rendering is done within minutes with our method, compared to hours with the state-of-the-art method on neural SDFs, while providing a better rendering quality in terms of PSNR, SSIM and LPIPS._
+
+
+*Hybrid representation (mesh + Gaussians on the surface), and the underlying mesh without texture.*
+
+## BibTeX
+
+```
+@article{guedon2023sugar,
+ title={SuGaR: Surface-Aligned Gaussian Splatting for Efficient 3D Mesh Reconstruction and High-Quality Mesh Rendering},
+ author={Gu{\'e}don, Antoine and Lepetit, Vincent},
+ journal={arXiv preprint arXiv:2311.12775},
+ year={2023}
+}
+```
+
+## Updates and To-do list
+
+
+**Updates**
+
+
+- [01/09/2024] Added a dedicated, real-time viewer to let users visualize and navigate in the reconstructed scenes (hybrid representation, textured mesh and wireframe mesh).
+- [12/20/2023] Added a short notebook showing how to render images with the hybrid representation using the Gaussian Splatting rasterizer.
+- [12/18/2023] Code release.
+
+
+
+
+**To-do list**
+
+
+- Viewer: Add option to load the postprocessed mesh.
+- Mesh extraction: Add the possibility to edit the extent of the background bounding box.
+- Tips&Tricks: Add to the README.md file (and the webpage) some tips and tricks for using SuGaR on your own data and obtaining better reconstructions (see the tips provided by user kitmallet).
+- Improvement: Add an if block to sugar_extractors/coarse_mesh.py to skip foreground mesh reconstruction and avoid triggering an error if no surface point is detected inside the foreground bounding box. This can be useful for users who want to reconstruct "background scenes".
+- Using precomputed masks with SuGaR: Add a mask functionality to the SuGaR optimization, to allow the user to mask out some pixels in the training images (like white backgrounds in synthetic datasets).
+- Using SuGaR with Windows: Adapt the code to make it compatible with Windows. Due to path-writing conventions, the current code is not compatible with Windows.
+- Synthetic datasets: Add the possibility to use the NeRF synthetic dataset (which has a different format than COLMAP scenes).
+- Composition and animation: Finish cleaning the code for composition and animation, and add it to the sugar_scene/sugar_compositor.py script.
+- Composition and animation: Make a tutorial on how to use the scripts in the blender directory and the sugar_scene/sugar_compositor.py class to import composition and animation data into PyTorch and apply it to the SuGaR hybrid representation.
+
+
+
+
+
+## Overview
+
+As we explain in the paper, SuGaR optimization starts by optimizing a 3D Gaussian Splatting model for 7k iterations with no additional regularization term.
+In this sense, SuGaR is a method that can be applied on top of any 3D Gaussian Splatting model, and a Gaussian Splatting model optimized for 7k iterations must be provided to SuGaR.
+
+Consequently, the current implementation contains a version of the original 3D Gaussian Splatting code, and we built our model as a wrapper of a vanilla 3D Gaussian Splatting model.
+Please note that, even though this wrapper implementation is convenient for many reasons, it may not be optimal in terms of memory usage, so we might change it in the future.
+
+After optimizing a vanilla Gaussian Splatting model, the SuGaR pipeline consists of 3 main steps, and an optional one:
+1. **SuGaR optimization**: optimizing Gaussians alignment with the surface of the scene
+2. **Mesh extraction**: extracting a mesh from the optimized Gaussians
+3. **SuGaR refinement**: refining the Gaussians and the mesh together to build a hybrid representation
+4. **Textured mesh extraction (Optional)**: extracting a traditional textured mesh from the refined SuGaR model
+
+
+
+
+
+
+We provide a dedicated script for each of these steps, as well as a script `train.py` that runs the entire pipeline. We explain how to use this script in the next sections.
+
+Please note that the final step, _Textured mesh extraction_, is optional but is enabled by default in the `train.py` script. Indeed, it is very convenient to have a traditional textured mesh for visualization, composition and animation using traditional software such as Blender. However, this step is not needed to produce, modify or animate hybrid representations.
+
+
+*Hybrid representation (mesh + Gaussians on the surface), and the underlying mesh with a traditional colored UV texture.*
+
+
+
+
+
+
+Below is another example of a scene showing a robot with a black and specular material. The following images display the hybrid representation (Mesh + Gaussians on the surface), the mesh with a traditional colored UV texture, and a depth map of the mesh:
+
+Hybrid representation - Textured mesh - Depth map of the mesh
+
+
+
+
+
+## Installation
+
+### 0. Requirements
+
+The software requirements are the following:
+- Conda (recommended for easy setup)
+- C++ Compiler for PyTorch extensions
+- CUDA toolkit 11.8 for PyTorch extensions
+- C++ Compiler and CUDA SDK must be compatible
+
+Please refer to the original 3D Gaussian Splatting repository for more details about requirements.
+
+### 1. Clone the repository
+
+Start by cloning this repository:
+
+```shell
+# HTTPS
+git clone https://github.com/Anttwo/SuGaR.git --recursive
+```
+
+or
+
+```shell
+# SSH
+git clone git@github.com:Anttwo/SuGaR.git --recursive
+```
+
+### 2. Install the required Python packages
+To install the required Python packages and activate the environment, go inside the `SuGaR/` directory and run the following commands:
+
+```shell
+conda env create -f environment.yml
+conda activate sugar
+```
+
+
+If this command fails to create a working environment, you can try to install the required packages manually by running the following commands:
+```shell
+conda create --name sugar -y python=3.9
+conda activate sugar
+conda install pytorch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2 pytorch-cuda=11.8 -c pytorch -c nvidia
+conda install -c fvcore -c iopath -c conda-forge fvcore iopath
+conda install pytorch3d==0.7.4 -c pytorch3d
+conda install -c plotly plotly
+conda install -c conda-forge rich
+conda install -c conda-forge plyfile==0.8.1
+conda install -c conda-forge jupyterlab
+conda install -c conda-forge nodejs
+conda install -c conda-forge ipywidgets
+pip install open3d
+pip install --upgrade PyMCubes
+```
+
+
+### 3. Install the Gaussian Splatting rasterizer
+
+Run the following commands inside the `SuGaR/` directory to install the additional Python submodules required for Gaussian Splatting:
+
+```shell
+cd gaussian_splatting/submodules/diff-gaussian-rasterization/
+pip install -e .
+cd ../simple-knn/
+pip install -e .
+cd ../../../
+```
+Please refer to the 3D Gaussian Splatting repository for more details.
+
+
+## Quick Start
+
+Start by optimizing a vanilla Gaussian Splatting model for 7k iterations by running the script `gaussian_splatting/train.py`, as shown below. Please refer to the original 3D Gaussian Splatting repository for more details. This optimization should be very fast, lasting only a few minutes.
+
+```shell
+python gaussian_splatting/train.py -s <path to COLMAP dataset> --iterations 7000 -m <path to output directory>
+```
+
+Then, run the script `train.py` in the root directory to optimize a SuGaR model.
+
+```shell
+python train.py -s <path to COLMAP dataset> -c <path to Gaussian Splatting checkpoint directory> -r <"density" or "sdf">
+```
+
+The most important arguments for the `train.py` script are the following:
+| Parameter | Type | Description |
+| :-------: | :--: | :---------: |
+| `--scene_path` / `-s` | `str` | Path to the source directory containing a COLMAP dataset.|
+| `--checkpoint_path` / `-c` | `str` | Path to the checkpoint directory of the vanilla 3D Gaussian Splatting model. |
+| `--regularization_type` / `-r` | `str` | Type of regularization to use for optimizing SuGaR. Can be `"density"` or `"sdf"`. For reconstructing detailed objects centered in the scene with 360° coverage, `"density"` provides a better foreground mesh. For a stronger regularization and a better balance between foreground and background, choose `"sdf"`. |
+| `--eval` | `bool` | If True, performs an evaluation split of the training images. Default is `True`. |
+| `--low_poly` | `bool` | If True, uses the standard config for a low poly mesh, with `200_000` vertices and `6` Gaussians per triangle. |
+| `--high_poly` | `bool` | If True, uses the standard config for a high poly mesh, with `1_000_000` vertices and `1` Gaussian per triangle. |
+| `--refinement_time` | `str` | Default configs for time to spend on refinement. Can be `"short"` (2k iterations), `"medium"` (7k iterations) or `"long"` (15k iterations). |
+| `--export_uv_textured_mesh` / `-t` | `bool` | If True, will optimize and export a traditional textured mesh as an `.obj` file from the refined SuGaR model, after refinement. Computing a traditional color UV texture should take less than 10 minutes. Default is `True`. |
+| `--export_ply` | `bool` | If True, export a `.ply` file with the refined 3D Gaussians at the end of the training. This file can be large (+/- 500MB), but is needed for using the dedicated viewer. Default is `True`. |
+
+We provide more details about the two regularization methods `"density"` and `"sdf"` in the next section. For reconstructing detailed objects centered in the scene with 360° coverage, `"density"` provides a better foreground mesh. For a stronger regularization and a better balance between foreground and background, choose `"sdf"`.
+The default configuration is `high_poly` with `refinement_time` set to `"long"`. Results are saved in the `output/` directory.
+
+As we explain in the paper, this script extracts a mesh in 30~35 minutes on average on a single GPU. After mesh extraction, the refinement time only takes a few minutes when using `--refinement_time "short"`, but can take up to an hour when using `--refinement_time "long"`. A short refinement time is enough to produce a good-looking hybrid representation in most cases.
+
+Please note that the optimization time may vary (from 20 to 45 minutes) depending on the complexity of the scene and the GPU used. Moreover, the current implementation splits the optimization into 3 scripts that can be run separately (SuGaR optimization, mesh extraction, model refinement), so it reloads the data for each part, which is not optimal and takes several minutes. We will update the code in the near future to optimize this.
+
+Below is a detailed list of all the command line arguments for the `train.py` script.
+
+**All command line arguments for `train.py`**
+
+#### Data and initial 3D Gaussian Splatting optimization
+
+| Parameter | Type | Description |
+| :-------: | :--: | :---------: |
+| `--scene_path` / `-s` | `str` | Path to the source directory containing a COLMAP data set.|
+| `--checkpoint_path` / `-c` | `str` | Path to the checkpoint directory of the vanilla 3D Gaussian Splatting model. |
+| `--iteration_to_load` / `-i` | `int` | Iteration to load from the 3DGS checkpoint directory. If not specified, loads the iteration `7000`. |
+| `--eval` | `bool` | If True, performs an evaluation split of the training images. Default is `True`. |
+
+#### SuGaR optimization
+| Parameter | Type | Description |
+| :-------: | :--: | :---------: |
+| `--regularization_type` / `-r` | `str` | Type of regularization to use for optimizing SuGaR. Can be `"density"` or `"sdf"`. |
+| `--gpu` | `int` | Index of GPU device to use. Default is `0`. |
+
+#### Mesh extraction
+
+| Parameter | Type | Description |
+| :-------: | :--: | :---------: |
+| `--surface_level` / `-l` | `float` | Surface level to extract the mesh at. Default is `0.3`. |
+| `--n_vertices_in_mesh` / `-v` | `int` | Number of vertices in the extracted mesh. Default is `1_000_000`. |
+| `--bboxmin` / `-b` | `str` | Min coordinates to use for foreground bounding box, formatted as a string `"(x,y,z)"`.|
+| `--bboxmax` / `-B` | `str` | Max coordinates to use for foreground bounding box, formatted as a string `"(x,y,z)"`. |
+| `--center_bbox` | `bool` | If True, centers the bbox. Default is True. |
+
+#### SuGaR and mesh refinement (Hybrid representation)
+
+| Parameter | Type | Description |
+| :-------: | :--: | :---------: |
+| `--gaussians_per_triangle` / `-g` | `int` | Number of gaussians per triangle. Default is `1`. |
+| `--refinement_iterations` / `-f` | `int` | Number of refinement iterations. Default is `15_000`. |
+
+#### (Optional) Parameters for traditional textured mesh extraction
+
+| Parameter | Type | Description |
+| :-------: | :--: | :---------: |
+| `--export_uv_textured_mesh` / `-t` | `bool` | If True, will optimize and export a textured mesh as an .obj file from the refined SuGaR model. Computing a traditional colored UV texture should take less than 10 minutes. Default is `True`. |
+| `--square_size` | `int` | Size of the square to use for the UV texture. Default is `10`. |
+| `--postprocess_mesh` | `bool` | If True, postprocess the mesh by removing border triangles with low-density. This step takes a few minutes and is not needed in general, as it can also be risky. However, it increases the quality of the mesh in some cases, especially when very thin objects are visible only from one side in the images. Default is `False`. |
+| `--postprocess_density_threshold` | `float` | Threshold to use for postprocessing the mesh. Default is `0.1`. |
+| `--postprocess_iterations` | `int` | Number of iterations to use for postprocessing the mesh. Default is `5`. |
+
+#### (Optional) Parameters for exporting PLY files for the dedicated viewer
+
+| Parameter | Type | Description |
+| :-------: | :--: | :---------: |
+| `--export_ply` | `bool` | If True, export a `.ply` file with the refined 3D Gaussians at the end of the training. This file can be large (+/- 500MB), but is needed for using the dedicated viewer. Default is `True`. |
+
+#### (Optional) Default configurations
+
+| Parameter | Type | Description |
+| :-------: | :--: | :---------: |
+| `--low_poly` | `bool` | If True, uses standard config for a low poly mesh, with `200_000` vertices and `6` Gaussians per triangle. |
+| `--high_poly` | `bool` | If True, uses standard config for a high poly mesh, with `1_000_000` vertices and `1` Gaussian per triangle. |
+| `--refinement_time` | `str` | Default configs for time to spend on refinement. Can be `"short"` (2k iterations), `"medium"` (7k iterations) or `"long"` (15k iterations). |
+
+
+
+
+## Installing and using the real-time viewer
+
+Please find here a short video illustrating how to use the viewer.
+
+### 1. Installation
+
+*The viewer is currently built for Linux and macOS. It is not compatible with Windows. For Windows users, we recommend using WSL2 (Windows Subsystem for Linux), as it is very easy to install and use. Please refer to the official documentation for more details. We thank Mark Kellogg for his awesome 3D Gaussian Splatting implementation for Three.js, which we used for building this viewer.*
+
+Please start by installing the latest versions of Node.js (such as 21.x) and npm.
+A simple way to do this is to run the following commands (using aptitude):
+
+```shell
+curl -fsSL https://deb.nodesource.com/setup_21.x | sudo -E bash -
+sudo apt-get install -y nodejs
+sudo apt-get install aptitude
+sudo aptitude install -y npm
+```
+
+Then, go inside the `./sugar_viewer/` directory and run the following commands:
+
+```shell
+npm install
+cd ..
+```
+
+### 2. Usage
+
+First, make sure you have exported a `.ply` file and an `.obj` file using the `train.py` script. The `.ply` file contains the refined 3D Gaussians, and the `.obj` file contains the textured mesh. These files are exported by default when running the `train.py` script, so if you ran the code with default values for `--export_ply` and `--export_uv_textured_mesh`, you should be good to go.
+
+The `.ply` file should be located in `./output/refined_ply/<your scene name>/`. Then, just run the following command in the root directory to start the viewer:
+
+```shell
+python run_viewer.py -p <path to the .ply file>
+```
+Please make sure your `.ply` file is located in the right folder, and use a relative path starting with `./output/refined_ply`.
+This command will redirect you to a local URL. Click on the link to open the viewer in your browser. Click the icons on the top right to switch between the different representations (hybrid representation, textured mesh, wireframe mesh). Use the mouse to rotate the scene, and the mouse wheel to zoom in and out.
+
+
+
+
+
+## Tips for using SuGaR on your own data and obtaining better reconstructions
+
+### 1. Capture images or videos that cover the entire surface of the scene
+
+Using a smartphone or a camera, capture images or a video that covers the entire surface of the 3D scene you want to reconstruct. The easiest way to do this is to move around the scene while recording a video. Try to move slowly and smoothly in order to avoid motion blur. For consistent reconstruction and easier camera pose estimation with COLMAP, maintaining a uniform focal length and a constant exposure time is also important. We recommend disabling auto-focus on your smartphone to ensure that the focal length remains constant.
+
+For better reconstructions, try to cover objects from several different angles, especially thin and detailed parts of the scene.
+Indeed, SuGaR is able to reconstruct very thin and detailed objects, but some artifacts may appear if these thin objects are not covered enough and are visible only from one side in the training images.
+
+
+**Detailed explanation:**
+SuGaR applies Poisson reconstruction with 3D points sampled on the parts of the surface that are visible in the training images. This visibility constraint is important to prevent sampling points on the backside of the Gaussian level sets, located behind the surface of the scene, which would produce a lot of self-collisions and many unnecessary vertices in the mesh after applying Poisson reconstruction.
+However, this visibility constraint also means that SuGaR cannot reconstruct parts of the surface that are not visible in the training images. If thin objects are visible only from one side in the training images, the Poisson reconstruction will try to reconstruct a closed surface, and will extend the surface of the thin objects, which produces an inaccurate mesh.
+
+_TODO: Add images illustrating such artifacts._
+
+
+However, such artifacts are not visible in the hybrid representation, because the Gaussian texturing assigns low opacity to these artifacts during refinement.
+
+We already have simple ideas that could help to avoid such artifacts, such as **(a)** identifying new camera poses that cover parts of the surface non-visible in the training images that are likely to be on the same level set as the visible parts, and **(b)** adding these camera poses to the set of cameras used for sampling the points when applying Poisson reconstruction. We will update the code in the near future to include this.
+
+To convert a video to images, you can install `ffmpeg` and run the following command:
+```shell
+ffmpeg -i <path to video file> -qscale:v 1 -qmin 1 -vf fps=<fps> %04d.jpg
+```
+where `<fps>` is the desired sampling rate of the video images. An FPS value of 1 corresponds to sampling one image per second. We recommend adjusting the sampling rate to the length of the video, so that the number of sampled images is between 100 and 300.
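
The recommendation above can be turned into a quick calculation. Below is a minimal sketch, assuming you know the video duration in seconds; `pick_fps` is a hypothetical helper, not part of the repository:

```python
# Hypothetical helper: choose an ffmpeg fps value so that sampling a video of
# `duration_s` seconds yields about `target_images` frames, clamped to the
# 100-300 range recommended above. ffmpeg accepts fractional fps values.
def pick_fps(duration_s: float, target_images: int = 200) -> float:
    target_images = max(100, min(300, target_images))
    return target_images / duration_s
```

For example, a 100-second video sampled at `pick_fps(100.0)` (2.0 fps) yields roughly 200 images, while a long video would be sampled at a fractional rate.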
+
+### 2. Estimate camera poses with COLMAP
+
+Please first install a recent version of COLMAP (ideally CUDA-powered) and make sure to put the images you want to use in a directory `<source directory>/input`. Then, run the script `gaussian_splatting/convert.py` from the original Gaussian Splatting implementation to compute the camera poses from the images using COLMAP. Please refer to the original 3D Gaussian Splatting repository for more details.
+
+```shell
+python gaussian_splatting/convert.py -s <source directory>
+```
+
+Sometimes COLMAP fails to reconstruct all images into the same model and hence produces multiple sub-models. The smaller sub-models generally contain only a few images. However, by default, the script `convert.py` applies image undistortion only to the first sub-model, which may contain only a few images.
+
+If this is the case, a simple solution is to keep only the largest sub-model and discard the others. To do this, open the source directory containing your input images, then open the sub-directory `/distorted/sparse/`. You should see several sub-directories named `0/`, `1/`, etc., each containing a sub-model. Remove all sub-directories except the one containing the largest files, and rename it to `0/`. Then, run the script `convert.py` one more time but skip the matching process:
+
+```shell
+python gaussian_splatting/convert.py -s <source directory> --skip_matching
+```
+
+_Note: If the sub-models have common registered images, they could be merged into a single model as a post-processing step using COLMAP. However, merging sub-models requires running another global bundle adjustment after the merge, which can be time-consuming._
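
The manual cleanup described above (keeping only the largest sub-model and renaming it to `0/`) can also be scripted. Below is a minimal sketch, assuming the default COLMAP layout `<source directory>/distorted/sparse/` and using total file size as a proxy for the largest reconstruction; `keep_largest_submodel` is a hypothetical helper, not part of the repository:

```python
import shutil
from pathlib import Path

def keep_largest_submodel(sparse_dir: str) -> None:
    """Keep only the largest COLMAP sub-model in sparse_dir and rename it to 0/."""
    sparse = Path(sparse_dir)
    models = [d for d in sparse.iterdir() if d.is_dir()]
    if len(models) <= 1:
        return  # a single sub-model needs no cleanup

    def total_size(d: Path) -> int:
        # Sum the sizes of all files in the sub-model directory
        return sum(f.stat().st_size for f in d.rglob("*") if f.is_file())

    largest = max(models, key=total_size)
    for d in models:
        if d != largest:
            shutil.rmtree(d)  # discard the smaller sub-models
    if largest.name != "0":
        largest.rename(sparse / "0")
```

Always double-check which sub-model actually contains the most registered images before deleting anything; file size is only a heuristic.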
+
+
+### 3. Density or SDF? Choose a regularization method that fits your scene
+
+As we explain in the paper, we provide two separate regularization methods for SuGaR: a density regularization and an SDF regularization. The density regularization is the simplest one and works well with objects centered in the scene. The SDF provides a stronger regularization, especially in background regions.
+As a consequence, the SDF regularization produces higher metrics on standard datasets.
+However, for reconstructing an object centered in the scene with images taken from all around the object, the simpler density regularization generally produces a better mesh.
+
+Therefore, we recommend the following when using the `train.py` script:
+- For reconstructing detailed objects centered in the scene with 360° coverage (such as the toys we reconstructed in our presentation video), start with the density regularization `-r 'density'`. However, this may result in more chaotic Gaussians in the background.
+- For reconstructing more challenging scenes or enforcing a stronger regularization in the background, use the SDF regularization `-r 'sdf'`.
+
+### 4. I have holes in my mesh, what can I do?
+
+If you have holes in your mesh, this means the cleaning step applied to the Poisson mesh is too aggressive for your scene. You can reduce the threshold `vertices_density_quantile` used for cleaning by modifying line 43 of `sugar_extractors/coarse_mesh.py`. For example, you can change this line from
+```python
+ vertices_density_quantile = 0.1
+```
+to
+```python
+ vertices_density_quantile = 0.
+```
+
+### 5. I have messy ellipsoidal bumps on the surface of my mesh, what can I do?
+
+Depending on your scene, the default hyperparameters used for Poisson reconstruction may be too fine compared to the size of the Gaussians. Gaussians could then become visible on the mesh, which results in messy ellipsoidal bumps on its surface.
+This could happen if the camera trajectory is very close to a simple foreground object, for example.
+To fix this, you can reduce the depth of Poisson reconstruction `poisson_depth` by modifying line 42 of `sugar_extractors/coarse_mesh.py`.
+For example, you can change line 42 from
+```python
+ poisson_depth = 10
+```
+to
+```python
+ poisson_depth = 7
+```
+You may also try `poisson_depth = 6`, or `poisson_depth = 8` if the result is not satisfactory.
+
+### 6. (Optional) Adapt the scale and the bounding box of the scene
+
+As explained in the original 3D Gaussian Splatting repository, the method is expected to reconstruct scenes with a reasonable scale. For much larger scenes, such as a city district, the original authors recommend lowering the learning rates of the positions and scaling factors of the Gaussians. The more extensive the scene, the lower these values should be.
+
+For SuGaR, these learning rates should also be lowered when reconstructing a very large scene. Moreover, as we explain in the supplementary material of the paper, to extract a mesh from the Gaussians with an optimal distribution of vertices, we apply two Poisson reconstructions in practice: one on _foreground_ Gaussians and one on _background_ Gaussians. Foreground Gaussians are those located inside a predefined bounding box; background Gaussians are those located outside it.
+
+By default, this bounding box is computed as the bounding box of all camera centers. This general approach is consistent with how the original 3D Gaussian Splatting scales the learning rates. We used this default bounding box for all the reconstructions shown in the paper and the presentation video.
+
+However, this bounding box might not be optimal in some specific cases: when the user wants to reconstruct a specific object in the scene with high detail, when the scene is very large, or when the camera centers are very far from the scene.
+In these cases, you can provide a custom bounding box to the `train.py` script with the parameters `--bboxmin` and `--bboxmax`. Please note that the bounding box corners must be provided as strings, formatted as `"(x,y,z)"`, where `x`, `y` and `z` are the coordinates of the min and max points of the bounding box.
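The `"(x,y,z)"` strings can be parsed into float triplets in a few lines; the sketch below illustrates the expected format (the function name is hypothetical, and the actual parsing in `train.py` may differ):

```python
def parse_bbox_corner(s: str):
    """Parse a bounding-box corner string like "(-1.5,-1.5,-1.5)" into a
    list of three floats. Illustrative sketch of the --bboxmin / --bboxmax
    format; not the exact code used by train.py."""
    # Strip the surrounding parentheses and split on commas
    coords = s.strip().lstrip('(').rstrip(')').split(',')
    if len(coords) != 3:
        raise ValueError(f"Expected 3 coordinates, got: {s!r}")
    return [float(c) for c in coords]

bbox_min = parse_bbox_corner("(-1.5,-1.5,-1.5)")
bbox_max = parse_bbox_corner("(1.5,1.5,1.5)")
```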
+
+
+## Rendering, composition and animation
+
+The `view_sugar_results.ipynb` notebook and the `metrics.py` script provide examples of how to load a refined SuGaR model for rendering a scene with the hybrid representation and the Gaussian Splatting rasterizer. We will add more details about this in the near future.
+
+We also provide, in the `blender` directory, several Python scripts for exporting composition and animation data of SuGaR meshes modified or animated within Blender. Additionally, `sugar_scene/sugar_compositor.py` contains a Python class that can be used to import such animation or composition data into PyTorch and apply it to the SuGaR hybrid representation.
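On the PyTorch side, the exported JSON files are straightforward to load. The following is a minimal sketch (using numpy for illustration; the function name is hypothetical, and the actual compositor class works with PyTorch tensors) of how the data written by `blender/export_pose_bones.py` can be parsed:

```python
import numpy as np

def parse_pose_bones(data):
    """Parse the dictionary exported by blender/export_pose_bones.py.

    Sketch only: sugar_scene/sugar_compositor.py is the reference
    implementation and uses PyTorch tensors rather than numpy arrays.
    """
    # Global transform of the armature object, shape (4, 4)
    matrix_world = np.array(data['matrix_world'])
    # Rest-pose matrix per bone, each of shape (4, 4)
    rest_bones = {name: np.array(m) for name, m in data['rest_bones'].items()}
    # Animated matrices per bone, each of shape (n_frames, 4, 4)
    pose_bones = {name: np.array(ms) for name, ms in data['pose_bones'].items()}
    return matrix_world, rest_bones, pose_bones

# Usage with an exported file:
# import json
# with open('animation.json') as f:
#     matrix_world, rest_bones, pose_bones = parse_pose_bones(json.load(f))
```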
+
+The hybrid representation allows for high-quality rendering of the scene with the Gaussian Splatting rasterizer, as shown below.
+
+
+
+
+
+The usage of these scripts and this class can be a bit tricky, so we will add a detailed tutorial on how to use them in the near future.
+
+
+## Evaluation
+
+To evaluate the quality of the reconstructions, we provide a script `metrics.py` that computes the PSNR, SSIM and LPIPS metrics on test images. Start by optimizing SuGaR models for the desired scenes with a regularization method (`"density"` or `"sdf"`), then create a `.json` config file containing the paths to the scenes in the following format: `{source_images_dir_path: vanilla_gaussian_splatting_checkpoint_path}`.
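An example config is provided in `configs/metrics/scenes.json`; schematically, such a file looks like this (the paths below are placeholders for your own scenes and checkpoints):

```json
{
    "<path to scene 1 source images>": "<path to scene 1 vanilla 3DGS output dir>",
    "<path to scene 2 source images>": "<path to scene 2 vanilla 3DGS output dir>"
}
```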
+
+Finally, run the script as follows:
+
+```shell
+python metrics.py --scene_config <path to the .json config file> -r <"sdf" or "density">
+```
+
+Results are saved in a `.json` file in the `output/metrics/` directory.
+Please refer to the script for more details on the command line arguments.
\ No newline at end of file
diff --git a/extern/sugar/blender/export_camera_trajectory.py b/extern/sugar/blender/export_camera_trajectory.py
new file mode 100644
index 0000000..681449e
--- /dev/null
+++ b/extern/sugar/blender/export_camera_trajectory.py
@@ -0,0 +1,42 @@
+import bpy
+import os
+import json
+
+# Get the path to the current Blender file
+blend_file_path = bpy.data.filepath
+
+# Get the directory of the Blender file
+blend_dir = os.path.dirname(blend_file_path)
+
+# Construct the absolute path to the output "camera_trajectory.json" file
+path = './'
+data_file_path = os.path.join(blend_dir, path)
+data_file_path = os.path.join(data_file_path, "camera_trajectory.json")
+print('Camera poses file path:', data_file_path)
+
+res_dict = {}
+res_dict['matrix_world'] = []
+
+start_frame = 1
+end_frame = 71
+
+# Save the camera pose for each frame
+for i_frame in range(start_frame, end_frame+1):
+ print('\nFrame', i_frame)
+
+ # Set frame
+ bpy.context.scene.frame_set(i_frame)
+
+ # Save camera pose
+ obj = bpy.context.active_object
+ res_dict['matrix_world'].append([[obj.matrix_world[i][j] for j in range(4)] for i in range(4)])
+
+ print('matrix world:')
+ print(res_dict['matrix_world'])
+
+
+with open(data_file_path, "w") as outfile:
+ json.dump(res_dict, outfile)
+ print(f'Results saved to "{data_file_path}".')
+
\ No newline at end of file
diff --git a/extern/sugar/blender/export_pose_bones.py b/extern/sugar/blender/export_pose_bones.py
new file mode 100644
index 0000000..a3c3296
--- /dev/null
+++ b/extern/sugar/blender/export_pose_bones.py
@@ -0,0 +1,62 @@
+import bpy
+import os
+import json
+
+# Get the path to the current Blender file
+blend_file_path = bpy.data.filepath
+
+# Get the directory of the Blender file
+blend_dir = os.path.dirname(blend_file_path)
+
+path = './'
+
+# Construct the absolute path to the output "animation.json" file
+data_file_path = os.path.join(blend_dir, path)
+data_file_path = os.path.join(data_file_path, "animation.json") # TODO: Depends on the animation
+print('Pose file path:', data_file_path)
+
+start_frame = 1
+end_frame = 100 # TODO: Depends on the animation
+
+res_dict = {}
+
+obj = bpy.context.active_object
+armature = obj.data
+
+# Initialize dict
+res_dict['matrix_world'] = [[obj.matrix_world[i][j] for j in range(4)] for i in range(4)]
+res_dict['rest_bones'] = {}
+res_dict['pose_bones'] = {}
+
+# Save rest poses for all bones
+for bone in armature.bones:
+ mat = obj.matrix_world @ bone.matrix_local
+ mat_list = [[mat[i][j] for j in range(4)] for i in range(4)]
+ res_dict['rest_bones'][bone.name] = mat_list
+ res_dict['pose_bones'][bone.name] = []
+
+# Save frame poses for all bones
+for i_frame in range(start_frame, end_frame+1):
+ print('Frame', i_frame)
+
+ # Set frame
+ bpy.context.scene.frame_set(i_frame)
+
+ # Get object transform
+ obj = bpy.context.active_object
+
+ pose = obj.pose
+ armature = obj.data
+
+ print("pose_position:", armature.pose_position)
+
+ for bone in pose.bones:
+ mat = obj.matrix_world @ bone.matrix
+ mat_list = [[mat[i][j] for j in range(4)] for i in range(4)]
+ res_dict['pose_bones'][bone.name].append(mat_list)
+
+with open(data_file_path, "w") as outfile:
+ json.dump(res_dict, outfile)
+ print(f'Results saved to "{data_file_path}".')
+
\ No newline at end of file
diff --git a/extern/sugar/blender/export_reference_points.py b/extern/sugar/blender/export_reference_points.py
new file mode 100644
index 0000000..29f9e11
--- /dev/null
+++ b/extern/sugar/blender/export_reference_points.py
@@ -0,0 +1,58 @@
+import bpy
+import os
+import json
+
+# Get the path to the current Blender file
+blend_file_path = bpy.data.filepath
+
+# Get the directory of the Blender file
+blend_dir = os.path.dirname(blend_file_path)
+
+path = './'
+
+# Construct the absolute path to the output "scene_reference_points.json" file
+data_file_path = os.path.join(blend_dir, path)
+data_file_path = os.path.join(data_file_path, "scene_reference_points.json") # TODO: Depends on the scene
+print('Points file path:', data_file_path)
+
+# Get data
+obj = bpy.context.active_object
+m = bpy.context.object.evaluated_get(bpy.context.evaluated_depsgraph_get()).to_mesh()
+vertices = m.vertices
+print("Number of vertices:", len(m.vertices))
+
+# Initialize dict
+res_dict = {}
+res_dict['matrix_world'] = [[obj.matrix_world[i][j] for j in range(4)] for i in range(4)]
+res_dict['reference_points'] = []
+res_dict['groups'] = []
+res_dict['weights'] = []
+
+# Get names of all vertex groups
+vertex_group_names = {}
+print("Vertex groups:")
+for i in range(len(obj.vertex_groups)):
+ group = obj.vertex_groups[i]
+ print(group.index, ":", group.name)
+ vertex_group_names[str(group.index)] = group.name
+
+for i in range(len(vertices)):
+
+ # Add coordinates
+ v = obj.matrix_world @ vertices[i].co
+ res_dict['reference_points'].append([v[0], v[1], v[2]])
+
+ # Add the names of the corresponding vertex groups
+ group_list = []
+ weight_list = []
+ for group in vertices[i].groups:
+ group_list.append(vertex_group_names[str(group.group)])
+ weight_list.append(group.weight)
+ res_dict['groups'].append(group_list)
+ res_dict['weights'].append(weight_list)
+
+with open(data_file_path, "w") as outfile:
+ json.dump(res_dict, outfile)
+ print(f'Results saved to "{data_file_path}".')
+
\ No newline at end of file
diff --git a/extern/sugar/blender/export_tpose_bones.py b/extern/sugar/blender/export_tpose_bones.py
new file mode 100644
index 0000000..bb667a5
--- /dev/null
+++ b/extern/sugar/blender/export_tpose_bones.py
@@ -0,0 +1,37 @@
+import bpy
+import os
+import json
+
+# Get the path to the current Blender file
+blend_file_path = bpy.data.filepath
+
+# Get the directory of the Blender file
+blend_dir = os.path.dirname(blend_file_path)
+
+path = './'
+
+# Construct the absolute path to the output "scene_tpose.json" file
+data_file_path = os.path.join(blend_dir, path)
+data_file_path = os.path.join(data_file_path, "scene_tpose.json") # TODO: Depends on the scene
+print('Pose file path:', data_file_path)
+
+res_dict = {}
+
+obj = bpy.context.active_object
+armature = obj.data
+
+# Initialize dict
+res_dict['matrix_world'] = [[obj.matrix_world[i][j] for j in range(4)] for i in range(4)]
+res_dict['tpose_bones'] = {}
+
+# Save rest poses for all bones
+for bone in armature.bones:
+ mat = obj.matrix_world @ bone.matrix_local
+ mat_list = [[mat[i][j] for j in range(4)] for i in range(4)]
+ res_dict['tpose_bones'][bone.name] = mat_list
+
+with open(data_file_path, "w") as outfile:
+ json.dump(res_dict, outfile)
+ print(f'Results saved to "{data_file_path}".')
+
\ No newline at end of file
diff --git a/extern/sugar/blender/export_tpose_points.py b/extern/sugar/blender/export_tpose_points.py
new file mode 100644
index 0000000..0020290
--- /dev/null
+++ b/extern/sugar/blender/export_tpose_points.py
@@ -0,0 +1,58 @@
+import bpy
+import os
+import json
+
+# Get the path to the current Blender file
+blend_file_path = bpy.data.filepath
+
+# Get the directory of the Blender file
+blend_dir = os.path.dirname(blend_file_path)
+
+path = './'
+
+# Construct the absolute path to the output "scene_tpose_points.json" file
+data_file_path = os.path.join(blend_dir, path)
+data_file_path = os.path.join(data_file_path, "scene_tpose_points.json") # TODO: Depends on the scene
+print('Points file path:', data_file_path)
+
+# Get data
+obj = bpy.context.active_object
+m = bpy.context.object.evaluated_get(bpy.context.evaluated_depsgraph_get()).to_mesh()
+vertices = m.vertices
+print("Number of vertices:", len(m.vertices))
+
+# Initialize dict
+res_dict = {}
+res_dict['matrix_world'] = [[obj.matrix_world[i][j] for j in range(4)] for i in range(4)]
+res_dict['tpose_points'] = []
+res_dict['groups'] = []
+res_dict['weights'] = []
+
+# Get names of all vertex groups
+vertex_group_names = {}
+print("Vertex groups:")
+for i in range(len(obj.vertex_groups)):
+ group = obj.vertex_groups[i]
+ print(group.index, ":", group.name)
+ vertex_group_names[str(group.index)] = group.name
+
+for i in range(len(vertices)):
+
+ # Add coordinates
+ v = obj.matrix_world @ vertices[i].co
+ res_dict['tpose_points'].append([v[0], v[1], v[2]])
+
+ # Add the names of the corresponding vertex groups
+ group_list = []
+ weight_list = []
+ for group in vertices[i].groups:
+ group_list.append(vertex_group_names[str(group.group)])
+ weight_list.append(group.weight)
+ res_dict['groups'].append(group_list)
+ res_dict['weights'].append(weight_list)
+
+with open(data_file_path, "w") as outfile:
+ json.dump(res_dict, outfile)
+ print(f'Results saved to "{data_file_path}".')
+
\ No newline at end of file
diff --git a/extern/sugar/configs/metrics/scenes.json b/extern/sugar/configs/metrics/scenes.json
new file mode 100644
index 0000000..0332fc6
--- /dev/null
+++ b/extern/sugar/configs/metrics/scenes.json
@@ -0,0 +1,16 @@
+
+{
+ "../nerfs/data/nerfstudio/garden": "../nerfs/og_gaussian_splatting/output/b5287e20-5/",
+ "../nerfs/data/nerfstudio/kitchen": "../nerfs/og_gaussian_splatting/output/dbde7200-f/",
+ "../nerfs/data/nerfstudio/room": "../nerfs/og_gaussian_splatting/output/a4f1fad4-6/",
+ "../nerfs/data/nerfstudio/bicycle": "../nerfs/og_gaussian_splatting/output/7d853ef3-c/",
+ "../nerfs/data/nerfstudio/counter": "../nerfs/og_gaussian_splatting/output/56003f04-1/",
+ "../nerfs/data/nerfstudio/bonsai": "../nerfs/og_gaussian_splatting/output/24f88369-5/",
+ "../nerfs/data/nerfstudio/stump": "../nerfs/og_gaussian_splatting/output/c4e344f5-a/",
+
+ "../nerfs/data/nerfstudio/playroom": "../nerfs/og_gaussian_splatting/output/ebefd1c2-7/",
+ "../nerfs/data/nerfstudio/drjohnson": "../nerfs/og_gaussian_splatting/output/da1efd51-d/",
+
+ "../nerfs/data/nerfstudio/train": "../nerfs/og_gaussian_splatting/output/0b8edce2-7/",
+ "../nerfs/data/nerfstudio/truck": "../nerfs/og_gaussian_splatting/output/cc98c5bd-6/"
+}
\ No newline at end of file
diff --git a/extern/sugar/environment.yml b/extern/sugar/environment.yml
new file mode 100644
index 0000000..2becc95
--- /dev/null
+++ b/extern/sugar/environment.yml
@@ -0,0 +1,246 @@
+name: sugar
+channels:
+ - pytorch3d
+ - plotly
+ - iopath
+ - pytorch
+ - nvidia
+ - conda-forge
+ - defaults
+dependencies:
+ - _libgcc_mutex=0.1=main
+ - _openmp_mutex=5.1=1_gnu
+ - anyio=4.1.0=pyhd8ed1ab_0
+ - argon2-cffi=21.1.0=py39h3811e60_2
+ - arrow=1.3.0=pyhd8ed1ab_0
+ - asttokens=2.4.1=pyhd8ed1ab_0
+ - async-lru=2.0.4=pyhd8ed1ab_0
+ - attrs=23.1.0=pyh71513ae_1
+ - babel=2.14.0=pyhd8ed1ab_0
+ - beautifulsoup4=4.12.2=pyha770c72_0
+ - blas=1.0=mkl
+ - bleach=6.1.0=pyhd8ed1ab_0
+ - brotli-python=1.0.9=py39h6a678d5_7
+ - bzip2=1.0.8=h7b6447c_0
+ - ca-certificates=2023.11.17=hbcca054_0
+ - cached-property=1.5.2=hd8ed1ab_1
+ - cached_property=1.5.2=pyha770c72_1
+ - certifi=2023.11.17=pyhd8ed1ab_0
+ - cffi=1.16.0=py39h5eee18b_0
+ - charset-normalizer=2.0.4=pyhd3eb1b0_0
+ - colorama=0.4.6=pyhd8ed1ab_0
+ - cryptography=41.0.7=py39hdda0065_0
+ - cuda-cudart=11.8.89=0
+ - cuda-cupti=11.8.87=0
+ - cuda-libraries=11.8.0=0
+ - cuda-nvrtc=11.8.89=0
+ - cuda-nvtx=11.8.86=0
+ - cuda-runtime=11.8.0=0
+ - decorator=5.1.1=pyhd8ed1ab_0
+ - defusedxml=0.7.1=pyhd8ed1ab_0
+ - entrypoints=0.4=pyhd8ed1ab_0
+ - exceptiongroup=1.2.0=pyhd8ed1ab_0
+ - executing=2.0.1=pyhd8ed1ab_0
+ - ffmpeg=4.3=hf484d3e_0
+ - filelock=3.13.1=py39h06a4308_0
+ - fqdn=1.5.1=pyhd8ed1ab_0
+ - freetype=2.12.1=h4a9f257_0
+ - fvcore=0.1.5.post20221221=pyhd8ed1ab_0
+ - giflib=5.2.1=h5eee18b_3
+ - gmp=6.2.1=h295c915_3
+ - gmpy2=2.1.2=py39heeb90bb_0
+ - gnutls=3.6.15=he1e5248_0
+ - idna=3.4=py39h06a4308_0
+ - importlib-metadata=7.0.0=pyha770c72_0
+ - importlib_metadata=7.0.0=hd8ed1ab_0
+ - importlib_resources=6.1.1=pyhd8ed1ab_0
+ - intel-openmp=2023.1.0=hdb19cb5_46306
+ - iopath=0.1.9=py39
+ - ipykernel=5.5.5=py39hef51801_0
+ - ipython=8.18.1=pyh707e725_3
+ - ipython_genutils=0.2.0=py_1
+ - ipywidgets=8.1.1=pyhd8ed1ab_0
+ - isoduration=20.11.0=pyhd8ed1ab_0
+ - jedi=0.19.1=pyhd8ed1ab_0
+ - jinja2=3.1.2=py39h06a4308_0
+ - jpeg=9e=h5eee18b_1
+ - json5=0.9.14=pyhd8ed1ab_0
+ - jsonpointer=2.4=py39hf3d152e_3
+ - jsonschema=4.20.0=pyhd8ed1ab_0
+ - jsonschema-specifications=2023.11.2=pyhd8ed1ab_0
+ - jsonschema-with-format-nongpl=4.20.0=pyhd8ed1ab_0
+ - jupyter-lsp=2.2.1=pyhd8ed1ab_0
+ - jupyter_client=8.6.0=pyhd8ed1ab_0
+ - jupyter_core=5.5.0=py39hf3d152e_0
+ - jupyter_events=0.9.0=pyhd8ed1ab_0
+ - jupyter_server=2.12.1=pyhd8ed1ab_0
+ - jupyter_server_terminals=0.5.0=pyhd8ed1ab_0
+ - jupyterlab=4.0.9=pyhd8ed1ab_0
+ - jupyterlab_pygments=0.3.0=pyhd8ed1ab_0
+ - jupyterlab_server=2.25.2=pyhd8ed1ab_0
+ - jupyterlab_widgets=3.0.9=pyhd8ed1ab_0
+ - lame=3.100=h7b6447c_0
+ - lcms2=2.12=h3be6417_0
+ - ld_impl_linux-64=2.38=h1181459_1
+ - lerc=3.0=h295c915_0
+ - libcublas=11.11.3.6=0
+ - libcufft=10.9.0.58=0
+ - libcufile=1.8.1.2=0
+ - libcurand=10.3.4.101=0
+ - libcusolver=11.4.1.48=0
+ - libcusparse=11.7.5.86=0
+ - libdeflate=1.17=h5eee18b_1
+ - libffi=3.4.4=h6a678d5_0
+ - libgcc=7.2.0=h69d50b8_2
+ - libgcc-ng=11.2.0=h1234567_1
+ - libgomp=11.2.0=h1234567_1
+ - libiconv=1.16=h7f8727e_2
+ - libidn2=2.3.4=h5eee18b_0
+ - libnpp=11.8.0.86=0
+ - libnvjpeg=11.9.0.86=0
+ - libpng=1.6.39=h5eee18b_0
+ - libsodium=1.0.18=h36c2ea0_1
+ - libstdcxx-ng=11.2.0=h1234567_1
+ - libtasn1=4.19.0=h5eee18b_0
+ - libtiff=4.5.1=h6a678d5_0
+ - libunistring=0.9.10=h27cfd23_0
+ - libwebp=1.3.2=h11a3e52_0
+ - libwebp-base=1.3.2=h5eee18b_0
+ - lz4-c=1.9.4=h6a678d5_0
+ - markdown-it-py=3.0.0=pyhd8ed1ab_0
+ - markupsafe=2.1.1=py39h7f8727e_0
+ - matplotlib-inline=0.1.6=pyhd8ed1ab_0
+ - mdurl=0.1.0=pyhd8ed1ab_0
+ - mistune=3.0.2=pyhd8ed1ab_0
+ - mkl=2023.1.0=h213fc3f_46344
+ - mkl-service=2.4.0=py39h5eee18b_1
+ - mkl_fft=1.3.8=py39h5eee18b_0
+ - mkl_random=1.2.4=py39hdb19cb5_0
+ - mpc=1.1.0=h10f8cd9_1
+ - mpfr=4.0.2=hb69a4c5_1
+ - mpmath=1.3.0=py39h06a4308_0
+ - nbclient=0.8.0=pyhd8ed1ab_0
+ - nbconvert-core=7.12.0=pyhd8ed1ab_0
+ - ncurses=6.4=h6a678d5_0
+ - nettle=3.7.3=hbbd107a_1
+ - networkx=3.1=py39h06a4308_0
+ - nodejs=6.13.1=0
+ - notebook-shim=0.2.3=pyhd8ed1ab_0
+ - numpy=1.26.2=py39h5f9d8c6_0
+ - numpy-base=1.26.2=py39hb5e798b_0
+ - openh264=2.1.1=h4ff587b_0
+ - openjpeg=2.4.0=h3ad879b_0
+ - openssl=3.0.12=h7f8727e_0
+ - overrides=7.4.0=pyhd8ed1ab_0
+ - packaging=23.2=pyhd8ed1ab_0
+ - pandocfilters=1.5.0=pyhd8ed1ab_0
+ - parso=0.8.3=pyhd8ed1ab_0
+ - pickleshare=0.7.5=py_1003
+ - pillow=10.0.1=py39ha6cbd5a_0
+ - pip=23.3.1=py39h06a4308_0
+ - pkgutil-resolve-name=1.3.10=pyhd8ed1ab_1
+ - platformdirs=4.1.0=pyhd8ed1ab_0
+ - plotly=5.18.0=py_0
+ - plyfile=0.8.1=pyhd8ed1ab_0
+ - portalocker=2.8.2=py39hf3d152e_1
+ - prometheus_client=0.19.0=pyhd8ed1ab_0
+ - ptyprocess=0.7.0=pyhd3deb0d_0
+ - pure_eval=0.2.2=pyhd8ed1ab_0
+ - pycparser=2.21=pyhd3eb1b0_0
+ - pygments=2.17.2=pyhd8ed1ab_0
+ - pyopenssl=23.2.0=py39h06a4308_0
+ - pysocks=1.7.1=py39h06a4308_0
+ - python=3.9.18=h955ad1f_0
+ - python-dateutil=2.8.2=pyhd8ed1ab_0
+ - python-fastjsonschema=2.19.0=pyhd8ed1ab_0
+ - python-json-logger=2.0.7=pyhd8ed1ab_0
+ - python_abi=3.9=2_cp39
+ - pytorch=2.0.1=py3.9_cuda11.8_cudnn8.7.0_0
+ - pytorch-cuda=11.8=h7e8668a_5
+ - pytorch-mutex=1.0=cuda
+ - pytorch3d=0.7.4=py39_cu118_pyt201
+ - pytz=2023.3.post1=pyhd8ed1ab_0
+ - pyyaml=6.0=py39hb9d737c_4
+ - pyzmq=25.1.0=py39h6a678d5_0
+ - readline=8.2=h5eee18b_0
+ - referencing=0.32.0=pyhd8ed1ab_0
+ - requests=2.31.0=py39h06a4308_0
+ - rfc3339-validator=0.1.4=pyhd8ed1ab_0
+ - rfc3986-validator=0.1.1=pyh9f0ad1d_0
+ - rich=13.7.0=pyhd8ed1ab_0
+ - send2trash=1.8.2=pyh41d4057_0
+ - setuptools=68.2.2=py39h06a4308_0
+ - six=1.16.0=pyh6c4a22f_0
+ - sniffio=1.3.0=pyhd8ed1ab_0
+ - soupsieve=2.5=pyhd8ed1ab_1
+ - sqlite=3.41.2=h5eee18b_0
+ - stack_data=0.6.2=pyhd8ed1ab_0
+ - sympy=1.12=py39h06a4308_0
+ - tabulate=0.9.0=pyhd8ed1ab_1
+ - tbb=2021.8.0=hdb19cb5_0
+ - tenacity=8.2.2=py39h06a4308_0
+ - termcolor=2.3.0=pyhd8ed1ab_0
+ - terminado=0.18.0=pyh0d859eb_0
+ - tinycss2=1.2.1=pyhd8ed1ab_0
+ - tk=8.6.12=h1ccaba5_0
+ - tomli=2.0.1=pyhd8ed1ab_0
+ - torchaudio=2.0.2=py39_cu118
+ - torchtriton=2.0.0=py39
+ - torchvision=0.15.2=py39_cu118
+ - tornado=6.3.3=py39h5eee18b_0
+ - tqdm=4.66.1=pyhd8ed1ab_0
+ - traitlets=5.14.0=pyhd8ed1ab_0
+ - types-python-dateutil=2.8.19.14=pyhd8ed1ab_0
+ - typing_extensions=4.7.1=py39h06a4308_0
+ - typing_utils=0.1.0=pyhd8ed1ab_0
+ - uri-template=1.3.0=pyhd8ed1ab_0
+ - urllib3=1.26.18=py39h06a4308_0
+ - wcwidth=0.2.12=pyhd8ed1ab_0
+ - webcolors=1.13=pyhd8ed1ab_0
+ - webencodings=0.5.1=pyhd8ed1ab_2
+ - websocket-client=1.7.0=pyhd8ed1ab_0
+ - wheel=0.41.2=py39h06a4308_0
+ - widgetsnbextension=4.0.9=pyhd8ed1ab_0
+ - xz=5.4.5=h5eee18b_0
+ - yacs=0.1.8=pyhd8ed1ab_0
+ - yaml=0.2.5=h7f98852_2
+ - zeromq=4.3.4=h2531618_0
+ - zipp=3.17.0=pyhd8ed1ab_0
+ - zlib=1.2.13=h5eee18b_0
+ - zstd=1.5.5=hc292b87_0
+ - pip:
+ - addict==2.4.0
+ - ansi2html==1.9.1
+ - blinker==1.7.0
+ - click==8.1.7
+ - comm==0.2.0
+ - configargparse==1.7
+ - contourpy==1.2.0
+ - cycler==0.12.1
+ - dash==2.14.2
+ - dash-core-components==2.0.0
+ - dash-html-components==2.0.0
+ - dash-table==5.0.0
+ - flask==3.0.0
+ - fonttools==4.46.0
+ - itsdangerous==2.1.2
+ - joblib==1.3.2
+ - kiwisolver==1.4.5
+ - matplotlib==3.8.2
+ - nbformat==5.7.0
+ - nest-asyncio==1.5.8
+ - open3d==0.17.0
+ - pandas==2.1.4
+ - pexpect==4.9.0
+ - prompt-toolkit==3.0.43
+ - pymcubes==0.1.4
+ - pyparsing==3.1.1
+ - pyquaternion==0.9.9
+ - retrying==1.3.4
+ - rpds-py==0.15.2
+ - scikit-learn==1.3.2
+ - scipy==1.11.4
+ - stack-data==0.6.3
+ - threadpoolctl==3.2.0
+ - tzdata==2023.3
+ - werkzeug==3.0.1
diff --git a/extern/sugar/extract_mesh.py b/extern/sugar/extract_mesh.py
new file mode 100644
index 0000000..3734006
--- /dev/null
+++ b/extern/sugar/extract_mesh.py
@@ -0,0 +1,110 @@
+import os
+import json
+import argparse
+import numpy as np
+from sugar_utils.general_utils import str2bool
+from sugar_extractors.coarse_mesh import extract_mesh_from_coarse_sugar
+
+import sys
+sys.path.append(".")
+
+from threestudio.data.uncond import RandomCameraIterableDataset, RandomCameraDataModuleConfig
+from threestudio.utils.config import load_config, parse_structured
+from threestudio.utils.ops import convert_pose
+
+if __name__ == "__main__":
+ # Parser
+ parser = argparse.ArgumentParser(description='Script to extract a mesh from a coarse SuGaR scene.')
+ parser.add_argument('-s', '--scene_path',
+ type=str,
+ default="./load/scene",
+ help='path to the scene data to use.')
+ parser.add_argument('-c', '--checkpoint_path',
+ type=str,
+ help='path to the vanilla 3D Gaussian Splatting Checkpoint to load.')
+ parser.add_argument('-i', '--iteration_to_load',
+ type=int, default=7000,
+ help='iteration to load.')
+
+    parser.add_argument('-m', '--coarse_model_path', type=str, default=None,
+                        help='path to the coarse SuGaR model checkpoint.')
+
+ parser.add_argument('-l', '--surface_level', type=float, default=0.3,
+ help='Surface level to extract the mesh at. If None, will extract levels 0.1, 0.3 and 0.5')
+ parser.add_argument('-d', '--decimation_target', type=int, default=200_000,
+ help='Target number of vertices to decimate the mesh to. If None, will decimate to 200_000 and 1_000_000.')
+
+ parser.add_argument('-o', '--mesh_output_dir',
+ type=str, default=None,
+ help='path to the output directory.')
+
+ parser.add_argument('-b', '--bboxmin', type=str, default=None, help='Min coordinates to use for foreground.')
+ parser.add_argument('-B', '--bboxmax', type=str, default=None, help='Max coordinates to use for foreground.')
+ parser.add_argument('--center_bbox', type=str2bool, default=False, help='If True, center the bounding box. Default is False.')
+
+ parser.add_argument('--gpu', type=int, default=0, help='Index of GPU device to use.')
+
+ parser.add_argument('--eval', type=str2bool, default=False, help='Use eval split.')
+ parser.add_argument('--use_centers_to_extract_mesh', type=str2bool, default=False,
+ help='If True, just use centers of the gaussians to extract mesh.')
+ parser.add_argument('--use_marching_cubes', type=str2bool, default=False,
+ help='If True, use marching cubes to extract mesh.')
+ parser.add_argument('--use_vanilla_3dgs', action="store_true", default=False,
+ help='If True, use vanilla 3DGS to extract mesh.')
+ parser.add_argument("--camera_pose_path", type=str, default=None,
+ help='The path of .json file that stores camera poses.')
+
+    parser.add_argument("--poisson_depth", type=int, default=6,
+                        help="The depth used for Poisson reconstruction.")
+
+ args = parser.parse_args()
+
+ if args.camera_pose_path is None:
+ gs_path = args.checkpoint_path
+ cfg_path = os.path.join(
+ os.path.dirname(os.path.dirname(gs_path)), "configs", "parsed.yaml"
+ )
+ cfg = load_config(cfg_path)
+
+ random_camera_cfg = parse_structured(
+ RandomCameraDataModuleConfig, cfg.get('data', {})
+ )
+ camera_distance = random_camera_cfg.eval_camera_distance
+ random_camera_cfg.update(
+ {
+ "camera_distance_range": [1.3, 2.0],
+ "elevation_range": [-45, 80],
+ "azimuth_range": [-180, 180]
+ }
+ )
+ random_camera_dataset = RandomCameraIterableDataset(random_camera_cfg)
+
+ cfg_list = []
+ for i in range(1000):
+ random_camera = random_camera_dataset.collate(None)
+ # convert threestudio pose to 3d gaussian
+ c2w = convert_pose(random_camera['c2w'][0].cpu()).numpy()
+ camera_cfg = {}
+ camera_cfg['id'] = i
+ camera_cfg['img_name'] = f"{i}"
+ camera_cfg['width'] = 512
+ camera_cfg['height'] = 512
+ camera_cfg['position'] = c2w[:3, 3].tolist()
+ rot = c2w[:3, :3]
+ camera_cfg['rotation'] = [x.tolist() for x in rot]
+
+ fov = random_camera_cfg.eval_fovy_deg / 180 * np.pi
+ focal_length = 0.5 * 512 / np.tan(0.5 * fov)
+ camera_cfg['fx'] = focal_length
+ camera_cfg['fy'] = focal_length
+ cfg_list.append(camera_cfg)
+
+ camera_save_path = os.path.join(
+ os.path.dirname(gs_path), "cameras.json"
+ )
+ with open(camera_save_path, 'w') as f:
+ json.dump(cfg_list, f, indent=4)
+
+
+ # Call function
+ extract_mesh_from_coarse_sugar(args)
+
\ No newline at end of file
diff --git a/extern/sugar/extract_refined_mesh_with_texture.py b/extern/sugar/extract_refined_mesh_with_texture.py
new file mode 100644
index 0000000..4571df8
--- /dev/null
+++ b/extern/sugar/extract_refined_mesh_with_texture.py
@@ -0,0 +1,46 @@
+import argparse
+from sugar_utils.general_utils import str2bool
+from sugar_extractors.refined_mesh import extract_mesh_and_texture_from_refined_sugar
+
+if __name__ == "__main__":
+ # Parser
+    parser = argparse.ArgumentParser(description='Script to extract a mesh and texture from a refined SuGaR model.')
+ parser.add_argument('-s', '--scene_path',
+ type=str,
+ help='(Required) path to the scene data to use.') # --OK
+ parser.add_argument('-i', '--iteration_to_load',
+ type=int, default=7000,
+ help='iteration to load.') # --OK
+ parser.add_argument('-c', '--checkpoint_path',
+ type=str,
+ help='(Required) path to the vanilla 3D Gaussian Splatting Checkpoint to load.') # --OK
+ parser.add_argument('-m', '--refined_model_path',
+ type=str,
+ help='(Required) Path to the refine model checkpoint.') # --OK
+ parser.add_argument('-o', '--mesh_output_dir',
+ type=str,
+ default=None,
+ help='path to the output directory.') # --OK
+ parser.add_argument('-n', '--n_gaussians_per_surface_triangle',
+ default=None, type=int, help='Number of gaussians per surface triangle.') # --OK
+ parser.add_argument('--square_size',
+ default=None, type=int, help='Size of the square to use for the texture.') # --OK
+
+ parser.add_argument('--eval', type=str2bool, default=True, help='Use eval split.')
+ parser.add_argument('-g', '--gpu', type=int, default=0, help='Index of GPU to use.')
+
+ # Optional postprocessing
+ parser.add_argument('--postprocess_mesh', type=str2bool, default=False,
+ help='If True, postprocess the mesh by removing border triangles with low-density. '
+ 'This step takes a few minutes and is not needed in general, as it can also be risky. '
+ 'However, it increases the quality of the mesh in some cases, especially when an object is visible only from one side.') # --OK
+ parser.add_argument('--postprocess_density_threshold', type=float, default=0.1,
+ help='Threshold to use for postprocessing the mesh.') # --OK
+ parser.add_argument('--postprocess_iterations', type=int, default=5,
+ help='Number of iterations to use for postprocessing the mesh.') # --OK
+
+ args = parser.parse_args()
+
+ # Call function
+ extract_mesh_and_texture_from_refined_sugar(args)
+
\ No newline at end of file
diff --git a/extern/sugar/gaussian_splatting/LICENSE.md b/extern/sugar/gaussian_splatting/LICENSE.md
new file mode 100644
index 0000000..c869e69
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/LICENSE.md
@@ -0,0 +1,83 @@
+Gaussian-Splatting License
+===========================
+
+**Inria** and **the Max Planck Institut for Informatik (MPII)** hold all the ownership rights on the *Software* named **gaussian-splatting**.
+The *Software* is in the process of being registered with the Agence pour la Protection des
+Programmes (APP).
+
+The *Software* is still being developed by the *Licensor*.
+
+*Licensor*'s goal is to allow the research community to use, test and evaluate
+the *Software*.
+
+## 1. Definitions
+
+*Licensee* means any person or entity that uses the *Software* and distributes
+its *Work*.
+
+*Licensor* means the owners of the *Software*, i.e Inria and MPII
+
+*Software* means the original work of authorship made available under this
+License ie gaussian-splatting.
+
+*Work* means the *Software* and any additions to or derivative works of the
+*Software* that are made available under this License.
+
+
+## 2. Purpose
+This license is intended to define the rights granted to the *Licensee* by
+Licensors under the *Software*.
+
+## 3. Rights granted
+
+For the above reasons Licensors have decided to distribute the *Software*.
+Licensors grant non-exclusive rights to use the *Software* for research purposes
+to research users (both academic and industrial), free of charge, without right
+to sublicense.. The *Software* may be used "non-commercially", i.e., for research
+and/or evaluation purposes only.
+
+Subject to the terms and conditions of this License, you are granted a
+non-exclusive, royalty-free, license to reproduce, prepare derivative works of,
+publicly display, publicly perform and distribute its *Work* and any resulting
+derivative works in any form.
+
+## 4. Limitations
+
+**4.1 Redistribution.** You may reproduce or distribute the *Work* only if (a) you do
+so under this License, (b) you include a complete copy of this License with
+your distribution, and (c) you retain without modification any copyright,
+patent, trademark, or attribution notices that are present in the *Work*.
+
+**4.2 Derivative Works.** You may specify that additional or different terms apply
+to the use, reproduction, and distribution of your derivative works of the *Work*
+("Your Terms") only if (a) Your Terms provide that the use limitation in
+Section 2 applies to your derivative works, and (b) you identify the specific
+derivative works that are subject to Your Terms. Notwithstanding Your Terms,
+this License (including the redistribution requirements in Section 3.1) will
+continue to apply to the *Work* itself.
+
+**4.3** Any other use without of prior consent of Licensors is prohibited. Research
+users explicitly acknowledge having received from Licensors all information
+allowing to appreciate the adequacy between of the *Software* and their needs and
+to undertake all necessary precautions for its execution and use.
+
+**4.4** The *Software* is provided both as a compiled library file and as source
+code. In case of using the *Software* for a publication or other results obtained
+through the use of the *Software*, users are strongly encouraged to cite the
+corresponding publications as explained in the documentation of the *Software*.
+
+## 5. Disclaimer
+
+THE USER CANNOT USE, EXPLOIT OR DISTRIBUTE THE *SOFTWARE* FOR COMMERCIAL PURPOSES
+WITHOUT PRIOR AND EXPLICIT CONSENT OF LICENSORS. YOU MUST CONTACT INRIA FOR ANY
+UNAUTHORIZED USE: stip-sophia.transfert@inria.fr . ANY SUCH ACTION WILL
+CONSTITUTE A FORGERY. THIS *SOFTWARE* IS PROVIDED "AS IS" WITHOUT ANY WARRANTIES
+OF ANY NATURE AND ANY EXPRESS OR IMPLIED WARRANTIES, WITH REGARDS TO COMMERCIAL
+USE, PROFESSIONNAL USE, LEGAL OR NOT, OR OTHER, OR COMMERCIALISATION OR
+ADAPTATION. UNLESS EXPLICITLY PROVIDED BY LAW, IN NO EVENT, SHALL INRIA OR THE
+AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
+GOODS OR SERVICES, LOSS OF USE, DATA, OR PROFITS OR BUSINESS INTERRUPTION)
+HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING FROM, OUT OF OR
+IN CONNECTION WITH THE *SOFTWARE* OR THE USE OR OTHER DEALINGS IN THE *SOFTWARE*.
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/.gitignore b/extern/sugar/gaussian_splatting/SIBR_viewers/.gitignore
new file mode 100644
index 0000000..3ffaa95
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/.gitignore
@@ -0,0 +1,45 @@
+extlibs/
+build/
+install/
+src/projects/*
+cmake-gui.exe.stackdump
+__pycache__/
+
+# emacs garbage
+\#*
+.\#*
+
+# vim garbage
+*.swp
+*.swo
+*.idea/
+*.log
+*.sh
+*.tmp
+
+hs_err_*
+
+# re include common public projects
+!src/projects/ulr/
+!src/projects/dataset_tools/
+
+# more vim garbage
+# Swap
+[._]*.s[a-v][a-z]
+# comment out the following line if you don't need vector files
+!*.svg
+[._]*.sw[a-p]
+[._]s[a-rt-v][a-z]
+[._]ss[a-gi-z]
+[._]sw[a-p]
+
+# Session
+Session.vim
+Sessionx.vim
+
+# Temporary
+.netrwhist
+*~
+# Auto-generated tag files
+tags
+# Persistent undo
+[._]*.un~
\ No newline at end of file
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/CMakeLists.txt b/extern/sugar/gaussian_splatting/SIBR_viewers/CMakeLists.txt
new file mode 100644
index 0000000..21a3fc8
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/CMakeLists.txt
@@ -0,0 +1,213 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+CMAKE_MINIMUM_REQUIRED(VERSION 3.22)
+
+set (CMAKE_SYSTEM_VERSION 10.0.15063.0 CACHE INTERNAL "Cmake system version" FORCE)
+PROJECT(sibr_projects)
+
+set(REQUIRED_VERSION "3.22.0")
+set(CHECKED_VERSION "3.27.0")
+
+if (CMAKE_VERSION VERSION_LESS REQUIRED_VERSION)
+ message(WARNING "Deprecated version of CMake. Please update to at least ${REQUIRED_VERSION} (${CHECKED_VERSION} recommended).")
+elseif (CMAKE_VERSION VERSION_GREATER CHECKED_VERSION)
+ message(WARNING "Untested version of CMake. If you have checked that everything works properly, please update CHECKED_VERSION in the main CMakeLists.txt to the version you tested.")
+endif()
+
+## Include cmake stuff (functions/macros) : Modules files
+if(WIN32)
+list(APPEND CMAKE_MODULE_PATH ${CMAKE_CURRENT_SOURCE_DIR}/cmake/windows)
+list(APPEND CMAKE_MODULE_PATH ${CMAKE_CURRENT_SOURCE_DIR}/cmake/windows/Modules)
+else()
+list(APPEND CMAKE_MODULE_PATH ${CMAKE_CURRENT_SOURCE_DIR}/cmake/linux)
+list(APPEND CMAKE_MODULE_PATH ${CMAKE_CURRENT_SOURCE_DIR}/cmake/linux/Modules)
+endif()
+set_property(GLOBAL PROPERTY USE_FOLDERS ON)
+
+## To maintain cmake versions compatibilities
+include(cmake_policies)
+setPolicies()
+
+include(git_describe)
+git_describe(GIT_BRANCH SIBR_CORE_BRANCH GIT_COMMIT_HASH SIBR_CORE_COMMIT_HASH GIT_TAG SIBR_CORE_TAG GIT_VERSION SIBR_CORE_VERSION)
+
+message(STATUS "SIBR version :\n BRANCH ${SIBR_CORE_BRANCH}\n COMMIT_HASH ${SIBR_CORE_COMMIT_HASH}\n TAG ${SIBR_CORE_TAG}\n VERSION ${SIBR_CORE_VERSION}")
+
+if(NOT WIN32)
+set(CMAKE_CXX_STANDARD 17)
+set(CMAKE_CXX_STANDARD_REQUIRED ON)
+endif()
+
+
+if (WIN32)
+ ## Allow C++11 + other flags
+ include(CheckCXXCompilerFlag)
+ get_filename_component(currentBuildTool ${CMAKE_BUILD_TOOL} NAME_WE) # tool that can launch the native build system. returned value may be the full path
+ if(${currentBuildTool} MATCHES "(msdev|devenv|nmake|MSBuild)")
+
+		add_compile_options("$<$<COMPILE_LANGUAGE:CXX>:/W3;/DNOMINMAX;/MP;-D_USE_MATH_DEFINES>")
+ #add_definitions(/W3 /DNOMINMAX /MP -D_USE_MATH_DEFINES)# /D_ITERATOR_DEBUG_LEVEL=1 because you need all external DLl to compile with this flag too
+ set(CMAKE_CONFIGURATION_TYPES "RelWithDebInfo;Release;Debug" CACHE STRING "" FORCE)
+ set(CMAKE_CXX_STANDARD 14)
+ set(CMAKE_CXX_STANDARD_REQUIRED ON)
+ set(CMAKE_CXX_EXTENSIONS OFF)
+ elseif(${currentBuildTool} MATCHES "(make|gmake)")
+ add_definitions("-Wall -Wno-unknown-pragmas -Wno-sign-compare -g -std=c++14 -D__forceinline=\"inline\ __attribute__((always_inline))\"")
+ # CHECK_CXX_COMPILER_FLAG("-std=gnu++11" COMPILER_SUPPORTS_CXX11)
+ # CHECK_CXX_COMPILER_FLAG("-std=gnu++0x" COMPILER_SUPPORTS_CXX0X)
+ # if(COMPILER_SUPPORTS_CXX11)
+ # add_definitions(-std=gnu++11)
+ # elseif(COMPILER_SUPPORTS_CXX0X)
+ # add_definitions(-std=gnu++0x)
+ # else()
+ # message(SEND_ERROR "The compiler ${CMAKE_CXX_COMPILER} has no C++14 support. Please use a different C++ compiler.")
+ # endif()
+ elseif(APPLE) ## \todo TODO: do a better test and send error on unsupported c++14 compiler
+ add_definitions(-std=c++14 -stdlib=libc++)
+ endif()
+else()
+ ## Allow C++11 + other flags
+ include(CheckCXXCompilerFlag)
+ get_filename_component(currentBuildTool ${CMAKE_BUILD_TOOL} NAME_WE) # tool that can launch the native build system. returned value may be the full path
+ if(${currentBuildTool} MATCHES "(msdev|devenv|nmake|MSBuild)")
+
+		add_compile_options("$<$<COMPILE_LANGUAGE:CXX>:/W3;/DNOMINMAX;/MP;-D_USE_MATH_DEFINES>")
+ #add_definitions(/W3 /DNOMINMAX /MP -D_USE_MATH_DEFINES)# /D_ITERATOR_DEBUG_LEVEL=1 because you need all external DLl to compile with this flag too
+ set(CMAKE_CONFIGURATION_TYPES "RelWithDebInfo;Release;Debug" CACHE STRING "" FORCE)
+ set(CMAKE_CXX_STANDARD 14)
+ set(CMAKE_CXX_STANDARD_REQUIRED ON)
+ set(CMAKE_CXX_EXTENSIONS OFF)
+ elseif(${currentBuildTool} MATCHES "(make|gmake|ninja)")
+ add_definitions("-fpermissive -fPIC -Wall -Wno-unknown-pragmas -Wno-sign-compare -g -std=c++17 -D__forceinline=\"inline\ __attribute__((always_inline))\"")
+ elseif(APPLE) ## \todo TODO: do a better test and send error on unsupported c++14 compiler
+ add_definitions(-std=c++17 -stdlib=libc++)
+ endif()
+endif()
+
+set(INSTALL_STANDALONE ON)
+
+## Set default build output binaries (used also in sub CMakeLists.txt) :
+set(BIN_BUILT_DIR "bin")
+if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(ARCHI_BUILT_DIR "x64")
+ set(LIB_BUILT_DIR "lib64")
+else()
+ set(ARCHI_BUILT_DIR "x86")
+ set(LIB_BUILT_DIR "lib")
+endif()
+
+option(SEPARATE_CONFIGURATIONS "Clearer separation between configurations" OFF)
+SET(CMAKE_INSTALL_ROOT ${CMAKE_CURRENT_SOURCE_DIR}/install)
+SET(CMAKE_INSTALL_PREFIX ${CMAKE_INSTALL_ROOT})
+
+if(DEFINED CMAKE_BUILD_TYPE) ## for mono config type (make/nmake/ninja based)
+ if(${CMAKE_BUILD_TYPE} MATCHES "Debug")
+ set(CMAKE_DEBUG_POSTFIX "_d")
+ elseif(${CMAKE_BUILD_TYPE} MATCHES "RelWithDebInfo")
+ set(CMAKE_RELWITHDEBINFO_POSTFIX "_rwdi")
+ elseif(${CMAKE_BUILD_TYPE} MATCHES "MinSizeRel")
+ set(CMAKE_MINSIZEREL_POSTFIX "_msr")
+ elseif(${CMAKE_BUILD_TYPE} MATCHES "Release")
+ set(CMAKE_RELEASE_POSTFIX "")
+ endif()
+
+ if(SEPARATE_CONFIGURATIONS)
+ SET(CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE} ${CMAKE_INSTALL_ROOT}/${CMAKE_BUILD_TYPE})
+ else()
+ SET(CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE} ${CMAKE_INSTALL_ROOT})
+ endif()
+
+ MESSAGE(STATUS "Install path set to ${CMAKE_INSTALL_PREFIX}.")
+ SET(CMAKE_OUTPUT_LIB_${CMAKE_BUILD_TYPE} ${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/lib)
+ SET(CMAKE_OUTPUT_BIN_${CMAKE_BUILD_TYPE} ${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/bin)
+
+ set(CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE} ${CMAKE_OUTPUT_LIB_${CMAKE_BUILD_TYPE}})
+ set(CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE} ${CMAKE_OUTPUT_LIB_${CMAKE_BUILD_TYPE}})
+ set(CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE} ${CMAKE_OUTPUT_BIN_${CMAKE_BUILD_TYPE}})
+ set(CMAKE_PDB_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE} ${CMAKE_OUTPUT_BIN_${CMAKE_BUILD_TYPE}})
+endif()
+foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ if(${CONFIG_TYPES} MATCHES "Debug")
+ set(CMAKE_DEBUG_POSTFIX "_d")
+ elseif(${CONFIG_TYPES} MATCHES "RelWithDebInfo")
+ set(CMAKE_RELWITHDEBINFO_POSTFIX "_rwdi")
+ elseif(${CONFIG_TYPES} MATCHES "MinSizeRel")
+ set(CMAKE_MINSIZEREL_POSTFIX "_msr")
+	elseif(${CONFIG_TYPES} MATCHES "Release")
+ set(CMAKE_RELEASE_POSTFIX "")
+ endif()
+
+ if(SEPARATE_CONFIGURATIONS)
+ SET(CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC} ${CMAKE_INSTALL_ROOT}/${CONFIG_TYPES})
+ else()
+ SET(CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC} ${CMAKE_INSTALL_ROOT})
+ endif()
+
+ MESSAGE(STATUS "Install path for ${CONFIG_TYPES} set to ${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}.")
+ SET(CMAKE_OUTPUT_LIB_${CONFIG_TYPES_UC} ${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/lib)
+ SET(CMAKE_OUTPUT_BIN_${CONFIG_TYPES_UC} ${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/bin)
+
+ set(CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC} ${CMAKE_OUTPUT_LIB_${CONFIG_TYPES_UC}})
+ set(CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC} ${CMAKE_OUTPUT_LIB_${CONFIG_TYPES_UC}})
+ set(CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC} ${CMAKE_OUTPUT_BIN_${CONFIG_TYPES_UC}})
+ set(CMAKE_PDB_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC} ${CMAKE_OUTPUT_BIN_${CONFIG_TYPES_UC}})
+endforeach()
+
+
+# Settings for RPATH
+if (NOT WIN32)
+ # Default config of Fedora at INRIA has no LD_LIBRARY_PATH (for security reasons I guess)
+	# So at least add "./" to the RPATH so link paths are found
+ #set(CMAKE_SKIP_RPATH TRUE)
+ #SET(CMAKE_SKIP_BUILD_RPATH FALSE)
+ SET(CMAKE_BUILD_WITH_INSTALL_RPATH TRUE)
+
+ SET(CMAKE_INSTALL_RPATH "$ORIGIN")
+ #SET(CMAKE_INSTALL_RPATH "./")
+	#SET(CMAKE_INSTALL_RPATH "./:/usr/lib64/:/usr/lib/:/usr/local/lib64/:/usr/local/lib/") # This one caused a problem -> a "default" version of libGL (swrast) located in /usr/lib64 was selected instead of the nvidia one (in /usr/lib64/nvidia)
+
+ SET(CMAKE_INSTALL_RPATH_USE_LINK_PATH TRUE)
+endif()
+
+
+set(SIBR_PROGRAMARGS "" CACHE STRING "Default program arguments used in Visual Studio target properties")
+if ("${SIBR_PROGRAMARGS}" STREQUAL "")
+ if (DEFINED ENV{SIBR_PROGRAMARGS})
+ set(SIBR_PROGRAMARGS "$ENV{SIBR_PROGRAMARGS}" CACHE STRING "Default program arguments used in Visual Studio target properties" FORCE)
+ message( STATUS "Using program options found in environment variable 'SIBR_PROGRAMARGS' => '${SIBR_PROGRAMARGS}'")
+ else()
+ message(
+ "Note you can provide default program options for Visual Studio target properties by either setting"
+ " a value for the cmake cached variable 'SIBR_PROGRAMARGS' or by setting a new environment "
+ "variable 'SIBR_PROGRAMARGS'")
+ endif()
+endif()
+
+add_custom_target(PREBUILD ALL)
+
+## Include all projects
+set(SIBR_PROJECTS_SAMPLES_SUBPAGE_REF "")
+set(SIBR_PROJECTS_OURS_SUBPAGE_REF "")
+set(SIBR_PROJECTS_TOOLBOX_SUBPAGE_REF "")
+set(SIBR_PROJECTS_OTHERS_SUBPAGE_REF "")
+set(SIBR_PROJECTS_SAMPLES_REF_REF "")
+set(SIBR_PROJECTS_OURS_REF_REF "")
+set(SIBR_PROJECTS_TOOLBOX_REF_REF "")
+set(SIBR_PROJECTS_OTHERS_REF_REF "")
+set(DOXY_APP_SPECIFIC_IMG_PATH "")
+set(DOXY_DOC_EXCLUDE_PATTERNS_DIRS "")
+ADD_SUBDIRECTORY(src)
+
+
+## handle documentation
+if (WIN32)
+ADD_SUBDIRECTORY(docs)
+endif()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/LICENSE.md b/extern/sugar/gaussian_splatting/SIBR_viewers/LICENSE.md
new file mode 100644
index 0000000..f32fc4d
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/LICENSE.md
@@ -0,0 +1,83 @@
+SIBR License
+============
+
+**Inria** and **UCA** hold all the ownership rights on the *Software* named **sibr-core**.
+The *Software* has been registered with the Agence pour la Protection des
+Programmes (APP) under IDDN.FR.001.430020.000.S.P.2019.000.31235
+
+The *Software* is still being developed by the *Licensor*.
+
+*Licensor*'s goal is to allow the research community to use, test and evaluate
+the *Software*.
+
+## 1. Definitions
+
+*Licensee* means any person or entity that uses the *Software* and distributes
+its *Work*.
+
+*Licensor* means the owners of the *Software*, i.e. Inria and UCA.
+
+*Software* means the original work of authorship made available under this
+License, i.e. sibr-core.
+
+*Work* means the *Software* and any additions to or derivative works of the
+*Software* that are made available under this License.
+
+
+## 2. Purpose
+This license is intended to define the rights granted to the *Licensee* by
+Licensors under the *Software*.
+
+## 3. Rights granted
+
+For the above reasons Licensors have decided to distribute the *Software*.
+Licensors grant non-exclusive rights to use the *Software* for research purposes
+to research users (both academic and industrial), free of charge, without right
+to sublicense. The *Software* may be used "non-commercially", i.e., for research
+and/or evaluation purposes only.
+
+Subject to the terms and conditions of this License, you are granted a
+non-exclusive, royalty-free, license to reproduce, prepare derivative works of,
+publicly display, publicly perform and distribute its *Work* and any resulting
+derivative works in any form.
+
+## 4. Limitations
+
+**4.1 Redistribution.** You may reproduce or distribute the *Work* only if (a) you do
+so under this License, (b) you include a complete copy of this License with
+your distribution, and (c) you retain without modification any copyright,
+patent, trademark, or attribution notices that are present in the *Work*.
+
+**4.2 Derivative Works.** You may specify that additional or different terms apply
+to the use, reproduction, and distribution of your derivative works of the *Work*
+("Your Terms") only if (a) Your Terms provide that the use limitation in
+Section 3 applies to your derivative works, and (b) you identify the specific
+derivative works that are subject to Your Terms. Notwithstanding Your Terms,
+this License (including the redistribution requirements in Section 4.1) will
+continue to apply to the *Work* itself.
+
+**4.3** Any other use without the prior consent of the Licensors is prohibited. Research
+users explicitly acknowledge having received from the Licensors all information
+needed to assess the adequacy of the *Software* for their needs and
+to undertake all necessary precautions for its execution and use.
+
+**4.4** The *Software* is provided both as a compiled library file and as source
+code. If the *Software* is used for a publication or to obtain other results,
+users are strongly encouraged to cite the corresponding publications as
+explained in the documentation of the *Software*.
+
+## 5. Disclaimer
+
+THE USER CANNOT USE, EXPLOIT OR DISTRIBUTE THE *SOFTWARE* FOR COMMERCIAL PURPOSES
+WITHOUT PRIOR AND EXPLICIT CONSENT OF LICENSORS. YOU MUST CONTACT INRIA FOR ANY
+UNAUTHORIZED USE: stip-sophia.transfert@inria.fr. ANY SUCH ACTION WILL
+CONSTITUTE A FORGERY. THIS *SOFTWARE* IS PROVIDED "AS IS" WITHOUT ANY WARRANTIES
+OF ANY NATURE AND ANY EXPRESS OR IMPLIED WARRANTIES, WITH REGARD TO COMMERCIAL
+USE, PROFESSIONAL USE, LEGAL OR NOT, OR OTHER, OR COMMERCIALISATION OR
+ADAPTATION. UNLESS EXPLICITLY PROVIDED BY LAW, IN NO EVENT, SHALL INRIA OR THE
+AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR
+CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE
+GOODS OR SERVICES, LOSS OF USE, DATA, OR PROFITS OR BUSINESS INTERRUPTION)
+HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT
+LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING FROM, OUT OF OR
+IN CONNECTION WITH THE *SOFTWARE* OR THE USE OR OTHER DEALINGS IN THE *SOFTWARE*.
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/README.md b/extern/sugar/gaussian_splatting/SIBR_viewers/README.md
new file mode 100644
index 0000000..381d6b0
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/README.md
@@ -0,0 +1,142 @@
+# SIBR Core
+
+**SIBR** is a System for Image-Based Rendering.
+It is built around the *sibr-core* in this repo and several *Projects* implementing published research papers.
+For more complete documentation, see here: [SIBR Documentation](https://sibr.gitlabpages.inria.fr)
+
+This **SIBR core** repository provides:
+- a basic Image-Based Renderer
+- a per-pixel implementation of Unstructured Lumigraph (ULR)
+- several dataset tools & pipelines to process input images
+
+Details on how to run are in the documentation and in the section below.
+If you use this code in a publication, please cite the system as follows:
+
+```
+@misc{sibr2020,
+ author = "Bonopera, Sebastien and Esnault, Jerome and Prakash, Siddhant and Rodriguez, Simon and Thonat, Theo and Benadel, Mehdi and Chaurasia, Gaurav and Philip, Julien and Drettakis, George",
+ title = "sibr: A System for Image Based Rendering",
+ year = "2020",
+ url = "https://gitlab.inria.fr/sibr/sibr_core"
+}
+```
+
+## Setup
+
+**Note**: The current release is for *Windows 10* only. We are planning a Linux release soon.
+
+#### Binary distribution
+
+The easiest way to use SIBR is to download the binary distribution. All steps described below, including all preprocessing for your datasets, will work using this code.
+
+Download the distribution from the page: https://sibr.gitlabpages.inria.fr/download.html (Core, 57Mb); unzip the file and rename the directory "install".
+
+#### Install requirements
+
+- [**Visual Studio 2019**](https://visualstudio.microsoft.com/fr/downloads/)
+- [**Cmake 3.16+**](https://cmake.org/download)
+- [**7zip**](https://www.7-zip.org)
+- [**Python 3.8+**](https://www.python.org/downloads/) for shader installation scripts and dataset preprocessing scripts
+- [**Doxygen 1.8.17+**](https://www.doxygen.nl/download.html#srcbin) for documentation
+- [**CUDA 10.1+**](https://developer.nvidia.com/cuda-downloads) and [**CUDnn**](https://developer.nvidia.com/cudnn) if a project requires them
+
+Make sure Python, CUDA and Doxygen are in the PATH.
+
+If you have Chocolatey, you can grab most of these with this command:
+
+```sh
+choco install cmake 7zip python3 doxygen.install cuda
+
+## Visual Studio is available on Chocolatey,
+## though we advise installing it from the Visual Studio Installer and choosing your licensing accordingly
+choco install visualstudio2019community
+```
+
+#### Generation of the solution
+
+- Checkout this repository's master branch:
+
+ ```sh
+ ## through HTTPS
+ git clone https://gitlab.inria.fr/sibr/sibr_core.git -b master
+ ## through SSH
+ git clone git@gitlab.inria.fr:sibr/sibr_core.git -b master
+ ```
+- Run CMake-gui once, select the repo root as the source directory and `build/` as the build directory; run Configure and select the Visual Studio C++ Win64 compiler
+- Select the projects you want to generate among the BUILD elements in the list (you can group CMake flags by category to find them faster)
+- Generate
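
The CMake-gui steps above can also be driven from the command line; a minimal sketch, assuming the VS2019 64-bit generator (adjust the generator name and any `BUILD_*` project flags to your setup):

```sh
## Configure into build/ (generator name assumed; list available ones with `cmake --help`)
cmake -S . -B build -G "Visual Studio 16 2019" -A x64
## Then build and install from the CLI instead of the IDE
cmake --build build --config RelWithDebInfo --target INSTALL
```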
+
+#### Compilation
+
+- Open the generated Visual Studio solution (`build/sibr_projects.sln`)
+- Build the `ALL_BUILD` target, and then the `INSTALL` target
+- The compiled executables will be put in `install/bin`
+- TODO: are the DLLs properly installed?
+
+#### Compilation of the documentation
+
+- Open the generated Visual Studio solution (`build/sibr_projects.sln`)
+- Build the `DOCUMENTATION` target
+- Run `install/docs/index.html` in a browser
+
+
+## Scripts
+
+Some scripts will require you to install `PIL` and `convert` from `ImageMagick`.
+
+```sh
+## To install pillow
+python -m pip install pillow
+
+## If you have Chocolatey, you can install imagemagick from this command
+choco install imagemagick
+```
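
To quickly check that the `pillow` install works, a toy sketch (not one of the SIBR scripts) that performs the kind of in-memory resize the dataset preprocessing does:

```python
from PIL import Image

# Build a small in-memory RGB image and halve its resolution
img = Image.new("RGB", (64, 48), color=(255, 0, 0))
half = img.resize((32, 24))
print(half.size)  # prints (32, 24)
```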
+
+## Troubleshooting
+
+#### Bugs and Issues
+
+We will track bugs and issues through the Issues interface on GitLab. The Inria GitLab does not allow creation of external accounts, so if you have an issue/bug please email sibr@inria.fr and we will either create a guest account or create the issue on our side.
+
+#### Cmake complaining about the version
+
+If you are the first to use a very recent CMake version, you will have to update `CHECKED_VERSION` in the root `CMakeLists.txt`.
+
+#### Weird OpenCV error
+
+You probably selected the 32-bit compiler in CMake-gui.
+
+#### `Cmd.exe failed with error 009` or similar
+
+Make sure Python is installed and in the PATH.
+
+#### `BUILD_ALL` or `INSTALL` fail because of a project you don't really need
+
+Build and install each project separately by selecting the proper targets.
+
+#### Error in CUDA headers under Visual Studio 2019
+
+Make sure CUDA >= 10.1 (the first version to support VS2019) is installed.
+
+## To run an example
+
+For more details, please see the documentation: http://sibr.gitlabpages.inria.fr
+
+Download a dataset from: https://repo-sam.inria.fr/fungraph/sibr-datasets/
+
+e.g., the *sibr-museum-front* dataset in the *DATASETS_PATH* directory.
+
+```sh
+wget https://repo-sam.inria.fr/fungraph/sibr-datasets/museum_front27_ulr.zip
+```
+
+Once you have built the system or downloaded the binaries (see above), go to *install/bin* and you can run:
+```
+ sibr_ulrv2_app.exe --path DATASETS_PATH/sibr-museum-front
+```
+
+You will have an interactive viewer and you can navigate freely in the captured scene.
+Our default interactive viewer has a main view running the algorithm and a top view to visualize the position of the calibrated cameras. By default you are in WASD mode, and can toggle to trackball using the "y" key. Please see the page [Interface](https://sibr.gitlabpages.inria.fr/docs/nightly/howto_sibr_useful_objects.html) for more details on the interface.
+
+Please see the documentation on how to create a dataset from your own scene, and the various other IBR algorithms available.
+
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/MSVCsetUserCommand.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/MSVCsetUserCommand.cmake
new file mode 100644
index 0000000..bc49770
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/MSVCsetUserCommand.cmake
@@ -0,0 +1,149 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+if(__MSVCsetUserCommand_cmake_INCLUDED__)
+ return()
+else()
+ set(__MSVCsetUserCommand_cmake_INCLUDED__ ON)
+endif()
+
+## Allows configuring the Visual Studio debugger settings of a target
+## Note: using this command under Linux doesn't affect anything
+## On "run Local Windows Debugger", Visual Studio will try to launch the specified COMMAND with ARGS in the provided WORKING_DIR
+##
+## usage:
+## MSVCsetUserCommand(
+## [COMMAND | [ PATH [FILE ] ] ]
+## ARGS
+## WORKING_DIR
+## )
+##
+## Warning 1 : all arguments (<...>) must be passed under quotes
+## Warning 2 : the WORKING_DIR path argument has to end with a trailing slash '/'
+## Warning 3 : use COMMAND for an external app, OR PATH (optionally with FILE) to point at your built/installed/moved target
+##
+## Example 1:
+## include(MSVCsetUserCommand)
+## MSVCsetUserCommand( UnityRenderingPlugin
+## COMMAND "C:/Program Files (x86)/Unity/Editor/Unity.exe"
+## ARGS "-force-opengl -projectPath \"${CMAKE_HOME_DIRECTORY}/UnityPlugins/RenderingPluginExample/UnityProject\""
+## WORKING_DIR "${CMAKE_HOME_DIRECTORY}/UnityPlugins/RenderingPluginExample/UnityProject"
+## VERBOSE
+## )
+##
+## Example 2:
+## include(MSVCsetUserCommand)
+## MSVCsetUserCommand( ibrApp
+## PATH "C:/Program Files (x86)/workspace/IBR/install"
+## FILE "ibrApp${CMAKE_EXECUTABLE_SUFFIX}" ## this option line is optional since the target name doesn't change between the build and install steps
+## ARGS "-path \"${CMAKE_HOME_DIRECTORY}/dataset\""
+## WORKING_DIR "${CMAKE_HOME_DIRECTORY}"
+## VERBOSE
+## )
+##
+function(MSVCsetUserCommand targetName)
+ cmake_parse_arguments(MSVCsuc "VERBOSE" "PATH;FILE;COMMAND;ARGS;WORKING_DIR" "" ${ARGN} )
+
+	## If no arguments are given, do not create an unnecessary .vcxproj.user file
+ set(MSVCsuc_DEFAULT OFF)
+
+ if(MSVCsuc_PATH AND MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ endif()
+
+ if(MSVCsuc_FILE AND MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ endif()
+
+ if(NOT MSVCsuc_COMMAND)
+ if(MSVCsuc_PATH AND MSVCsuc_FILE)
+ set(MSVCsuc_COMMAND "${MSVCsuc_PATH}\\${MSVCsuc_FILE}")
+ elseif(MSVCsuc_PATH)
+ set(MSVCsuc_COMMAND "${MSVCsuc_PATH}\\$(TargetFileName)")
+ else()
+ set(MSVCsuc_COMMAND "$(TargetPath)") ## => $(TargetDir)\$(TargetName)$(TargetExt)
+ endif()
+ elseif(MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ endif()
+
+	# NOTE: the original code had an elseif branch written after the else statement;
+	# the branches have been reordered so the elseif comes first
+ if(MSVCsuc_WORKING_DIR)
+ file(TO_NATIVE_PATH ${MSVCsuc_WORKING_DIR} MSVCsuc_WORKING_DIR)
+ elseif(MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ else()
+ set(MSVCsuc_WORKING_DIR "$(ProjectDir)")
+ endif()
+
+ if(NOT MSVCsuc_ARGS)
+ set(MSVCsuc_ARGS "")
+ elseif(MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ endif()
+
+ if(MSVC10 OR (MSVC AND MSVC_VERSION GREATER 1600)) # 2010 or newer
+
+ if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(PLATEFORM_BITS x64)
+ else()
+ set(PLATEFORM_BITS Win32)
+ endif()
+
+ if(NOT MSVCsuc_DEFAULT AND PLATEFORM_BITS)
+
+		file(WRITE "${CMAKE_CURRENT_BINARY_DIR}/${targetName}.vcxproj.user"
+			"<?xml version=\"1.0\" encoding=\"utf-8\"?>
+<Project ToolsVersion=\"4.0\" xmlns=\"http://schemas.microsoft.com/developer/msbuild/2003\">
+  <PropertyGroup Condition=\"'$(Configuration)|$(Platform)'=='Debug|${PLATEFORM_BITS}'\">
+    <LocalDebuggerCommand>${MSVCsuc_COMMAND}</LocalDebuggerCommand>
+    <LocalDebuggerCommandArguments>${MSVCsuc_ARGS}</LocalDebuggerCommandArguments>
+    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
+    <LocalDebuggerWorkingDirectory>${MSVCsuc_WORKING_DIR}</LocalDebuggerWorkingDirectory>
+  </PropertyGroup>
+  <PropertyGroup Condition=\"'$(Configuration)|$(Platform)'=='Release|${PLATEFORM_BITS}'\">
+    <LocalDebuggerCommand>${MSVCsuc_COMMAND}</LocalDebuggerCommand>
+    <LocalDebuggerCommandArguments>${MSVCsuc_ARGS}</LocalDebuggerCommandArguments>
+    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
+    <LocalDebuggerWorkingDirectory>${MSVCsuc_WORKING_DIR}</LocalDebuggerWorkingDirectory>
+  </PropertyGroup>
+  <PropertyGroup Condition=\"'$(Configuration)|$(Platform)'=='RelWithDebInfo|${PLATEFORM_BITS}'\">
+    <LocalDebuggerCommand>${MSVCsuc_COMMAND}</LocalDebuggerCommand>
+    <LocalDebuggerCommandArguments>${MSVCsuc_ARGS}</LocalDebuggerCommandArguments>
+    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
+    <LocalDebuggerWorkingDirectory>${MSVCsuc_WORKING_DIR}</LocalDebuggerWorkingDirectory>
+  </PropertyGroup>
+  <PropertyGroup Condition=\"'$(Configuration)|$(Platform)'=='MinSizeRel|${PLATEFORM_BITS}'\">
+    <LocalDebuggerCommand>${MSVCsuc_COMMAND}</LocalDebuggerCommand>
+    <LocalDebuggerCommandArguments>${MSVCsuc_ARGS}</LocalDebuggerCommandArguments>
+    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
+    <LocalDebuggerWorkingDirectory>${MSVCsuc_WORKING_DIR}</LocalDebuggerWorkingDirectory>
+  </PropertyGroup>
+</Project>
+"
+	)
+ if(MSVCsuc_VERBOSE)
+ message(STATUS "[MSVCsetUserCommand] Write ${CMAKE_CURRENT_BINARY_DIR}/${targetName}.vcxproj.user file")
+ message(STATUS " to execute ${MSVCsuc_COMMAND} ${MSVCsuc_ARGS}")
+		message(STATUS "	from directory ${MSVCsuc_WORKING_DIR}")
+ message(STATUS " on visual studio run debugger button")
+ endif()
+
+ else()
+ message(WARNING "PLATEFORM_BITS is undefined...")
+ endif()
+
+ else()
+ if(MSVCsuc_VERBOSE)
+		message(WARNING "MSVCsetUserCommand is disabled because the MSVC version in use is too old (MSVC10 / VS2010 or newer is required)")
+ endif()
+ endif()
+
+endfunction()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindASSIMP.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindASSIMP.cmake
new file mode 100644
index 0000000..edfbb33
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindASSIMP.cmake
@@ -0,0 +1,114 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## Try to find the ASSIMP library
+## Once done this will define
+##
+## ASSIMP_FOUND - system has ASSIMP
+## ASSIMP_INCLUDE_DIR - The ASSIMP include directory
+## ASSIMP_LIBRARIES - The libraries needed to use ASSIMP
+## ASSIMP_CMD - the full path of ASSIMP executable
+## ASSIMP_DYNAMIC_LIB - the Assimp dynamic lib (available only on windows as .dll file for the moment)
+##
+## Edited for using a bugfixed version of Assimp
+
+if(NOT ASSIMP_DIR)
+ set(ASSIMP_DIR "$ENV{ASSIMP_DIR}" CACHE PATH "ASSIMP root directory")
+	message("NO ASSIMP DIR ${ASSIMP_DIR}")
+ file(TO_CMAKE_PATH "/data/graphdeco/share/usr/local" ASSIMP_DIR)
+ set(ASSIMP_DIR "/data/graphdeco/share/usr/local" )
+	message("SETTING ASSIMP DIR ${ASSIMP_DIR}")
+endif()
+if(ASSIMP_DIR)
+ file(TO_CMAKE_PATH ${ASSIMP_DIR} ASSIMP_DIR)
+ file(TO_CMAKE_PATH "/data/graphdeco/share/usr/local" ASSIMP_DIR)
+	message("ASSIMP DIR ${ASSIMP_DIR}")
+endif()
+
+
+## Set the lib postfix so we search the right directory for the compiler in use (32/64 bits)
+if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(ASSIMP_SEARCH_LIB "lib64")
+ set(ASSIMP_SEARCH_BIN "bin64")
+ set(ASSIMP_SEARCH_LIB_PATHSUFFIXE "x64")
+else()
+ set(ASSIMP_SEARCH_LIB "lib32")
+ set(ASSIMP_SEARCH_BIN "bin32")
+ set(ASSIMP_SEARCH_LIB_PATHSUFFIXE "x86")
+endif()
+
+set(PROGRAMFILESx86 "PROGRAMFILES(x86)")
+
+
+FIND_PATH(ASSIMP_INCLUDE_DIR
+ NAMES assimp/config.h
+ PATHS
+ ${ASSIMP_DIR}
+ ## linux
+ /usr
+ /usr/include
+ /usr/local
+ /opt/local
+ ## windows
+ "$ENV{PROGRAMFILES}/Assimp"
+ "$ENV{${PROGRAMFILESx86}}/Assimp"
+ "$ENV{ProgramW6432}/Assimp"
+ PATH_SUFFIXES include
+)
+
+
+FIND_LIBRARY(ASSIMP_LIBRARY
+ NAMES assimp-vc140-mt assimp
+ PATHS
+ ${ASSIMP_DIR}/${ASSIMP_SEARCH_LIB}
+ ${ASSIMP_DIR}/lib
+ ${ASSIMP_DIR}/lib64
+ ## linux
+ /usr/${ASSIMP_SEARCH_LIB}
+ /usr/local/${ASSIMP_SEARCH_LIB}
+ /opt/local/${ASSIMP_SEARCH_LIB}
+ /usr/lib
+ /usr/lib64
+ /usr/local/lib
+ /opt/local/lib
+ ## windows
+ "$ENV{PROGRAMFILES}/Assimp/${ASSIMP_SEARCH_LIB}"
+ "$ENV{${PROGRAMFILESx86}}/Assimp/${ASSIMP_SEARCH_LIB}"
+ "$ENV{ProgramW6432}/Assimp/${ASSIMP_SEARCH_LIB}"
+ "$ENV{PROGRAMFILES}/Assimp/lib"
+ "$ENV{${PROGRAMFILESx86}}/Assimp/lib"
+ "$ENV{ProgramW6432}/Assimp/lib"
+ PATH_SUFFIXES ${ASSIMP_SEARCH_LIB_PATHSUFFIXE}
+)
+set(ASSIMP_LIBRARIES ${ASSIMP_LIBRARY})
+
+
+if(ASSIMP_LIBRARY)
+ get_filename_component(ASSIMP_LIBRARY_DIR ${ASSIMP_LIBRARY} PATH)
+ if(WIN32)
+ file(GLOB ASSIMP_DYNAMIC_LIB "${ASSIMP_LIBRARY_DIR}/assimp*.dll")
+ if(NOT ASSIMP_DYNAMIC_LIB)
+ message("ASSIMP_DYNAMIC_LIB is missing... at ${ASSIMP_LIBRARY_DIR}")
+ endif()
+ endif()
+ set(ASSIMP_DYNAMIC_LIB ${ASSIMP_DYNAMIC_LIB} CACHE PATH "Windows dll location")
+endif()
+
+MARK_AS_ADVANCED(ASSIMP_DYNAMIC_LIB ASSIMP_INCLUDE_DIR ASSIMP_LIBRARIES)
+
+INCLUDE(FindPackageHandleStandardArgs)
+FIND_PACKAGE_HANDLE_STANDARD_ARGS(ASSIMP
+ REQUIRED_VARS ASSIMP_INCLUDE_DIR ASSIMP_LIBRARIES
+ FAIL_MESSAGE "ASSIMP wasn't found correctly. Set ASSIMP_DIR to the root SDK installation directory."
+)
+
+if(NOT ASSIMP_FOUND)
+ set(ASSIMP_DIR "" CACHE STRING "Path to ASSIMP install directory")
+endif()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindEGL.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindEGL.cmake
new file mode 100644
index 0000000..41d45cb
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindEGL.cmake
@@ -0,0 +1,161 @@
+#.rst:
+# FindEGL
+# -------
+#
+# Try to find EGL.
+#
+# This will define the following variables:
+#
+# ``EGL_FOUND``
+# True if (the requested version of) EGL is available
+# ``EGL_VERSION``
+# The version of EGL; note that this is the API version defined in the
+# headers, rather than the version of the implementation (eg: Mesa)
+# ``EGL_LIBRARIES``
+# This can be passed to target_link_libraries() instead of the ``EGL::EGL``
+# target
+# ``EGL_INCLUDE_DIRS``
+# This should be passed to target_include_directories() if the target is not
+# used for linking
+# ``EGL_DEFINITIONS``
+# This should be passed to target_compile_options() if the target is not
+# used for linking
+#
+# If ``EGL_FOUND`` is TRUE, it will also define the following imported target:
+#
+# ``EGL::EGL``
+# The EGL library
+#
+# In general we recommend using the imported target, as it is easier to use.
+# Bear in mind, however, that if the target is in the link interface of an
+# exported library, it must be made available by the package config file.
+#
+# Since pre-1.0.0.
+
+#=============================================================================
+# Copyright 2014 Alex Merry
+# Copyright 2014 Martin Gräßlin
+#
+# Redistribution and use in source and binary forms, with or without
+# modification, are permitted provided that the following conditions
+# are met:
+#
+# 1. Redistributions of source code must retain the copyright
+# notice, this list of conditions and the following disclaimer.
+# 2. Redistributions in binary form must reproduce the copyright
+# notice, this list of conditions and the following disclaimer in the
+# documentation and/or other materials provided with the distribution.
+# 3. The name of the author may not be used to endorse or promote products
+# derived from this software without specific prior written permission.
+#
+# THIS SOFTWARE IS PROVIDED BY THE AUTHOR ``AS IS'' AND ANY EXPRESS OR
+# IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES
+# OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED.
+# IN NO EVENT SHALL THE AUTHOR BE LIABLE FOR ANY DIRECT, INDIRECT,
+# INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT
+# NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
+# DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
+# THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
+# (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF
+# THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+#=============================================================================
+
+include(CheckCXXSourceCompiles)
+include(CMakePushCheckState)
+
+# Use pkg-config to get the directories and then use these values
+# in the FIND_PATH() and FIND_LIBRARY() calls
+find_package(PkgConfig)
+pkg_check_modules(PKG_EGL QUIET egl)
+
+set(EGL_DEFINITIONS ${PKG_EGL_CFLAGS_OTHER})
+
+find_path(EGL_INCLUDE_DIR
+ NAMES
+ EGL/egl.h
+ HINTS
+ ${PKG_EGL_INCLUDE_DIRS}
+)
+find_library(EGL_LIBRARY
+ NAMES
+ EGL
+ HINTS
+ ${PKG_EGL_LIBRARY_DIRS}
+)
+
+# NB: We do *not* use the version information from pkg-config, as that
+# is the implementation version (eg: the Mesa version)
+if(EGL_INCLUDE_DIR)
+ # egl.h has defines of the form EGL_VERSION_x_y for each supported
+ # version; so the header for EGL 1.1 will define EGL_VERSION_1_0 and
+ # EGL_VERSION_1_1. Finding the highest supported version involves
+ # finding all these defines and selecting the highest numbered.
+ file(READ "${EGL_INCLUDE_DIR}/EGL/egl.h" _EGL_header_contents)
+ string(REGEX MATCHALL
+ "[ \t]EGL_VERSION_[0-9_]+"
+ _EGL_version_lines
+ "${_EGL_header_contents}"
+ )
+ unset(_EGL_header_contents)
+ foreach(_EGL_version_line ${_EGL_version_lines})
+ string(REGEX REPLACE
+ "[ \t]EGL_VERSION_([0-9_]+)"
+ "\\1"
+ _version_candidate
+ "${_EGL_version_line}"
+ )
+ string(REPLACE "_" "." _version_candidate "${_version_candidate}")
+ if(NOT DEFINED EGL_VERSION OR EGL_VERSION VERSION_LESS _version_candidate)
+ set(EGL_VERSION "${_version_candidate}")
+ endif()
+ endforeach()
+ unset(_EGL_version_lines)
+endif()
+
+cmake_push_check_state(RESET)
+list(APPEND CMAKE_REQUIRED_LIBRARIES "${EGL_LIBRARY}")
+list(APPEND CMAKE_REQUIRED_INCLUDES "${EGL_INCLUDE_DIR}")
+
+check_cxx_source_compiles("
+#include <EGL/egl.h>
+
+int main(int argc, char *argv[]) {
+ EGLint x = 0; EGLDisplay dpy = 0; EGLContext ctx = 0;
+ eglDestroyContext(dpy, ctx);
+}" HAVE_EGL)
+
+cmake_pop_check_state()
+
+include(FindPackageHandleStandardArgs)
+find_package_handle_standard_args(EGL
+ FOUND_VAR
+ EGL_FOUND
+ REQUIRED_VARS
+ EGL_LIBRARY
+ EGL_INCLUDE_DIR
+ HAVE_EGL
+ VERSION_VAR
+ EGL_VERSION
+)
+
+if(EGL_FOUND AND NOT TARGET EGL::EGL)
+ add_library(EGL::EGL UNKNOWN IMPORTED)
+ set_target_properties(EGL::EGL PROPERTIES
+ IMPORTED_LOCATION "${EGL_LIBRARY}"
+ INTERFACE_COMPILE_OPTIONS "${EGL_DEFINITIONS}"
+ INTERFACE_INCLUDE_DIRECTORIES "${EGL_INCLUDE_DIR}"
+ )
+endif()
+
+mark_as_advanced(EGL_LIBRARY EGL_INCLUDE_DIR HAVE_EGL)
+
+# compatibility variables
+set(EGL_LIBRARIES ${EGL_LIBRARY})
+set(EGL_INCLUDE_DIRS ${EGL_INCLUDE_DIR})
+set(EGL_VERSION_STRING ${EGL_VERSION})
+
+include(FeatureSummary)
+set_package_properties(EGL PROPERTIES
+ URL "https://www.khronos.org/egl/"
+ DESCRIPTION "A platform-agnostic mechanism for creating rendering surfaces for use with other graphics libraries, such as OpenGL|ES and OpenVG."
+)
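A consumer-side sketch of how this module is typically used (not part of the diff; the target name `sibrViewer` is hypothetical, and the module is assumed to be on `CMAKE_MODULE_PATH`):

```cmake
# Hypothetical consumer CMakeLists.txt fragment.
# Prefer the EGL::EGL imported target; it carries the include dirs and
# compile definitions, unlike the legacy EGL_LIBRARIES/EGL_INCLUDE_DIRS vars.
list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake/linux/Modules")
find_package(EGL)  # EGL_VERSION is parsed from EGL/egl.h, not from the Mesa version
if(EGL_FOUND)
    target_link_libraries(sibrViewer PRIVATE EGL::EGL)
endif()
```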
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindEmbree.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindEmbree.cmake
new file mode 100644
index 0000000..0d07237
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindEmbree.cmake
@@ -0,0 +1,94 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+## Important Note:
+## This is not an official Find*.cmake module. It has been written to search through
+## a custom path (EMBREE_DIR) before checking elsewhere.
+##
+## FindEMBREE.cmake
+## Find EMBREE's includes and library
+##
+## This module defines :
+## [in] EMBREE_DIR, The base directory to search for EMBREE (as cmake var or env var)
+## [out] EMBREE_INCLUDE_DIR where to find EMBREE.h
+## [out] EMBREE_LIBRARIES, EMBREE_LIBRARY, libraries to link against to use EMBREE
+## [out] EMBREE_FOUND, If false, do not try to use EMBREE.
+##
+
+
+if(NOT EMBREE_DIR)
+ set(EMBREE_DIR "$ENV{EMBREE_DIR}" CACHE PATH "EMBREE root directory")
+endif()
+if(EMBREE_DIR)
+ file(TO_CMAKE_PATH ${EMBREE_DIR} EMBREE_DIR)
+endif()
+
+
+## set the LIB postfix so we search the right directory for the compiler's bitness (32/64 bits)
+if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(EMBREE_SEARCH_LIB "lib64")
+ set(EMBREE_SEARCH_BIN "bin64")
+ set(EMBREE_SEARCH_LIB_PATHSUFFIXE "x64")
+else()
+ set(EMBREE_SEARCH_LIB "lib32")
+ set(EMBREE_SEARCH_BIN "bin32")
+ set(EMBREE_SEARCH_LIB_PATHSUFFIXE "x86")
+endif()
+
+set(PROGRAMFILESx86 "PROGRAMFILES(x86)")
+
+FIND_PATH(EMBREE_INCLUDE_DIR
+ NAMES embree3/rtcore_geometry.h
+ PATHS
+ ${EMBREE_DIR}
+ ## linux
+ /usr
+ /usr/local
+ /opt/local
+ ## windows
+ "$ENV{PROGRAMFILES}/EMBREE"
+ "$ENV{${PROGRAMFILESx86}}/EMBREE"
+ "$ENV{ProgramW6432}/EMBREE"
+ PATH_SUFFIXES include
+)
+
+FIND_LIBRARY(EMBREE_LIBRARY
+ NAMES embree3
+ PATHS
+ ${EMBREE_DIR}/${EMBREE_SEARCH_LIB}
+ ${EMBREE_DIR}/lib
+ ## linux
+ /usr/${EMBREE_SEARCH_LIB}
+ /usr/local/${EMBREE_SEARCH_LIB}
+ /opt/local/${EMBREE_SEARCH_LIB}
+ /usr/lib
+ /usr/local/lib
+ /opt/local/lib
+ ## windows
+ "$ENV{PROGRAMFILES}/EMBREE/${EMBREE_SEARCH_LIB}"
+ "$ENV{${PROGRAMFILESx86}}/EMBREE/${EMBREE_SEARCH_LIB}"
+ "$ENV{ProgramW6432}/EMBREE/${EMBREE_SEARCH_LIB}"
+ "$ENV{PROGRAMFILES}/EMBREE/lib"
+ "$ENV{${PROGRAMFILESx86}}/EMBREE/lib"
+ "$ENV{ProgramW6432}/EMBREE/lib"
+ PATH_SUFFIXES ${EMBREE_SEARCH_LIB_PATHSUFFIXE}
+)
+set(EMBREE_LIBRARIES ${EMBREE_LIBRARY})
+
+MARK_AS_ADVANCED(EMBREE_INCLUDE_DIR EMBREE_LIBRARIES)
+
+INCLUDE(FindPackageHandleStandardArgs)
+FIND_PACKAGE_HANDLE_STANDARD_ARGS(EMBREE
+ REQUIRED_VARS EMBREE_INCLUDE_DIR EMBREE_LIBRARIES
+ FAIL_MESSAGE "EMBREE wasn't found correctly. Set EMBREE_DIR to the root SDK installation directory."
+)
+
+if(NOT EMBREE_FOUND)
+ set(EMBREE_DIR "" CACHE STRING "Path to EMBREE install directory")
+endif()
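A hedged usage sketch for this custom module: `EMBREE_DIR` (cmake or environment variable) is searched before the usual system locations. The install path and target name below are assumptions:

```cmake
# Hypothetical usage of the custom FindEmbree module.
set(EMBREE_DIR "/opt/embree-3.13.3" CACHE PATH "Embree SDK root")  # assumed install path
list(APPEND CMAKE_MODULE_PATH "${CMAKE_SOURCE_DIR}/cmake/linux/Modules")
find_package(Embree REQUIRED)  # looks for embree3/rtcore_geometry.h + libembree3
target_include_directories(raycaster PRIVATE ${EMBREE_INCLUDE_DIR})
target_link_libraries(raycaster PRIVATE ${EMBREE_LIBRARIES})
```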
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindFFMPEG.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindFFMPEG.cmake
new file mode 100644
index 0000000..e60cee8
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindFFMPEG.cmake
@@ -0,0 +1,110 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## Try to find the FFMPEG library
+## Once done this will define
+##
+## FFMPEG_FOUND - system has FFmpeg
+## FFMPEG_INCLUDE_DIR - The FFmpeg include directory
+## FFMPEG_LIBRARIES - The libraries needed to use FFmpeg
+## FFMPEG_DYNAMIC_LIBS - DLLs for windows
+
+
+if(NOT FFMPEG_DIR)
+ set(FFMPEG_DIR "$ENV{FFMPEG_DIR}" CACHE PATH "FFMPEG_DIR root directory")
+endif()
+
+if(FFMPEG_DIR)
+ file(TO_CMAKE_PATH ${FFMPEG_DIR} FFMPEG_DIR)
+endif()
+
+MACRO(FFMPEG_FIND varname shortname headername)
+
+ # Path to include dirs
+ FIND_PATH(FFMPEG_${varname}_INCLUDE_DIRS
+ NAMES "lib${shortname}/${headername}"
+ PATHS
+ "${FFMPEG_DIR}/include" # modify this to adapt according to OS/compiler
+ "/usr/include"
+ "/usr/include/ffmpeg"
+ )
+
+ #Add libraries
+    IF(NOT FFMPEG_${varname}_INCLUDE_DIRS)
+ MESSAGE(STATUS "Can't find includes for ${shortname}...")
+ ELSE()
+ FIND_LIBRARY(FFMPEG_${varname}_LIBRARIES
+ NAMES ${shortname}
+ PATHS
+ ${FFMPEG_DIR}/lib
+ "/usr/lib"
+ "/usr/lib64"
+ "/usr/local/lib"
+ "/usr/local/lib64"
+ )
+
+ # set libraries and other variables
+ SET(FFMPEG_${varname}_FOUND 1)
+ SET(FFMPEG_${varname}_LIBS ${FFMPEG_${varname}_LIBRARIES})
+ ENDIF()
+ ENDMACRO(FFMPEG_FIND)
+
+# Calls to FFMPEG_FIND to get the libraries ------------------------------
+FFMPEG_FIND(LIBAVFORMAT avformat avformat.h)
+FFMPEG_FIND(LIBAVDEVICE avdevice avdevice.h)
+FFMPEG_FIND(LIBAVCODEC avcodec avcodec.h)
+FFMPEG_FIND(LIBAVUTIL avutil avutil.h)
+FFMPEG_FIND(LIBSWSCALE swscale swscale.h)
+
+# check if libs are found and set FFMPEG related variables
+#SET(FFMPEG_FOUND "NO")
+IF(FFMPEG_LIBAVFORMAT_FOUND
+ AND FFMPEG_LIBAVDEVICE_FOUND
+ AND FFMPEG_LIBAVCODEC_FOUND
+ AND FFMPEG_LIBAVUTIL_FOUND
+ AND FFMPEG_LIBSWSCALE_FOUND)
+
+ # All ffmpeg libs are here
+ SET(FFMPEG_FOUND "YES")
+ SET(FFMPEG_INCLUDE_DIR ${FFMPEG_LIBAVFORMAT_INCLUDE_DIRS})
+ SET(FFMPEG_LIBRARY_DIRS ${FFMPEG_LIBAVFORMAT_LIBRARY_DIRS})
+ SET(FFMPEG_LIBRARIES
+ ${FFMPEG_LIBAVFORMAT_LIBS}
+ ${FFMPEG_LIBAVDEVICE_LIBS}
+ ${FFMPEG_LIBAVCODEC_LIBS}
+ ${FFMPEG_LIBAVUTIL_LIBS}
+ ${FFMPEG_LIBSWSCALE_LIBS} )
+
+ # add dynamic libraries
+    if(WIN32)
+        file(GLOB FFMPEG_DYNAMIC_LIBS "${FFMPEG_DIR}/bin/*.dll")
+        if(NOT FFMPEG_DYNAMIC_LIBS)
+            message("FFMPEG_DYNAMIC_LIBS is missing...")
+        endif()
+        set(FFMPEG_DYNAMIC_LIBS ${FFMPEG_DYNAMIC_LIBS} CACHE PATH "Windows dll location")
+    endif()
+
+    mark_as_advanced(FFMPEG_INCLUDE_DIR FFMPEG_LIBRARY_DIRS FFMPEG_LIBRARIES FFMPEG_DYNAMIC_LIBS)
+ELSE ()
+ MESSAGE(STATUS "Could not find FFMPEG")
+ENDIF()
+
+
+INCLUDE(FindPackageHandleStandardArgs)
+FIND_PACKAGE_HANDLE_STANDARD_ARGS(FFMPEG
+ REQUIRED_VARS FFMPEG_INCLUDE_DIR FFMPEG_LIBRARIES
+ FAIL_MESSAGE "FFmpeg wasn't found correctly. Set FFMPEG_DIR to the root SDK installation directory."
+)
+
+if(NOT FFMPEG_FOUND)
+ set(FFMPEG_DIR "" CACHE STRING "Path to FFmpeg install directory")
+endif()
+
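A consumer sketch (target name `videoRecorder` is hypothetical): link the five FFmpeg component libraries the macro collects, and on Windows stage the DLLs next to the executable.

```cmake
# Hypothetical consumer fragment for the FFMPEG module above.
find_package(FFMPEG)
if(FFMPEG_FOUND)
    target_include_directories(videoRecorder PRIVATE ${FFMPEG_INCLUDE_DIR})
    target_link_libraries(videoRecorder PRIVATE ${FFMPEG_LIBRARIES})
    if(WIN32 AND FFMPEG_DYNAMIC_LIBS)
        # copy the globbed DLLs next to the built executable
        add_custom_command(TARGET videoRecorder POST_BUILD
            COMMAND ${CMAKE_COMMAND} -E copy_if_different
                    ${FFMPEG_DYNAMIC_LIBS} $<TARGET_FILE_DIR:videoRecorder>)
    endif()
endif()
```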
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindGLFW.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindGLFW.cmake
new file mode 100644
index 0000000..14263de
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Modules/FindGLFW.cmake
@@ -0,0 +1,109 @@
+##=============================================================================
+##
+## Copyright (c) Kitware, Inc.
+## All rights reserved.
+## See LICENSE.txt for details.
+##
+## This software is distributed WITHOUT ANY WARRANTY; without even
+## the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR
+## PURPOSE. See the above copyright notice for more information.
+##
+## Copyright 2016 Sandia Corporation.
+## Copyright 2016 UT-Battelle, LLC.
+## Copyright 2016 Los Alamos National Security.
+##
+## Under the terms of Contract DE-AC04-94AL85000 with Sandia Corporation,
+## the U.S. Government retains certain rights in this software.
+## Under the terms of Contract DE-AC52-06NA25396 with Los Alamos National
+## Laboratory (LANL), the U.S. Government retains certain rights in
+## this software.
+##
+##=============================================================================
+# Try to find the GLFW library and include dir.
+# Once done this will define
+#
+# GLFW_FOUND
+# GLFW_INCLUDE_DIR
+# GLFW_LIBRARY
+#
+
+include(FindPackageHandleStandardArgs)
+
+if (WIN32)
+ find_path( GLFW_INCLUDE_DIR
+ NAMES
+ GLFW/glfw3.h
+ PATHS
+ ${PROJECT_SOURCE_DIR}/shared_external/glfw/include
+ ${PROJECT_SOURCE_DIR}/../shared_external/glfw/include
+ ${GLFW_LOCATION}/include
+ $ENV{GLFW_LOCATION}/include
+ $ENV{PROGRAMFILES}/GLFW/include
+ ${GLFW_LOCATION}
+ $ENV{GLFW_LOCATION}
+ DOC "The directory where GLFW/glfw3.h resides" )
+    # same library name for x86 and x64 builds, so a single find_library suffices
+    find_library( GLFW_LIBRARY
+        NAMES
+            glfw3
+        PATHS
+            ${GLFW_LOCATION}/lib
+            $ENV{GLFW_LOCATION}/lib
+            $ENV{PROGRAMFILES}/GLFW/lib
+        DOC "The GLFW library")
+endif ()
+
+if (${CMAKE_HOST_UNIX})
+    message("GLFW LOCATION " $ENV{GLFW_LOCATION} )
+ find_path( GLFW_INCLUDE_DIR
+ NAMES
+ GLFW/glfw3.h
+ PATHS
+# ${GLFW_LOCATION}/include
+ $ENV{GLFW_LOCATION}/include
+# /usr/include
+# /usr/local/include
+# /sw/include
+# /opt/local/include
+# NO_DEFAULT_PATH
+ DOC "The directory where GLFW/glfw3.h resides"
+ )
+ find_library( GLFW_LIBRARY
+ NAMES
+ glfw3 glfw
+ PATHS
+# ${GLFW_LOCATION}/lib
+ $ENV{GLFW_LOCATION}/lib
+ $ENV{GLFW_LOCATION}/lib64
+# /usr/lib64
+# /usr/lib
+# /usr/local/lib64
+# /usr/local/lib
+# /sw/lib
+# /opt/local/lib
+# /usr/lib/x86_64-linux-gnu
+# NO_DEFAULT_PATH
+ DOC "The GLFW library")
+
+ set( GLFW_INCLUDE_DIR $ENV{GLFW_LOCATION}/include )
+ set( GLFW_LIBRARY $ENV{GLFW_LOCATION}/lib64/libglfw3.a )
+    message(STATUS "FindGLFW.cmake: include dir " ${GLFW_INCLUDE_DIR} " lib " ${GLFW_LIBRARY} )
+endif ()
+
+find_package_handle_standard_args(GLFW DEFAULT_MSG
+ GLFW_INCLUDE_DIR
+ GLFW_LIBRARY
+)
+
+mark_as_advanced( GLFW_FOUND )
+
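A usage sketch under the layout this module assumes (the target name `viewerApp` is hypothetical): on Unix the final `set()` calls hard-code `$GLFW_LOCATION/include` for `GLFW/glfw3.h` and `$GLFW_LOCATION/lib64/libglfw3.a` for a static build, so `GLFW_LOCATION` must point at a matching install.

```cmake
# Hypothetical consumer fragment; expects the environment variable
# GLFW_LOCATION to point at a GLFW install with include/ and lib64/libglfw3.a.
find_package(GLFW REQUIRED)
target_include_directories(viewerApp PRIVATE ${GLFW_INCLUDE_DIR})
target_link_libraries(viewerApp PRIVATE ${GLFW_LIBRARY})
```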
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Win3rdParty.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Win3rdParty.cmake
new file mode 100644
index 0000000..7e42fbb
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/Win3rdParty.cmake
@@ -0,0 +1,337 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## This file should be included only once, and only on WIN32.
+## It allows auto checking/downloading and using preconfigured 3rdParty binaries from cmake.
+## It relies on the downloadAndExtractZipFile cmake module.
+##
+if(__Win3rdParty_cmake_INCLUDED__)
+ return()
+else()
+ set(__Win3rdParty_cmake_INCLUDED__ ON)
+endif()
+
+
+##
+## Reset an empty cached variable to a default value while keeping every other kind of variable intact.
+##
+## Usage:
+## check_cached_var(<var> <resetedCachedValue> <cacheType> <cacheDoc> [FORCE])
+##
+## <var> is the cached cmake variable you need to reset
+## <resetedCachedValue> is the new default value of the reset cached cmake variable
+## <cacheType> is the kind of GUI cache input; can be: FILEPATH, PATH, STRING or BOOL
+## <cacheDoc> is the associated GUI cache input documentation displayed in the GUI
+## The FORCE option can be used to reset a cached variable even if it is not empty.
+##
+macro(check_cached_var var resetedCachedValue cacheType cacheDoc)
+ # message(STATUS "inside check_cached_var macro. argn=${ARGN}")
+ cmake_parse_arguments(ccv "FORCE" "" "" ${ARGN})
+
+ if(ccv_FORCE)
+ set(FORCE FORCE)
+ else()
+ set(FORCE )
+ endif()
+
+ if(NOT ${var} OR ccv_FORCE)
+ unset(${var} CACHE)
+ # message(STATUS "setting new cache value. var ${var} = ${resetedCachedValue}")
+ set(${var} "${resetedCachedValue}" CACHE ${cacheType} "${cacheDoc}" ${FORCE})
+ endif()
+endmacro()
+
+
+##
+## The Win3rdParty function lets you specify a directory containing all necessary windows dependencies.
+## By uploading a 3rdParty directory (containing the dependencies, *.lib, *.dll... built for a specific compiler version) onto the Gforge file tab,
+## you get back a download URL you can pass to this function together with a directory name, so you can provide multiple 3rdParty versions of the same dependencies (MSVC11, MSVC12...).
+## By providing a prefix to this function, you can handle different kinds of 3rdParty through cmake OPTIONS, depending on what your framework needs.
+##
+## Usage 1:
+## Win3rdParty(<prefix> MSVC<version> <dirName> <url>
+##             [MSVC<version> <dirName> <url>] [...]
+##             [VCID] [DEFAULT_USE <ON|OFF>] [VERBOSE] )
+##
+## * <prefix> identifies which 3rdParty you process (prefix name)
+## * the MSVC<version> flag can be MSVC11 or MSVC12 (any element of MSVC_VERSIONS_LIST) and refers to a 3rdParty compiler with:
+##   * <dirName> which will be the local pathName of the downloaded 3rdParty: relative to CMAKE_BINARY_DIR
+##   * <url> which is the link location of the 3rdParty zip
+## * the VCID flag makes a cache variable ${prefix}_WIN3RDPARTY_VCID available
+## * the DEFAULT_USE flag [ON|OFF] may be used to set the default value of the cmake cached variable <prefix>_WIN3RDPARTY_USE [defaults to ON]
+##
+## WARNING:
+## This function defines CACHE variables you can use afterwards:
+## * ${prefix}_WIN3RDPARTY_USE : allows checking/downloading the win3rdParty dir (it will force the cached variables for this dependency folder, generally <lib>_DIR)
+## * ${prefix}_WIN3RDPARTY_DIR : where your local win3rdParty dir is (the PATH)
+## * ${prefix}_WIN3RDPARTY_VCID : [if the VCID flag is used] the MSVC id (commonly used to prefix/suffix library names, see boost or CGAL)
+##
+## If you want to add a win3rdParty version, please:
+## 1- build the dependencies on your local side with the compiler you want
+## 2- build your own zip with the built dependencies
+## 3- upload it (onto the forge where the project is stored) and copy the link location in order to use it with this function
+## 4- if you just introduced a new MSVC version, add it to the MSVC_VERSIONS_LIST below
+##
+## In a second pass, you can also use this function to set the cmake cached variables needed to let cmake find packages of these 3rdParty.
+##
+## Usage 2:
+## win3rdParty(<prefix> [VERBOSE] MULTI_SET|SET
+##             CHECK_CACHED_VAR <var> <cacheType> [LIST] <value> [DOC <doc>]
+##           [ CHECK_CACHED_VAR <var> <cacheType> [LIST] <value> [DOC <doc>] ] [...]
+##
+## * The MULTI_SET or SET flags tell cmake that all following arguments use repeated flags with different entries (SET means a single set of arguments is provided, without repetition)
+## * CHECK_CACHED_VAR is the repeated flag which contains the different entries
+## * <var> is the cmake variable you want cached for the project
+## * <cacheType> is the kind of cmake variable (can be: FILEPATH, PATH, STRING or BOOL) => see check_cached_var.
+## * The optional LIST flag can be used with CHECK_CACHED_VAR when <cacheType> = STRING. It allows handling a list of multiple STRING values.
+## * <value> is the value of the variable (if FILEPATH, PATH or STRING: use quotes; if BOOL: use ON/OFF)
+## * The optional DOC flag provides tooltip info about this new cmake variable entry in the GUI (use quotes).
+##
+## Full example 1 :
+## win3rdParty(COMMON MSVC11 "win3rdParty-MSVC11" "https://path.to/an.archive.7z"
+## SET CHECK_CACHED_VAR SuiteSparse_DIR PATH "SuiteSparse-4.2.1" DOC "default empty doc"
+## )
+##
+## WARNING:
+## For the 2nd usage (with MULTI_SET), if you plan to set some CACHED_VAR using/composed of the ${prefix}_WIN3RDPARTY_* variables just set in this macro (usage 1),
+## then (since those variables do not exist yet) you will need to call this function 2 times:
+## once for the 1st usage (downloading of the current compiler's 3rdParty),
+## once with the MULTI_SET flag, which will use the existing ${prefix}_WIN3RDPARTY_* cached variables.
+##
+## Full example 2 :
+## win3rdParty(COMMON MSVC11 "win3rdParty-MSVC11" "https://path.to/an.archive.7z")
+## win3rdParty(COMMON MULTI_SET
+## CHECK_CACHED_VAR CGAL_INCLUDE_DIR PATH "CGAL-4.3/include" DOC "default empty doc"
+## CHECK_CACHED_VAR CGAL_LIBRARIES STRING LIST "debug;CGAL-4.3/lib${LIB_POSTFIX}/CGAL-${WIN3RDPARTY_COMMON_VCID}-mt-gd-4.3.lib;optimized;CGAL-4.3/lib${LIB_POSTFIX}/CGAL-${WIN3RDPARTY_COMMON_VCID}-mt-4.3.lib"
+##
+##
+## WARNING: This function internally uses :
+## * downloadAndExtractZipFile.cmake
+## * parse_arguments_multi.cmake
+## * check_cached_var macro
+##
+function(win3rdParty prefix )
+
+ # ARGV: list of all arguments given to the macro/function
+ # ARGN: list of remaining arguments
+
+ if(NOT WIN32)
+ return()
+ endif()
+
+ ## set the handled version of MSVC
+    ## if you plan to add a win3rdParty dir to download with a new MSVC version: build the win3rdParty dir and add the MSVC entry here.
+ set(MSVC_VERSIONS_LIST "MSVC17;MSVC11;MSVC12;MSVC14")
+
+ #include(CMakeParseArguments) # CMakeParseArguments is obsolete since cmake 3.5
+    # cmake_parse_arguments(<prefix> <options> <one_value_keywords> <multi_value_keywords> args...)
+    #   <options>              : options (flags) passed to the macro
+    #   <one_value_keywords>   : options that need a value
+    #   <multi_value_keywords> : options that need more than one value
+ cmake_parse_arguments(w3p "VCID" "VERBOSE;TIMEOUT;DEFAULT_USE" "${MSVC_VERSIONS_LIST};MULTI_SET;SET" ${ARGN})
+
+ # message(STATUS "value of w3p_VCID = ${w3p_VCID}")
+ # message(STATUS "value of w3p_VERBOSE = ${w3p_VERBOSE}")
+ # message(STATUS "value of w3p_TIMEOUT = ${w3p_TIMEOUT}")
+ # message(STATUS "value of w3p_DEFAULT_USE = ${w3p_DEFAULT_USE}")
+
+ # foreach (loop_var ${MSVC_VERSIONS_LIST})
+ # message(STATUS "value of w3p_${loop_var} = ${w3p_${loop_var}}")
+ # endforeach(loop_var)
+
+ # message(STATUS "value of w3p_MULTI_SET = ${w3p_MULTI_SET}")
+ # message(STATUS "value of w3p_SET = ${w3p_SET}")
+
+ # message("values for MSVC = ${w3p_MSVC14}")
+
+ if(NOT w3p_TIMEOUT)
+ set(w3p_TIMEOUT 300)
+ endif()
+
+ if(NOT DEFINED w3p_DEFAULT_USE)
+ set(w3p_DEFAULT_USE ON)
+ endif()
+
+
+ ## 1st use (check/update|download) :
+ set(${prefix}_WIN3RDPARTY_USE ${w3p_DEFAULT_USE} CACHE BOOL "Use required 3rdParty binaries from ${prefix}_WIN3RDPARTY_DIR or download it if not exist")
+
+
+ ## We want to test if each version of MSVC was filled by the function (see associated parameters)
+ ## As CMake is running only for one version of MSVC, if that MSVC version was filled, we get back associated parameters,
+ ## otherwise we can't use the downloadAndExtractZipFile with win3rdParty.
+ set(enableWin3rdParty OFF)
+
+ foreach(MSVC_VER ${MSVC_VERSIONS_LIST})
+ if(${MSVC_VER} AND w3p_${MSVC_VER} OR ${MSVC_TOOLSET_VERSION} EQUAL 143 AND ${MSVC_VER} STREQUAL "MSVC17")
+ list(LENGTH w3p_${MSVC_VER} count)
+ if("${count}" LESS "2")
+ #message(WARNING "You are using ${MSVC_VER} with ${prefix}_WIN3RDPARTY_USE=${${prefix}_WIN3RDPARTY_USE}, but win3rdParty function isn't filled for ${MSVC_VER}!")
+ else()
+ list(GET w3p_${MSVC_VER} 0 Win3rdPartyName)
+ list(GET w3p_${MSVC_VER} 1 Win3rdPartyUrl)
+ if(w3p_VCID)
+ ## try to get the VcId of MSVC. See also MSVC_VERSION cmake var in the doc.
+ string(REGEX REPLACE "MS([A-Za-z_0-9-]+)" "\\1" vcId ${MSVC_VER})
+ string(TOLOWER ${vcId} vcId)
+ set(${prefix}_WIN3RDPARTY_VCID "${vcId}0" CACHE STRING "the MSVC id (commonly used to prefix/suffix library name, see boost or CGAL)")
+ mark_as_advanced(${prefix}_WIN3RDPARTY_VCID)
+ endif()
+ set(enableWin3rdParty ON)
+ set(suffixCompilerID ${MSVC_VER})
+ break()
+ endif()
+ endif()
+ endforeach()
+ ## If previous step succeed to get MSVC dirname and URL of the current MSVC version, use it to auto download/update the win3rdParty dir
+ if(enableWin3rdParty AND ${prefix}_WIN3RDPARTY_USE)
+
+        if(NOT IS_ABSOLUTE "${Win3rdPartyName}")
+            set(Win3rdPartyName "${CMAKE_BINARY_DIR}/${Win3rdPartyName}")
+        endif()
+
+ if(NOT EXISTS "${Win3rdPartyName}")
+ file(MAKE_DIRECTORY ${Win3rdPartyName})
+ endif()
+
+ include(downloadAndExtractZipFile)
+ downloadAndExtractZipFile( "${Win3rdPartyUrl}" ## URL link location
+ "Win3rdParty-${prefix}-${suffixCompilerID}.7z" ## where download it: relative path, so default to CMAKE_BINARY_DIR
+ "${Win3rdPartyName}" ## where extract it : fullPath (default relative to CMAKE_BINARY_DIR)
+ CHECK_DIRTY_URL "${Win3rdPartyName}/Win3rdPartyUrl" ## last downloaded url file : fullPath (default relative to CMAKE_BINARY_DIR)
+ TIMEOUT ${w3p_TIMEOUT}
+ VERBOSE ${w3p_VERBOSE}
+ )
+ file(GLOB checkDl "${Win3rdPartyName}/*")
+ list(LENGTH checkDl checkDlCount)
+        if(NOT "${checkDlCount}" GREATER "1")
+            message("The downloadAndExtractZipFile didn't work...?")
+            set(enableWin3rdParty OFF)
+        endif()
+ endif()
+
+ ## Try to auto set ${prefix}_WIN3RDPARTY_DIR or let user set it manually
+ set(${prefix}_WIN3RDPARTY_DIR "" CACHE PATH "windows ${Win3rdPartyName} dir to ${prefix} dependencies of the project")
+
+ if(NOT ${prefix}_WIN3RDPARTY_DIR AND ${prefix}_WIN3RDPARTY_USE)
+ if(EXISTS "${Win3rdPartyName}")
+ unset(${prefix}_WIN3RDPARTY_DIR CACHE)
+ set(${prefix}_WIN3RDPARTY_DIR "${Win3rdPartyName}" CACHE PATH "dir to ${prefix} dependencies of the project")
+ endif()
+ endif()
+
+ if(EXISTS ${${prefix}_WIN3RDPARTY_DIR})
+ message(STATUS "Found a 3rdParty ${prefix} dir : ${${prefix}_WIN3RDPARTY_DIR}.")
+ set(enableWin3rdParty ON)
+ elseif(${prefix}_WIN3RDPARTY_USE)
+ message(WARNING "${prefix}_WIN3RDPARTY_USE=${${prefix}_WIN3RDPARTY_USE} but ${prefix}_WIN3RDPARTY_DIR=${${prefix}_WIN3RDPARTY_DIR}.")
+ set(enableWin3rdParty OFF)
+ endif()
+
+ ## Final check
+ if(NOT enableWin3rdParty)
+        message("Disabling ${prefix}_WIN3RDPARTY_USE (the cmake cached vars will not be set), due to a win3rdParty problem.")
+        message("You can still set ${prefix}_WIN3RDPARTY_DIR to an already downloaded Win3rdParty directory location.")
+ set(${prefix}_WIN3RDPARTY_USE OFF CACHE BOOL "Use required 3rdParty binaries from ${prefix}_WIN3RDPARTY_DIR or download it if not exist" FORCE)
+ endif()
+
+ ## 2nd use : handle multi values args to set cached cmake variables in order to ease the next find_package call
+ if(${prefix}_WIN3RDPARTY_USE AND ${prefix}_WIN3RDPARTY_DIR)
+ if(w3p_VERBOSE)
+ message(STATUS "Try to set cmake cached variables for ${prefix} required libraries directly from : ${${prefix}_WIN3RDPARTY_DIR}.")
+ endif()
+
+ include(parse_arguments_multi)
+ # message (STATUS "before defining an override of parse_arguments_multi_function")
+ function(parse_arguments_multi_function ) ## overloaded function to handle all CHECK_CACHED_VAR values list (see: parse_arguments_multi)
+ # message(STATUS "inside overloaded parse_arguments_multi_function defined in Win3rdParty.cmake")
+ # message(STATUS ${ARGN})
+        ## we know the function takes 3 args: var cacheType resetedCachedValue (see check_cached_var)
+ cmake_parse_arguments(pamf "" "DOC" "LIST" ${ARGN})
+
+ ## var and cacheType are mandatory (with the resetedCachedValue)
+ set(var ${ARGV0})
+ set(cacheType ${ARGV1})
+ # message(STATUS "var=${var} and cacheType=${cacheType} list=${pamf_LIST}")
+ if(pamf_DOC)
+ set(cacheDoc ${pamf_DOC})
+ else()
+ set(cacheDoc "")
+ endif()
+ if(pamf_LIST)
+ set(value ${pamf_LIST})
+ else()
+ # message("USING ARGV2 with value ${ARGV2}")
+ set(value ${ARGV2})
+ endif()
+ # message("inside override function in Win3rdparty.cmake value+ ${value}")
+ if("${cacheType}" MATCHES "PATH" AND EXISTS "${${prefix}_WIN3RDPARTY_DIR}/${value}")
+ # message("math with path")
+ set(resetedCachedValue "${${prefix}_WIN3RDPARTY_DIR}/${value}") ## path relative to ${prefix}_WIN3RDPARTY_DIR
+ elseif ("${cacheType}" MATCHES "PATH" AND EXISTS "${${prefix}_WIN3RDPARTY_DIR}")
+ set(resetedCachedValue "${${prefix}_WIN3RDPARTY_DIR}") ## path relative to ${prefix}_WIN3RDPARTY_DIR
+ elseif("${cacheType}" MATCHES "STRING")
+ foreach(var IN LISTS value)
+ if(EXISTS "${${prefix}_WIN3RDPARTY_DIR}/${var}")
+ list(APPEND resetedCachedValue "${${prefix}_WIN3RDPARTY_DIR}/${var}") ## string item of the string list is a path => make relative to ${prefix}_WIN3RDPARTY_DIR
+ else()
+ list(APPEND resetedCachedValue ${var}) ## string item of the string list is not an existing path => simply use the item
+ endif()
+ endforeach()
+ else()
+ set(resetedCachedValue "${value}") ## could be a BOOL or a STRING
+ endif()
+
+ ## call our macro to reset cmake cache variable if empty
+ check_cached_var(${var} "${resetedCachedValue}" ${cacheType} "${cacheDoc}" FORCE)
+
+ endfunction()
+ # message (STATUS "after defining an override of parse_arguments_multi_function")
+
+ if(w3p_MULTI_SET)
+        parse_arguments_multi(CHECK_CACHED_VAR w3p_MULTI_SET ${w3p_MULTI_SET}) ## internally calls our overloaded parse_arguments_multi_function
+ elseif(w3p_SET)
+ # message("calling set version of parse_arguments_multi with w3p_set = ${w3p_SET}")
+ parse_arguments_multi(CHECK_CACHED_VAR w3p_SET ${w3p_SET})
+ endif()
+
+ endif()
+
+endfunction()
+
+## cmake variables introspection to globally activate/deactivate the ${prefix}_WIN3RDPARTY_USE variables.
+## This "one shot" action (applied only for the next cmake configure) will then automatically reset the global variable WIN3RDPARTY_USE to UserDefined (do nothing).
+## Use it (call it) before and after calling all your win3rdParty functions.
+function(Win3rdPartyGlobalCacheAction )
+ set(WIN3RDPARTY_USE "UserDefined" CACHE STRING "Choose how to handle all cmake cached *_WIN3RDPARTY_USE for the next configure.\nCould be:\nUserDefined [default]\nActivateAll\nDesactivateAll" )
+ set_property(CACHE WIN3RDPARTY_USE PROPERTY STRINGS "UserDefined;ActivateAll;DesactivateAll" )
+ if(${WIN3RDPARTY_USE} MATCHES "UserDefined")
+ else()
+ if(${WIN3RDPARTY_USE} MATCHES "ActivateAll")
+ set(win3rdPvalue ON)
+ elseif(${WIN3RDPARTY_USE} MATCHES "DesactivateAll")
+ set(win3rdPvalue OFF)
+ endif()
+ get_cmake_property(_variableNames CACHE_VARIABLES)
+ foreach (_variableName ${_variableNames})
+ string(REGEX MATCH "[A-Za-z_0-9-]+_WIN3RDPARTY_USE" win3rdpartyUseCacheVar ${_variableName})
+ if(win3rdpartyUseCacheVar)
+ string(REGEX REPLACE "([A-Za-z_0-9-]+_WIN3RDPARTY_USE)" "\\1" win3rdpartyUseCacheVar ${_variableName})
+ set(${win3rdpartyUseCacheVar} ${win3rdPvalue} CACHE BOOL "Use required 3rdParty binaries from ${prefix}_WIN3RDPARTY_DIR or download it if not exist" FORCE)
+ message(STATUS "${win3rdpartyUseCacheVar} cached variable set to ${win3rdPvalue}.")
+ endif()
+ endforeach()
+ set(WIN3RDPARTY_USE "UserDefined" CACHE STRING "Choose how to handle all cmake cached *_WIN3RDPARTY_USE for the next configure.\nCould be:\nUserDefined [default]\nActivateAll\nDesactivateAll" FORCE)
+ message(STATUS "reset WIN3RDPARTY_USE to UserDefined.")
+ endif()
+ mark_as_advanced(WIN3RDPARTY_USE)
+endfunction()
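The two-pass pattern described in the usage notes above can be sketched as follows; the prefix, archive name, and URL here are hypothetical, and the GLEW paths merely mirror the style of the dependency file later in this patch:

```cmake
# Hypothetical two-pass use of win3rdParty:
# pass 1 downloads the per-compiler archive, pass 2 (MULTI_SET) points the
# find_package cache variables into the extracted directory.
Win3rdPartyGlobalCacheAction()
win3rdParty(COMMON VCID MSVC17 "win3rdParty-MSVC17" "https://example.org/win3rdParty-MSVC17.7z")
win3rdParty(COMMON MULTI_SET
    CHECK_CACHED_VAR GLEW_INCLUDE_DIR PATH "glew-2.0.0/include" DOC "GLEW headers"
    CHECK_CACHED_VAR GLEW_LIBRARIES STRING LIST "glew-2.0.0/lib/glew32.lib" DOC "GLEW libs"
)
Win3rdPartyGlobalCacheAction()
```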
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/cmake_policies.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/cmake_policies.cmake
new file mode 100644
index 0000000..679fd84
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/cmake_policies.cmake
@@ -0,0 +1,19 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+if(__set_policies_INCLUDED__)
+ return()
+else()
+ set(__set_policies_INCLUDED__ ON)
+endif()
+
+macro(setPolicies)
+ # No more policies to enforce
+endmacro()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/dependencies.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/dependencies.cmake
new file mode 100644
index 0000000..a7854bb
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/dependencies.cmake
@@ -0,0 +1,324 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## Included once for all sub projects.
+## It contains all the cmake instructions needed to find the necessary common dependencies.
+## 3rdParty libraries (provided by sibr_addlibrary/win3rdParty or by external packages) are then available in the cmake sub projects.
+##
+## Do not include this file more than once, but you can modify it to fit your own project.
+## Please read it carefully: you can reuse one of these dependencies for your project or append a new one.
+##
+## As it is included after the cmake options, you can use conditional if()/endif() blocks to encapsulate your 3rdParty.
+##
+
+## The win3rdParty function auto checks/downloads/updates the binary dependencies for the current windows compiler.
+## Please open that file to get more documentation and usage examples.
+include(Win3rdParty)
+
+include(sibr_library)
+
+Win3rdPartyGlobalCacheAction()
+
+find_package(OpenGL REQUIRED)
+
+set(OpenGL_GL_PREFERENCE "GLVND")
+
+############
+## Find GLEW
+############
+##for headless rendering
+find_package(EGL QUIET)
+
+if(EGL_FOUND)
+ add_definitions(-DGLEW_EGL)
+ message("Activating EGL support for headless GLFW/GLEW")
+else()
+    message("EGL not found: EGL support for headless GLFW/GLEW is disabled")
+endif()
+
+if (MSVC11 OR MSVC12)
+ set(glew_multiset_arguments
+ CHECK_CACHED_VAR GLEW_INCLUDE_DIR PATH "glew-1.10.0/include" DOC "default empty doc"
+ CHECK_CACHED_VAR GLEW_LIBRARIES STRING LIST "debug;glew-1.10.0/${LIB_BUILT_DIR}/glew32d.lib;optimized;glew-1.10.0/${LIB_BUILT_DIR}/glew32.lib" DOC "default empty doc"
+ )
+elseif (MSVC14)
+ set(glew_multiset_arguments
+ CHECK_CACHED_VAR GLEW_INCLUDE_DIR PATH "glew-2.0.0/include" DOC "default empty doc"
+ CHECK_CACHED_VAR GLEW_SHARED_LIBRARY_RELEASE PATH "glew-2.0.0/${LIB_BUILT_DIR}/glew32.lib"
+ CHECK_CACHED_VAR GLEW_STATIC_LIBRARY_RELEASE PATH "glew-2.0.0/${LIB_BUILT_DIR}/glew32s.lib"
+ CHECK_CACHED_VAR GLEW_SHARED_LIBRARY_DEBUG PATH "glew-2.0.0/${LIB_BUILT_DIR}/glew32d.lib"
+ CHECK_CACHED_VAR GLEW_STATIC_LIBRARY_DEBUG PATH "glew-2.0.0/${LIB_BUILT_DIR}/glew32sd.lib"
+ )
+else ()
+ message("There is no provided GLEW library for your compiler, relying on find_package to find it")
+endif()
+sibr_addlibrary(NAME GLEW #VERBOSE ON
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/glew-1.10.0.7z"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/glew-1.10.0.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/glew-2.0.0.7z" # using recompiled version of glew
+ MULTI_SET ${glew_multiset_arguments}
+)
+set(GLEW_VERBOSE ON)
+FIND_PACKAGE(GLEW REQUIRED)
+IF(GLEW_FOUND)
+ INCLUDE_DIRECTORIES(${GLEW_INCLUDE_DIR})
+ELSE(GLEW_FOUND)
+ MESSAGE("GLEW not found. Set GLEW_DIR to base directory of GLEW.")
+ENDIF(GLEW_FOUND)
+
+
+##############
+## Find ASSIMP
+##############
+if (MSVC11 OR MSVC12)
+ set(assimp_set_arguments
+ CHECK_CACHED_VAR ASSIMP_DIR PATH "Assimp_3.1_fix"
+ )
+elseif (MSVC14)
+ set(assimp_set_arguments
+ CHECK_CACHED_VAR ASSIMP_DIR PATH "Assimp-4.1.0"
+ )
+else ()
+ message("There is no provided ASSIMP library for your compiler, relying on find_package to find it")
+endif()
+
+sibr_addlibrary(NAME ASSIMP #VERBOSE ON
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/Assimp_3.1_fix.7z"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/Assimp_3.1_fix.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/Assimp-4.1.0.7z"
+ MULTI_SET
+ ${assimp_set_arguments}
+)
+
+find_package(ASSIMP REQUIRED)
+include_directories(${ASSIMP_INCLUDE_DIR})
+
+################
+## Find FFMPEG
+################
+sibr_addlibrary(NAME FFMPEG
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/ffmpeg.zip"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/ffmpeg.zip"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/ffmpeg-4.0.2-win64-win3rdParty.7z"
+ SET CHECK_CACHED_VAR FFMPEG_DIR PATH ${FFMPEG_WIN3RDPARTY_DIR}
+)
+find_package(FFMPEG)
+include_directories(${FFMPEG_INCLUDE_DIR})
+## COMMENT OUT ALL FFMPEG FOR CLUSTER
+
+###################
+## Find embree3
+###################
+sibr_addlibrary(
+ NAME embree3
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/embree2.7.0.x64.windows.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/embree-3.6.1.x64.vc14.windows.7z" # TODO SV: provide a valid version if required
+)
+
+# CLUSTER
+#find_package(embree 3.0 REQUIRED PATHS "/data/graphdeco/share/embree/usr/local/lib64/cmake/" )
+find_package(embree 3.0 )
+
+###################
+## Find eigen3
+###################
+sibr_addlibrary(
+ NAME eigen3
+ #MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/eigen-eigen-dc6cfdf9bcec.7z"
+ #MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/eigen-eigen-dc6cfdf9bcec.7z" # TODO SV: provide a valid version if required
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/eigen3.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/eigen3.7z"
+ SET CHECK_CACHED_VAR eigen3_DIR PATH "eigen/share/eigen3/cmake"
+)
+include_directories(/usr/include/eigen3)
+add_definitions(-DEIGEN_INITIALIZE_MATRICES_BY_ZERO)
+
+#############
+## Find Boost
+#############
+set(Boost_REQUIRED_COMPONENTS "system;chrono;filesystem;date_time" CACHE INTERNAL "Boost Required Components")
+
+if (WIN32)
+ # boost multiset arguments
+ if (MSVC11 OR MSVC12)
+ set(boost_multiset_arguments
+ CHECK_CACHED_VAR BOOST_ROOT PATH "boost_1_55_0"
+ CHECK_CACHED_VAR BOOST_INCLUDEDIR PATH "boost_1_55_0"
+ CHECK_CACHED_VAR BOOST_LIBRARYDIR PATH "boost_1_55_0/${LIB_BUILT_DIR}"
+ #CHECK_CACHED_VAR Boost_COMPILER STRING "-${Boost_WIN3RDPARTY_VCID}" DOC "vcid (eg: -vc110 for MSVC11)"
+ CHECK_CACHED_VAR Boost_COMPILER STRING "-vc110" DOC "vcid (eg: -vc110 for MSVC11)" # NOTE: if it doesnt work, uncomment this option and set the right value for VisualC id
+ )
+ elseif (MSVC14)
+ set(boost_multiset_arguments
+ CHECK_CACHED_VAR BOOST_ROOT PATH "boost-1.71"
+ CHECK_CACHED_VAR BOOST_INCLUDEDIR PATH "boost-1.71"
+ CHECK_CACHED_VAR BOOST_LIBRARYDIR PATH "boost-1.71/${LIB_BUILT_DIR}"
+ CHECK_CACHED_VAR Boost_COMPILER STRING "-vc141" DOC "vcid (eg: -vc110 for MSVC11)" # NOTE: if it doesnt work, uncomment this option and set the right value for VisualC id
+ )
+
+ option(BOOST_MINIMAL_VERSION "Only get minimal Boost dependencies" ON)
+
+ if(${BOOST_MINIMAL_VERSION})
+ set(BOOST_MSVC14_ZIP "boost-1.71-ibr-minimal.7z")
+ else()
+ set(BOOST_MSVC14_ZIP "boost-1.71.7z")
+ endif()
+ else ()
+ message("There is no provided Boost library for your compiler, relying on find_package to find it")
+ endif()
+
+ sibr_addlibrary(NAME Boost VCID TIMEOUT 600 #VERBOSE ON
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/boost_1_55_0.7z"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/boost_1_55_0.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/${BOOST_MSVC14_ZIP}" # boost compatible with msvc14
+ MULTI_SET ${boost_multiset_arguments}
+ CHECK_CACHED_VAR Boost_NO_SYSTEM_PATHS BOOL ON DOC "Set to ON to disable searching in locations not specified by these boost cached hint variables"
+ CHECK_CACHED_VAR Boost_NO_BOOST_CMAKE BOOL ON DOC "Set to ON to disable the search for boost-cmake (package cmake config file if boost was built with cmake)"
+ )
+ if(NOT Boost_COMPILER AND Boost_WIN3RDPARTY_USE)
+ message(WARNING "Boost_COMPILER is not set and it's needed.")
+ endif()
+endif()
+
+find_package(Boost 1.65.0 REQUIRED COMPONENTS ${Boost_REQUIRED_COMPONENTS})
+# for CLUSTER
+##find_package(Boost 1.58.0 REQUIRED COMPONENTS ${Boost_REQUIRED_COMPONENTS})
+
+
+if(WIN32)
+    add_compile_options("$<$<COMPILE_LANGUAGE:CXX>:/EHsc>")
+ #add_definitions(/EHsc)
+endif()
+
+if(Boost_LIB_DIAGNOSTIC_DEFINITIONS)
+ add_definitions(${Boost_LIB_DIAGNOSTIC_DEFINITIONS})
+endif()
+
+#if(WIN32)
+ add_definitions(-DBOOST_ALL_DYN_LINK -DBOOST_ALL_NO_LIB)
+#endif()
+
+include_directories(${BOOST_INCLUDEDIR} ${Boost_INCLUDE_DIRS})
+link_directories(${BOOST_LIBRARYDIR} ${Boost_LIBRARY_DIRS})
+
+
+##############
+## Find OpenMP
+##############
+find_package(OpenMP)
+
+##############
+## Find OpenCV
+##############
+if (WIN32)
+ if (${MSVC_TOOLSET_VERSION} EQUAL 143)
+ MESSAGE("SPECIAL OPENCV HANDLING")
+ set(opencv_set_arguments
+ CHECK_CACHED_VAR OpenCV_DIR PATH "install" ## see OpenCVConfig.cmake
+ )
+ elseif (MSVC11 OR MSVC12)
+ set(opencv_set_arguments
+ CHECK_CACHED_VAR OpenCV_DIR PATH "opencv/build" ## see OpenCVConfig.cmake
+ )
+ elseif (MSVC14)
+ set(opencv_set_arguments
+ CHECK_CACHED_VAR OpenCV_DIR PATH "opencv-4.5.0/build" ## see OpenCVConfig.cmake
+ )
+ else ()
+ message("There is no provided OpenCV library for your compiler, relying on find_package to find it")
+ endif()
+else()
+ message("There is no provided OpenCV library for your compiler, relying on find_package to find it")
+endif()
+
+sibr_addlibrary(NAME OpenCV #VERBOSE ON
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/opencv.7z"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/opencv.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/opencv-4.5.0.7z" # opencv compatible with msvc14 and with contribs
+ MSVC17 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/opencv4-8.7z"
+ SET ${opencv_set_arguments}
+ )
+find_package(OpenCV 4.5 REQUIRED) ## Use directly the OpenCVConfig.cmake provided
+## FOR CLUSTER
+###find_package(OpenCV 4.5 REQUIRED PATHS "/data/graphdeco/share/opencv/usr/local/lib64/cmake/opencv4/" ) ## Use directly the OpenCVConfig.cmake provided
+
+## https://stackoverflow.com/questions/24262081/cmake-relwithdebinfo-links-to-debug-libs
+set_target_properties(${OpenCV_LIBS} PROPERTIES MAP_IMPORTED_CONFIG_RELWITHDEBINFO RELEASE)
+
+add_definitions(-DOPENCV_TRAITS_ENABLE_DEPRECATED)
+
+if(OpenCV_INCLUDE_DIRS)
+ foreach(inc ${OpenCV_INCLUDE_DIRS})
+ if(NOT EXISTS ${inc})
+ set(OpenCV_INCLUDE_DIR "" CACHE PATH "additional custom include DIR (in case of trouble to find it (fedora 17 opencv package))")
+ endif()
+ endforeach()
+ if(OpenCV_INCLUDE_DIR)
+ list(APPEND OpenCV_INCLUDE_DIRS ${OpenCV_INCLUDE_DIR})
+ include_directories(${OpenCV_INCLUDE_DIRS})
+ endif()
+endif()
+
+###################
+## Find GLFW
+###################
+sibr_addlibrary(
+ NAME glfw3
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/glfw-3.2.1.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/glfw-3.2.1.7z" # TODO SV: provide a valid version if required
+ SET CHECK_CACHED_VAR glfw3_DIR PATH "glfw-3.2.1"
+)
+
+### FOR CLUSTER COMMENT OUT lines above, uncomment lines below
+##find_package(GLFW REQUIRED 3.3 )
+##message("***********=============> GLFW IS " ${GLFW_LIBRARY})
+##message("***********=============> GLFW IS " ${GLFW_LIBRARIES})
+
+find_package(glfw3 REQUIRED)
+
+sibr_gitlibrary(TARGET imgui
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/imgui.git"
+ GIT_TAG "741fb3ab6c7e1f7cef23ad0501a06b7c2b354944"
+)
+
+## FOR CLUSTER COMMENT OUT nativefiledialog
+sibr_gitlibrary(TARGET nativefiledialog
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/nativefiledialog.git"
+ GIT_TAG "ae2fab73cf44bebdc08d997e307c8df30bb9acec"
+)
+
+
+sibr_gitlibrary(TARGET mrf
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/mrf.git"
+ GIT_TAG "30c3c9494a00b6346d72a9e37761824c6f2b7207"
+)
+
+sibr_gitlibrary(TARGET nanoflann
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/nanoflann.git"
+ GIT_TAG "7a20a9ac0a1d34850fc3a9e398fc4a7618e8a69a"
+)
+
+sibr_gitlibrary(TARGET picojson
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/picojson.git"
+ GIT_TAG "7cf8feee93c8383dddbcb6b64cf40b04e007c49f"
+)
+
+sibr_gitlibrary(TARGET rapidxml
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/rapidxml.git"
+ GIT_TAG "069e87f5ec5ce1745253bd64d89644d6b894e516"
+)
+
+sibr_gitlibrary(TARGET xatlas
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/xatlas.git"
+ GIT_TAG "0fbe06a5368da13fcdc3ee48d4bdb2919ed2a249"
+ INCLUDE_DIRS "source/xatlas"
+)
+
+Win3rdPartyGlobalCacheAction()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/downloadAndExtractZipFile.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/downloadAndExtractZipFile.cmake
new file mode 100644
index 0000000..7f5fc2b
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/downloadAndExtractZipFile.cmake
@@ -0,0 +1,243 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## downloadAndExtractZipFile cmake function
+## Provides a way to download a zip file from a public internet ZIP_URL host
+## and to extract it to a specific EXCTRATED_ZIP_PATH destination.
+## This function uses the external 7-Zip tool to maximize the number of compatible formats.
+## The file will not be downloaded again if EXCTRATED_ZIP_PATH already exists and DL_FORCE is set to OFF.
+## It will try to unzip the file if one already exists at ZIP_DL_PATH.
+##
+## If EXCTRATED_ZIP_PATH and/or ZIP_DL_PATH are not full paths,
+## they will be interpreted relative to CMAKE_BINARY_DIR.
+##
+## Usage example :
+## include(downloadAndExtractZipFile)
+## downloadAndExtractZipFile(
+## http://www.cs.cornell.edu/~snavely/bundler/distr/bundler-v0.4-source.zip
+## ${CMAKE_BINARY_DIR}/Bundler/bundler-v0.4-source.zip
+## ${CMAKE_BINARY_DIR}/Bundler
+## [DL_FORCE ON|OFF]
+## [TIMEOUT]
+## [CHECK_DIRTY_URL]
+## )
+##
+## option DL_FORCE will re-download the zip file [default to OFF]
+## option TIMEOUT will end the unzip process after this period of time [default to 600s]
+## option CHECK_DIRTY_URL will write the downloaded URL into the given file; then,
+## next time, if the URL has been updated, this file lets us detect it
+## and the latest version will be downloaded. This prevents having to always set DL_FORCE to ON manually...
+##
+if(__downloadAndExtractZipFile_cmake_INCLUDED__)
+ return()
+else()
+ set(__downloadAndExtractZipFile_cmake_INCLUDED__ ON)
+endif()
+
+function(downloadAndExtractZipFile ZIP_URL ZIP_DL_PATH EXCTRATED_ZIP_PATH)
+
+ # message(STATUS "zipUrl=${ZIP_URL} zipDlPath=${ZIP_DL_PATH} extractedZipPath=${EXCTRATED_ZIP_PATH}")
+ cmake_parse_arguments(dwnlezf "" "VERBOSE;DL_FORCE;TIMEOUT;CHECK_DIRTY_URL" "" ${ARGN})
+
+ set(PROGRAMFILESx86 "PROGRAMFILES(x86)")
+
+ ## Check entries mandatory args
+ if(IS_ABSOLUTE "${ZIP_DL_PATH}")
+ else()
+ set(ZIP_DL_PATH "${CMAKE_BINARY_DIR}/${ZIP_DL_PATH}")
+ endif()
+ if(IS_ABSOLUTE "${EXCTRATED_ZIP_PATH}")
+ else()
+ set(EXCTRATED_ZIP_PATH "${CMAKE_BINARY_DIR}/${EXCTRATED_ZIP_PATH}")
+ endif()
+ if(NOT EXISTS "${EXCTRATED_ZIP_PATH}")
+ file(MAKE_DIRECTORY ${EXCTRATED_ZIP_PATH})
+ endif()
+
+    # SB: Once, one of the downloaded zips was corrupted by an error message coming from the server.
+    if(EXISTS "${ZIP_DL_PATH}")
+        # So we remove such possibly corrupted files
+        message("Removing previous ${ZIP_DL_PATH} (might be corrupted)")
+        file(REMOVE "${ZIP_DL_PATH}")
+        if(EXISTS "${dwnlezf_CHECK_DIRTY_URL}")
+            # and remove the previously generated (possibly corrupted) 'Win3rdPartyUrl' file
+            file(REMOVE "${dwnlezf_CHECK_DIRTY_URL}")
+ endif()
+ endif()
+
+ ## Check entries optional args
+ macro(readDirtyUrl )
+ if(dwnlezf_CHECK_DIRTY_URL)
+ if(IS_ABSOLUTE "${dwnlezf_CHECK_DIRTY_URL}")
+ else()
+ set(dwnlezf_CHECK_DIRTY_URL "${CMAKE_BINARY_DIR}/${dwnlezf_CHECK_DIRTY_URL}")
+ endif()
+ get_filename_component(unzipDir ${EXCTRATED_ZIP_PATH} NAME)
+ get_filename_component(unzipPath ${EXCTRATED_ZIP_PATH} PATH)
+ message(STATUS "Checking ${unzipDir} [from ${unzipPath}]...")
+ if(EXISTS "${dwnlezf_CHECK_DIRTY_URL}")
+ get_filename_component(CHECK_DIRTY_URL_FILENAME ${dwnlezf_CHECK_DIRTY_URL} NAME)
+ file(STRINGS "${dwnlezf_CHECK_DIRTY_URL}" contents)
+ list(GET contents 0 downloadURL)
+ list(REMOVE_AT contents 0)
+ if("${downloadURL}" MATCHES "${ZIP_URL}")
+ if(dwnlezf_VERBOSE)
+                        message(STATUS "Your downloaded version (URL) seems to be up to date. Checking that nothing is missing... (see ${dwnlezf_CHECK_DIRTY_URL}).")
+ endif()
+ file(GLOB PATHNAME_PATTERN_LIST "${EXCTRATED_ZIP_PATH}/*") ## is there something inside the downloaded destination ?
+ unset(NAME_PATTERN_LIST)
+ foreach(realPathPattern ${PATHNAME_PATTERN_LIST})
+ get_filename_component(itemName ${realPathPattern} NAME)
+ list(APPEND NAME_PATTERN_LIST ${itemName})
+ endforeach()
+ if(NAME_PATTERN_LIST)
+ foreach(item ${contents})
+ list(FIND NAME_PATTERN_LIST ${item} id)
+ if(${id} MATCHES "-1")
+ message(STATUS "${item} is missing, your downloaded version content changed, need to redownload it.")
+ set(ZIP_DL_FORCE ON)
+ break()
+ else()
+ list(REMOVE_AT NAME_PATTERN_LIST ${id})
+ set(ZIP_DL_FORCE OFF)
+ endif()
+ endforeach()
+ if(NOT ZIP_DL_FORCE AND NAME_PATTERN_LIST)
+                            message("Yours seems to be up to date (according to ${CHECK_DIRTY_URL_FILENAME})!\nBut there are additional files/folders in your downloaded destination (feel free to clean them if you want).")
+ foreach(item ${NAME_PATTERN_LIST})
+ if(item)
+ message("${item}")
+ endif()
+ endforeach()
+ endif()
+ endif()
+ else()
+ set(ZIP_DL_FORCE ON)
+ message(STATUS "Your downloaded version is dirty (too old).")
+ endif()
+ else()
+ file(GLOB PATHNAME_PATTERN_LIST "${EXCTRATED_ZIP_PATH}/*") ## is there something inside the downloaded destination ?
+ if(NOT PATHNAME_PATTERN_LIST)
+                message("Nothing was found in ${EXCTRATED_ZIP_PATH}, we will try to download it for you now.")
+ endif()
+ set(ZIP_DL_FORCE ON)
+ endif()
+ endif()
+ endmacro()
+ readDirtyUrl()
+ if(NOT ZIP_DL_FORCE)
+        return() ## no need to go further (we are up to date), just exit the function
+ endif()
+
+ macro(writeDirtyUrl )
+ if(dwnlezf_CHECK_DIRTY_URL)
+ file(WRITE "${dwnlezf_CHECK_DIRTY_URL}" "${ZIP_URL}\n")
+ file(GLOB PATHNAME_PATTERN_LIST "${EXCTRATED_ZIP_PATH}/*") ## is there something inside the downloaded destination ?
+ unset(NAME_PATTERN_LIST)
+ foreach(realPathPattern ${PATHNAME_PATTERN_LIST})
+ get_filename_component(itemName ${realPathPattern} NAME)
+ list(APPEND NAME_PATTERN_LIST ${itemName})
+ endforeach()
+ if(NAME_PATTERN_LIST)
+ foreach(item ${NAME_PATTERN_LIST})
+ file(APPEND "${dwnlezf_CHECK_DIRTY_URL}" "${item}\n")
+ endforeach()
+ endif()
+ endif()
+ endmacro()
+
+ if(dwnlezf_DL_FORCE)
+ set(ZIP_DL_FORCE ON)
+ endif()
+
+ if(NOT dwnlezf_TIMEOUT)
+ set(dwnlezf_TIMEOUT 600)
+ endif()
+ math(EXPR dwnlezf_TIMEOUT_MIN "${dwnlezf_TIMEOUT}/60")
+
+ macro(unzip whichZipFile)
+ if(NOT SEVEN_ZIP_CMD)
+ find_program(SEVEN_ZIP_CMD NAMES 7z 7za p7zip DOC "7-zip executable" PATHS "$ENV{PROGRAMFILES}/7-Zip" "$ENV{${PROGRAMFILESx86}}/7-Zip" "$ENV{ProgramW6432}/7-Zip")
+ endif()
+ if(SEVEN_ZIP_CMD)
+ if(dwnlezf_VERBOSE)
+                message(STATUS "UNZIP: please wait until ${SEVEN_ZIP_CMD} finishes...\n(no more than ${dwnlezf_TIMEOUT_MIN} min)")
+ else()
+ message(STATUS "UNZIP...wait...")
+ endif()
+ execute_process( COMMAND ${SEVEN_ZIP_CMD} x ${whichZipFile} -y
+ WORKING_DIRECTORY ${EXCTRATED_ZIP_PATH} TIMEOUT ${dwnlezf_TIMEOUT}
+ RESULT_VARIABLE resVar OUTPUT_VARIABLE outVar ERROR_VARIABLE errVar
+ )
+ if(${resVar} MATCHES "0")
+ if(dwnlezf_VERBOSE)
+                    message(STATUS "SUCCESS: unzipped into ${EXCTRATED_ZIP_PATH}. Now we can remove the downloaded zip file.")
+ endif()
+ execute_process(COMMAND ${CMAKE_COMMAND} -E remove ${whichZipFile})
+ mark_as_advanced(SEVEN_ZIP_CMD)
+ else()
+                message(WARNING "Something went wrong in ${EXCTRATED_ZIP_PATH}\n with \"${SEVEN_ZIP_CMD} x ${whichZipFile} -y\", retry or try to unzip it yourself...")
+ message("unzip: resVar=${resVar}")
+ message("unzip: outVar=${outVar}")
+ message("unzip: errVar=${errVar}")
+ message("unzip: failed or canceled or timeout")
+ endif()
+ else()
+ message(WARNING "You need 7zip (http://www.7-zip.org/download.html) to unzip the downloaded dir.")
+ set(SEVEN_ZIP_CMD "" CACHE FILEPATH "7-zip executable")
+ mark_as_advanced(CLEAR SEVEN_ZIP_CMD)
+ endif()
+ endmacro()
+
+ if(dwnlezf_VERBOSE)
+        message(STATUS "Checking ${ZIP_DL_PATH} for an existing zip file...")
+ endif()
+ if(EXISTS "${ZIP_DL_PATH}")
+
+ ## already downloaded, so just unzip it
+ unzip(${ZIP_DL_PATH})
+ writeDirtyUrl()
+
+ elseif(ZIP_DL_FORCE)
+
+ ## the download part (+ unzip)
+        message(STATUS "Trying to download the package for you: ${ZIP_URL}")
+ if(dwnlezf_VERBOSE)
+ message(STATUS "Downloading...\n SRC=${ZIP_URL}\n DEST=${ZIP_DL_PATH}.tmp\n INACTIVITY_TIMEOUT=180s")
+ endif()
+ file(DOWNLOAD ${ZIP_URL} ${ZIP_DL_PATH}.tmp INACTIVITY_TIMEOUT 360 STATUS status SHOW_PROGRESS)
+
+ list(GET status 0 numResult)
+ if(${numResult} MATCHES "0")
+
+ if(dwnlezf_VERBOSE)
+            message(STATUS "Download succeeded, renaming the tmp file before unzipping it")
+ endif()
+ execute_process(COMMAND ${CMAKE_COMMAND} -E rename ${ZIP_DL_PATH}.tmp ${ZIP_DL_PATH})
+ unzip(${ZIP_DL_PATH})
+ writeDirtyUrl()
+
+ else()
+
+ list(GET status 1 errMsg)
+ message(WARNING "DOWNLOAD ${ZIP_URL} to ${ZIP_DL_PATH} failed\n:${errMsg}")
+        message(WARNING "You need to download ${ZIP_URL} manually and put it at ${ZIP_DL_PATH}")
+ message("Take a look at the project website page to check available URL.")
+
+ endif()
+
+ endif()
+
+ ## clean up the tmp downloaded file
+ if(EXISTS "${ZIP_DL_PATH}.tmp")
+ execute_process(COMMAND ${CMAKE_COMMAND} -E remove ${ZIP_DL_PATH}.tmp)
+ endif()
+
+endfunction()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/git_describe.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/git_describe.cmake
new file mode 100644
index 0000000..638d70b
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/git_describe.cmake
@@ -0,0 +1,114 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+if(__git_describe_INCLUDED__)
+ return()
+else()
+ set(__git_describe_INCLUDED__ ON)
+endif()
+
+find_package(Git)
+if(Git_FOUND)
+ message(STATUS "Git found: ${GIT_EXECUTABLE}")
+else()
+ message(FATAL_ERROR "Git not found. Aborting")
+endif()
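+
+## Usage example (a sketch; MY_BRANCH and MY_COMMIT are hypothetical output variable names):
+##   git_describe(
+##       PATH ${CMAKE_SOURCE_DIR}
+##       GIT_BRANCH MY_BRANCH
+##       GIT_COMMIT_HASH MY_COMMIT
+##   )
+##   message(STATUS "Building ${MY_BRANCH}@${MY_COMMIT}")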
+
+macro(git_describe)
+ cmake_parse_arguments(GIT_DESCRIBE "" "GIT_URL;GIT_BRANCH;GIT_COMMIT_HASH;GIT_TAG;GIT_VERSION;PATH" "" ${ARGN})
+
+ if(NOT GIT_DESCRIBE_PATH)
+ set(GIT_DESCRIBE_PATH ${CMAKE_SOURCE_DIR})
+ endif()
+
+ if(GIT_DESCRIBE_GIT_URL)
+ # Get the current remote
+ execute_process(
+ COMMAND git remote
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE GIT_DESCRIBE_GIT_REMOTE
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+
+        # Get the URL of the current remote
+ execute_process(
+ COMMAND git remote get-url ${GIT_DESCRIBE_GIT_REMOTE}
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_URL}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ endif()
+
+ if(GIT_DESCRIBE_GIT_BRANCH)
+ # Get the current working branch
+ execute_process(
+ COMMAND git rev-parse --abbrev-ref HEAD
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_BRANCH}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ endif()
+
+ if(GIT_DESCRIBE_GIT_COMMIT_HASH)
+ # Get the latest abbreviated commit hash of the working branch
+ execute_process(
+ COMMAND git rev-parse HEAD
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_COMMIT_HASH}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ endif()
+
+ if(GIT_DESCRIBE_GIT_TAG)
+ # Get the tag
+ execute_process(
+ COMMAND git describe --tags --exact-match
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_TAG}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ endif()
+
+ if(GIT_DESCRIBE_GIT_VERSION)
+ # Get the version from git describe
+ execute_process(
+ COMMAND git describe
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_VERSION}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+
+ if(${GIT_DESCRIBE_GIT_VERSION} STREQUAL "")
+ execute_process(
+ COMMAND git rev-parse --abbrev-ref HEAD
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE GIT_DESCRIBE_GIT_VERSION_BRANCH
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ execute_process(
+ COMMAND git log -1 --format=%h
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE GIT_DESCRIBE_GIT_VERSION_COMMIT
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+
+ set(${GIT_DESCRIBE_GIT_VERSION} "${GIT_DESCRIBE_GIT_VERSION_BRANCH}-${GIT_DESCRIBE_GIT_VERSION_COMMIT}")
+ endif()
+ endif()
+
+endmacro()
\ No newline at end of file
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/include_once.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/include_once.cmake
new file mode 100644
index 0000000..d28b39c
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/include_once.cmake
@@ -0,0 +1,22 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
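+## Usage example (a sketch; the included file path is hypothetical):
+##   include_once(${CMAKE_CURRENT_LIST_DIR}/git_describe.cmake)
+## A second include_once of the same file is then a no-op, thanks to the global property guard.
+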
+macro(include_once file)
+ get_filename_component(INCLUDE_ONCE_FILEPATH ${file} REALPATH)
+ string(REGEX REPLACE "(\\.|\\/+|\\:|\\\\+)" "_" INCLUDE_ONCE_FILEPATH ${INCLUDE_ONCE_FILEPATH})
+ get_property(INCLUDED_${INCLUDE_ONCE_FILEPATH}_LOCAL GLOBAL PROPERTY INCLUDED_${INCLUDE_ONCE_FILEPATH})
+ if (INCLUDED_${INCLUDE_ONCE_FILEPATH}_LOCAL)
+ return()
+ else()
+ set_property(GLOBAL PROPERTY INCLUDED_${INCLUDE_ONCE_FILEPATH} true)
+
+ include(${file})
+ endif()
+endmacro()
\ No newline at end of file
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/install_runtime.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/install_runtime.cmake
new file mode 100644
index 0000000..695c4b3
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/install_runtime.cmake
@@ -0,0 +1,887 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## This file is mainly used to allow runtime installation.
+## There are some utility cmake functions to ease generic deployment (abstracting common cmake usage)...
+##
+## You cannot run your program automatically from your CMAKE_BINARY_DIR when you build,
+## as it will miss all dependencies and resource files...
+## You have to run the install target in order to test your program.
+##
+## The only function/macro you may use inside your sub-CMakeLists.txt (sub-project) is:
+## ******************
+## ibr_install_target macro => see documentation at the end of this file
+## ******************
+## It uses these utility cmake functions to abstract the installation in a uniform way for all sub-projects.
+##
+if(__install_runtime_cmake_INCLUDED__)
+ return()
+else()
+ set(__install_runtime_cmake_INCLUDED__ ON)
+endif()
+
+
+##
+## Allows writing a resource config file which contains additional resource paths
+## (used by the IBR_Common resource system to load shaders and potentially images, plugins and so on)
+##
+## ADD option lists all the paths to add to the file (relative paths are interpreted relative to the working dir of the executable)
+## INSTALL option specifies where we want to install this file
+##
+## Example usage:
+## resourceFile(ADD "shaders" "${PROJECT_NAME}_rsc" INSTALL bin)
+##
+macro(resourceFile)
+ cmake_parse_arguments(rsc "" "INSTALL;FILE_PATH;CONFIG_TYPE" "ADD" ${ARGN}) ## both args are directory path
+
+ if(rsc_ADD)
+ unset(IBR_RSC_FILE_CONTENT_LIST)
+ if(EXISTS "${rsc_FILE_PATH}")
+ file(READ "${rsc_FILE_PATH}" IBR_RSC_FILE_CONTENT)
+ string(REGEX REPLACE "\n" ";" IBR_RSC_FILE_CONTENT_LIST "${IBR_RSC_FILE_CONTENT}")
+ endif()
+ list(APPEND IBR_RSC_FILE_CONTENT_LIST "${rsc_ADD}")
+ list(REMOVE_DUPLICATES IBR_RSC_FILE_CONTENT_LIST)
+ file(WRITE "${rsc_FILE_PATH}" "")
+ foreach(rscDir ${IBR_RSC_FILE_CONTENT_LIST})
+ file(APPEND "${rsc_FILE_PATH}" "${rscDir}\n")
+ endforeach()
+ unset(rsc_ADD)
+ endif()
+
+ if(rsc_INSTALL)
+ install(FILES ${rsc_FILE_PATH} CONFIGURATIONS ${rsc_CONFIG_TYPE} DESTINATION ${rsc_INSTALL})
+ unset(rsc_INSTALL)
+ endif()
+endmacro()
+
+
+##
+## Install the generated *.pdb file for the current cmake project,
+## assuming the output target name is the cmake project name.
+## This macro is useful for cross-platform multi-config mode.
+##
+## Usage Example:
+##
+## if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+## installPDB(${PROJECT_NAME} ${CMAKE_BUILD_TYPE} RUNTIME_DEST bin ARCHIVE_DEST lib LIBRARY_DEST lib)
+## endif()
+## foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+## installPDB(${PROJECT_NAME} ${CONFIG_TYPES} RUNTIME_DEST bin ARCHIVE_DEST lib LIBRARY_DEST lib)
+## endforeach()
+##
+macro(installPDB targetName configType)
+ cmake_parse_arguments(instpdb "" "COMPONENT" "ARCHIVE_DEST;LIBRARY_DEST;RUNTIME_DEST" ${ARGN}) ## both args are directory path
+
+ if(NOT MSVC)
+ return()
+ endif()
+
+    ## Check that DESTINATIONs are provided according to the TYPE of the given target (see the install command doc for the correspondences)
+ get_target_property(type ${targetName} TYPE)
+ if(${type} MATCHES "EXECUTABLE" AND instpdb_RUNTIME_DEST)
+ set(pdb_DESTINATION ${instpdb_RUNTIME_DEST})
+ elseif(${type} MATCHES "STATIC_LIBRARY" AND instpdb_ARCHIVE_DEST)
+ set(pdb_DESTINATION ${instpdb_ARCHIVE_DEST})
+ elseif(${type} MATCHES "MODULE_LIBRARY" AND instpdb_LIBRARY_DEST)
+ set(pdb_DESTINATION ${instpdb_LIBRARY_DEST})
+ elseif(${type} MATCHES "SHARED_LIBRARY")
+ if(WIN32 AND instpdb_RUNTIME_DEST)
+ set(pdb_DESTINATION ${instpdb_RUNTIME_DEST})
+ else()
+ set(pdb_DESTINATION ${instpdb_LIBRARY_DEST})
+ endif()
+ endif()
+
+ if(NOT pdb_DESTINATION)
+ set(pdb_DESTINATION bin) ## default destination of the pdb file
+ endif()
+
+ if(NOT instpdb_COMPONENT)
+ set(instpdb_COMPONENT )
+ else()
+ set(instpdb_COMPONENT COMPONENT ${instpdb_COMPONENT})
+ endif()
+
+ string(TOUPPER ${configType} CONFIG_TYPES_UC)
+ get_target_property(PDB_PATH ${targetName} PDB_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC})
+
+ get_target_property(confModePostfix ${targetName} ${CONFIG_TYPES_UC}_POSTFIX)
+ if(NOT confModePostfix)
+ set(confModePostfix "")
+ endif()
+ set_target_properties(${targetName} PROPERTIES PDB_NAME_${CONFIG_TYPES_UC} ${targetName}${confModePostfix})
+ get_target_property(PDB_NAME ${targetName} PDB_NAME_${CONFIG_TYPES_UC})# if not set, this is empty
+
+ if(EXISTS "${PDB_PATH}/${PDB_NAME}.pdb")
+ install(FILES "${PDB_PATH}/${PDB_NAME}.pdb" CONFIGURATIONS ${configType} DESTINATION ${pdb_DESTINATION} ${instpdb_COMPONENT} OPTIONAL)
+ endif()
+endmacro()
+
+
+##
+## Add an additional target to install a project independently, based on its component.
+## configMode is used to prevent default Release-only installation (we also want to install other build/config types).
+##
+macro(installTargetProject targetOfProject targetOfInstallProject)
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ set(configMode ${CMAKE_BUILD_TYPE})
+ elseif(MSVC)
+ ## $(Configuration) will be one of the following : Debug, Release, MinSizeRel, RelWithDebInfo
+ set(configMode $(Configuration))
+ endif()
+ if(configMode)
+ get_target_property(srcFiles ${targetOfProject} SOURCES)
+ add_custom_target( ${targetOfInstallProject} #ALL
+ ${CMAKE_COMMAND} -DBUILD_TYPE=${configMode} -DCOMPONENT=${targetOfInstallProject} -P ${CMAKE_BINARY_DIR}/cmake_install.cmake
+ DEPENDS ${srcFiles}
+ COMMENT "run the installation only for ${targetOfProject}" VERBATIM
+ )
+ add_dependencies(${targetOfInstallProject} ${targetOfProject})
+
+ get_target_property(INSTALL_BUILD_FOLDER ${targetOfProject} FOLDER)
+ set_target_properties(${targetOfInstallProject} PROPERTIES FOLDER ${INSTALL_BUILD_FOLDER})
+ endif()
+endmacro()
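+
+## Example (hypothetical usage; 'myApp' is a placeholder target):
+## add_executable(myApp main.cpp)
+## installTargetProject(myApp myApp_install)
+## ## then 'cmake --build . --target myApp_install' builds myApp and runs its component install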
+
+# Collect all currently added targets in all subdirectories
+#
+# Parameters:
+# - _result the list containing all found targets
+# - _dir root directory to start looking from
+function(get_all_targets _result _dir)
+ get_property(_subdirs DIRECTORY "${_dir}" PROPERTY SUBDIRECTORIES)
+ foreach(_subdir IN LISTS _subdirs)
+ get_all_targets(${_result} "${_subdir}")
+ endforeach()
+
+ get_directory_property(_sub_targets DIRECTORY "${_dir}" BUILDSYSTEM_TARGETS)
+ set(${_result} ${${_result}} ${_sub_targets} PARENT_SCOPE)
+endfunction()
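+
+## Example (hypothetical usage; the directory is a placeholder and must already be known to cmake):
+## get_all_targets(allTargets "${CMAKE_SOURCE_DIR}/src")
+## message(STATUS "found targets: ${allTargets}")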
+
+##
+## Add targets for building and installing subdirectories
+macro(subdirectory_target target directory build_folder)
+ add_custom_target(${target}
+ COMMENT "run build for all projects in this directory" VERBATIM
+ )
+ get_all_targets(ALL_TARGETS ${directory})
+ add_dependencies(${target} ${ALL_TARGETS})
+ add_custom_target(${target}_install
+ ${CMAKE_COMMAND} -DBUILD_TYPE=$ -DCOMPONENT=${target}_install -P ${CMAKE_BINARY_DIR}/cmake_install.cmake
+ COMMENT "run install for all projects in this directory" VERBATIM
+ )
+ add_dependencies(${target}_install ${target})
+
+ set_target_properties(${target} PROPERTIES FOLDER ${build_folder})
+ set_target_properties(${target}_install PROPERTIES FOLDER ${build_folder})
+endmacro()
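+
+## Example (hypothetical usage; 'projects' is a placeholder directory already added via add_subdirectory):
+## subdirectory_target(projects_all ${CMAKE_CURRENT_SOURCE_DIR}/projects "projects")
+## ## creates targets 'projects_all' (build) and 'projects_all_install' (install)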
+
+
+## CMAKE install all required dependencies for an application (included system OS files like msvc*.dll for example)
+##
+## install_runtime(
+## [TARGET name]
+## [PLUGINS name [nameN ...] [PLUGIN_PATH_NAME currentPathName [FROM_REL_PATH matchDirFromCurrentPathName] [PLUGIN_PATH_DEST installDir] ]
+## [PLUGINS ...]
+## [DIRS path [pathN ...] ]
+## [TARGET_LIBRARIES filePath [filePathN ...] ]
+## [TARGET_PACKAGES packageName [packageNameN ...] ]
+## [COMPONENT installComponentName]
+## [PLAUSIBLES_POSTFIX Debug_postfix [MinSizeRel_postfix relWithDebInfo_postfix ...] ]
+## [VERBOSE]
+## )
+##
+## installedFilePathTargetAppToResolve : the absolute file path of the final installed targetApp you want to resolve
+##
+## TARGET : The target app we want to install. If given, it is used to look for link library paths (best choice, strongly advised to use it)
+##
+## PLUGINS : Some applications use/load plugins which cannot be detected inside their binary,
+## so here you can specify which plugins the application uses/loads, in order to install them
+## and also resolve their dependencies.
+## With PLUGINS, multi FLAGS :
+## PLUGIN_PATH_NAME : The full file path of the plugin we want to install
+## FROM_REL_PATH : [optional: by default only the file is kept] the matching dir of the plugin path from which we install (keeping the directory structure)
+## PLUGIN_PATH_DEST : [optional: default is relative to the executable directory] full path of the install directory where the plugin file will be installed
+##
+## DIRS : A list of directories to look into for dependencies
+## TARGET_LIBRARIES : DEPRECATED (use the TARGET flag instead) : The cmake content variables used for target_link_libraries(...)
+## TARGET_PACKAGES : DEPRECATED (use the TARGET flag instead) : The cmake package names used by find_package(...) for your targetApp
+## ADVICE: this flag adds entries to the cache (like <packageName>_DIR); it can be useful to fill these variables!
+## COMPONENT : (defaults to runtime) The component name associated with the installation
+## It is used when you want to install some parts of your projects separately (see the cmake install doc)
+## VERBOSE : For debugging, or to get more information in the output console
+##
+## Usage:
+## install_runtime(${CMAKE_INSTALL_PREFIX}/${EXECUTABLE_NAME}${CMAKE_EXECUTABLE_SUFFIX}
+## VERBOSE
+## TARGET ${PROJECT_NAME}
+## PLAUSIBLES_POSTFIX _d
+## PLUGINS
+## PLUGIN_PATH_NAME ${PLUGIN_PATH_NAME}${CMAKE_SHARED_MODULE_SUFFIX} ## will be installed (default exec path if no PLUGINS_DEST) and then will be resolved
+## FROM_REL_PATH plugins ## optional, used especially for keeping qt plugins tree structure
+## PLUGIN_PATH_DEST ${CMAKE_INSTALL_PREFIX}/plugins ## (or relative path 'plugins' will be interpreted relative to installed executable)
+## DIRS ${CMAKE_CURRENT_BINARY_DIR} ${CMAKE_BINARY_DIR}
+## TARGET_LIBRARIES ${OPENGL_LIBRARIES} ## DEPRECATED (use TARGET flag instead)
+## ${GLEW_LIBRARIES}
+## ${GLUT_LIBRARIES}
+## ${Boost_LIBRARIES}
+## ${SuiteSparse_LIBRARIES}
+## ${CGAL_LIBRARIES}
+## TARGET_PACKAGES OPENGL ## DEPRECATED (use TARGET flag instead)
+## GLEW
+## GLUT
+## CGAL
+## Boost
+## SuiteSparse
+## )
+##
+## For the plugins part, it uses our internal parse_arguments_multi.cmake
+##
+function(install_runtime installedFilePathTargetAppToResolve)
+ set(optionsArgs "VERBOSE")
+ set(oneValueArgs "COMPONENT;INSTALL_FOLDER;CONFIG_TYPE")
+ set(multiValueArgs "DIRS;PLUGINS;TARGET_LIBRARIES;TARGET_PACKAGES;TARGET;PLAUSIBLES_POSTFIX")
+ cmake_parse_arguments(inst_run "${optionsArgs}" "${oneValueArgs}" "${multiValueArgs}" ${ARGN} )
+
+ if(NOT IS_ABSOLUTE ${installedFilePathTargetAppToResolve})
+ set(installedFilePathTargetAppToResolve ${inst_run_INSTALL_FOLDER}/${installedFilePathTargetAppToResolve})
+ endif()
+
+ get_filename_component(EXEC_NAME ${installedFilePathTargetAppToResolve} NAME_WE)
+ get_filename_component(EXEC_PATH ${installedFilePathTargetAppToResolve} PATH)
+
+ if(NOT inst_run_COMPONENT)
+ set(inst_run_COMPONENT runtime)
+ endif()
+
+
+ ## Try to append as many paths as possible to find dependencies (deprecated since we can use target properties to get the paths back)
+ set(libPaths )
+ foreach(libraryFileName ${inst_run_TARGET_LIBRARIES})
+ if(IS_DIRECTORY "${libraryFileName}")
+ list(APPEND libPaths "${libraryFileName}")
+ else()
+ get_filename_component(libpath "${libraryFileName}" PATH)
+ if(EXISTS "${libpath}")
+ list(APPEND libPaths "${libpath}")
+ endif()
+ endif()
+ endforeach()
+
+ ## This macro is used internally here to recursively get the paths of the LINK_LIBRARIES of each non-imported target
+ ## Typically, if you have 2 internal dependencies between cmake targets, we want cmake to be able to get back the paths where these dependencies are
+ macro(recurseDepList target)
+ get_target_property(linkLibs ${target} LINK_LIBRARIES)
+ foreach(lib ${linkLibs})
+ string(FIND ${lib} ">" strId) ## is cmake using a generator expression?
+ if(TARGET ${lib})
+ ## Skipping interface libraries as they're system ones
+ get_target_property(type ${lib} TYPE)
+ get_target_property(imported ${lib} IMPORTED)
+ if(type STREQUAL "INTERFACE_LIBRARY")
+ get_target_property(imp_loc ${lib} INTERFACE_IMPORTED_LOCATION)
+ if(imp_loc)
+ get_filename_component(imp_loc ${imp_loc} PATH)
+ list(APPEND targetLibPath ${imp_loc})
+ endif()
+ get_target_property(loc ${lib} INTERFACE_LOCATION)
+ if(loc)
+ get_filename_component(loc ${loc} PATH)
+ list(APPEND targetLibPath ${loc})
+ endif()
+ ## it's not a path but a single target name
+ ## for build targets which are part of the current cmake configuration : nothing to do, as cmake already knows the output path
+ ## for imported targets, we need to look for their imported location
+ elseif(imported)
+ get_target_property(imp_loc ${lib} IMPORTED_LOCATION)
+ if(imp_loc)
+ get_filename_component(imp_loc ${imp_loc} PATH)
+ list(APPEND targetLibPath ${imp_loc})
+ endif()
+ get_target_property(loc ${lib} LOCATION)
+ if(loc)
+ get_filename_component(loc ${loc} PATH)
+ list(APPEND targetLibPath ${loc})
+ endif()
+ else()
+ recurseDepList(${lib})
+ endif()
+ elseif(NOT ${strId} MATCHES -1) ## means cmake is using a generator expression (CMake version > 3.0)
+ string(REGEX MATCH ">:[@A-Za-z_:/.0-9-]+" targetLibPath ${lib})
+ string(REGEX REPLACE ">:([@A-Za-z_:/.0-9-]+)" "\\1" targetLibPath ${targetLibPath})
+ get_filename_component(targetLibPath ${targetLibPath} PATH)
+ elseif(EXISTS ${lib})
+ set(targetLibPath ${lib})
+ get_filename_component(targetLibPath ${targetLibPath} PATH)
+ else()
+ #message(STATUS "[install_runtime] skip link library : ${lib} , of target ${target}")
+ endif()
+ if(targetLibPath)
+ list(APPEND targetLinkLibsPathList ${targetLibPath})
+ endif()
+ endforeach()
+ if(targetLinkLibsPathList)
+ list(REMOVE_DUPLICATES targetLinkLibsPathList)
+ endif()
+ endmacro()
+ if(inst_run_TARGET)
+ recurseDepList(${inst_run_TARGET})
+ if(targetLinkLibsPathList)
+ list(APPEND libPaths ${targetLinkLibsPathList})
+ endif()
+ endif()
+
+ if(libPaths)
+ list(REMOVE_DUPLICATES libPaths)
+ foreach(libPath ${libPaths})
+ get_filename_component(path ${libPath} PATH)
+ list(APPEND libPaths ${path})
+ endforeach()
+ endif()
+
+
+ ## possible special dir(s) according to the build system and OS
+ if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(BUILD_TYPES_FOR_DLL "x64")
+ if(WIN32)
+ list(APPEND BUILD_TYPES_FOR_DLL "Win64")
+ endif()
+ else()
+ set(BUILD_TYPES_FOR_DLL "x86")
+ if(WIN32)
+ list(APPEND BUILD_TYPES_FOR_DLL "Win32")
+ endif()
+ endif()
+
+
+ ## Try to append as many paths as possible to find dependencies (here, mainly for *.dll)
+ foreach(dir ${inst_run_DIRS} ${libPaths})
+ if(EXISTS "${dir}/bin")
+ list(APPEND inst_run_DIRS "${dir}/bin")
+ elseif(EXISTS "${dir}")
+ list(APPEND inst_run_DIRS "${dir}")
+ endif()
+ endforeach()
+ list(REMOVE_DUPLICATES inst_run_DIRS)
+ foreach(dir ${inst_run_DIRS})
+ if(EXISTS "${dir}")
+ list(APPEND argDirs ${dir})
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${dir}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND argDirs "${dir}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ foreach(OUTPUTCONFIG ${CMAKE_CONFIGURATION_TYPES}) ## for windows multi-generator (MSVC)
+ if(EXISTS "${dir}/${BUILD_TYPE_FOR_DLL}/${OUTPUTCONFIG}")
+ list(APPEND argDirs "${dir}/${BUILD_TYPE_FOR_DLL}/${OUTPUTCONFIG}")
+ endif()
+ endforeach()
+ if(CMAKE_BUILD_TYPE) ## for single generator (makefiles)
+ if(EXISTS "${dir}/${BUILD_TYPE_FOR_DLL}/${CMAKE_BUILD_TYPE}")
+ list(APPEND argDirs "${dir}/${BUILD_TYPE_FOR_DLL}/${CMAKE_BUILD_TYPE}")
+ endif()
+ endif()
+ endforeach()
+ foreach(OUTPUTCONFIG ${CMAKE_CONFIGURATION_TYPES}) ## for windows multi-generator (MSVC)
+ if(EXISTS "${dir}/${OUTPUTCONFIG}")
+ list(APPEND argDirs "${dir}/${OUTPUTCONFIG}")
+ endif()
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${dir}/${OUTPUTCONFIG}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND argDirs "${dir}/${OUTPUTCONFIG}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ endforeach()
+ endforeach()
+ if(CMAKE_BUILD_TYPE) ## for single generator (makefiles)
+ if(EXISTS "${dir}/${CMAKE_BUILD_TYPE}")
+ list(APPEND argDirs "${dir}/${CMAKE_BUILD_TYPE}")
+ endif()
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${dir}/${CMAKE_BUILD_TYPE}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND argDirs "${dir}/${CMAKE_BUILD_TYPE}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ endforeach()
+ endif()
+ endif()
+ endforeach()
+ if(argDirs)
+ list(REMOVE_DUPLICATES argDirs)
+ endif()
+
+
+ ## Try to append as many paths as possible to find dependencies (here, mainly for *.dll)
+ foreach(packageName ${inst_run_TARGET_PACKAGES})
+ if(EXISTS "${${packageName}_DIR}")
+ list(APPEND packageDirs ${${packageName}_DIR})
+ list(APPEND packageDirs ${${packageName}_DIR}/bin)
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ foreach(OUTPUTCONFIG ${CMAKE_CONFIGURATION_TYPES}) ## for windows multi-generator (MSVC)
+ if(EXISTS "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}/${OUTPUTCONFIG}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}/${OUTPUTCONFIG}")
+ endif()
+ endforeach()
+ if(CMAKE_BUILD_TYPE) ## for single generator (makefiles)
+ if(EXISTS "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}/${CMAKE_BUILD_TYPE}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}/${CMAKE_BUILD_TYPE}")
+ endif()
+ endif()
+ endforeach()
+ foreach(OUTPUTCONFIG ${CMAKE_CONFIGURATION_TYPES}) ## for windows multi-generator (MSVC)
+ if(EXISTS "${${packageName}_DIR}/bin/${OUTPUTCONFIG}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${OUTPUTCONFIG}")
+ endif()
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${${packageName}_DIR}/bin/${OUTPUTCONFIG}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${OUTPUTCONFIG}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ endforeach()
+ endforeach()
+ if(CMAKE_BUILD_TYPE) ## for single generator (makefiles)
+ if(EXISTS "${${packageName}_DIR}/bin/${CMAKE_BUILD_TYPE}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${CMAKE_BUILD_TYPE}")
+ endif()
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${${packageName}_DIR}/bin/${CMAKE_BUILD_TYPE}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${CMAKE_BUILD_TYPE}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ endforeach()
+ endif()
+ else()
+ set(${packageName}_DIR "$ENV{${packageName}_DIR}" CACHE PATH "${packageName}_DIR root directory for looking for dirs containing *.dll")
+ endif()
+ endforeach()
+ if(packageDirs)
+ list(REMOVE_DUPLICATES packageDirs)
+ endif()
+
+
+ set(dirsToLookFor "${EXEC_PATH}")
+ if(packageDirs)
+ list(APPEND dirsToLookFor ${packageDirs})
+ endif()
+ if(argDirs)
+ list(APPEND dirsToLookFor ${argDirs})
+ endif()
+ get_property(used_LINK_DIRECTORIES DIRECTORY PROPERTY LINK_DIRECTORIES)
+ if (used_LINK_DIRECTORIES)
+ list(APPEND dirsToLookFor ${used_LINK_DIRECTORIES})
+ list(REMOVE_DUPLICATES dirsToLookFor)
+ endif()
+
+
+ ## handle plugins
+ set(pluginsList "")
+ include(parse_arguments_multi) ## this module recursively processes the items of each sub-list [by default it prints messages]
+ function(parse_arguments_multi_function results)
+ cmake_parse_arguments(pamf "VERBOSE" "PLUGIN_PATH_DEST;FROM_REL_PATH;EXEC_PATH;COMPONENT" "" ${ARGN}) ## EXEC_PATH and COMPONENT are for exclusive internal use
+ list(REMOVE_DUPLICATES pamf_UNPARSED_ARGUMENTS)
+ foreach(PLUGIN_PATH_NAME ${pamf_UNPARSED_ARGUMENTS})
+ if(EXISTS ${PLUGIN_PATH_NAME})
+ if(IS_DIRECTORY ${PLUGIN_PATH_NAME})
+ if(pamf_VERBOSE)
+ message(WARNING "${PLUGIN_PATH_NAME} IS_DIRECTORY: cannot install a directory, please give a file path")
+ endif()
+ else()
+ if(NOT pamf_PLUGIN_PATH_DEST)
+ set(PLUGIN_PATH_DEST ${pamf_EXEC_PATH}) ## the default dest value
+ else()
+ set(PLUGIN_PATH_DEST ${pamf_PLUGIN_PATH_DEST})
+ endif()
+
+ if(pamf_FROM_REL_PATH)
+ file(TO_CMAKE_PATH ${PLUGIN_PATH_NAME} PLUGIN_PATH_NAME)
+ get_filename_component(PLUGIN_PATH ${PLUGIN_PATH_NAME} PATH)
+ unset(PLUGIN_PATH_LIST)
+ unset(PLUGIN_PATH_LIST_COUNT)
+ unset(PLUGIN_REL_PATH_LIST)
+ unset(PLUGIN_REL_PATH)
+ string(REPLACE "/" ";" PLUGIN_PATH_LIST ${PLUGIN_PATH}) ## create a list of dir
+ list(FIND PLUGIN_PATH_LIST ${pamf_FROM_REL_PATH} id)
+ list(LENGTH PLUGIN_PATH_LIST PLUGIN_PATH_LIST_COUNT)
+ if(${id} GREATER 0)
+ math(EXPR id "${id}+1") ## the matching dir itself is not included in the relative path
+ math(EXPR PLUGIN_PATH_LIST_COUNT "${PLUGIN_PATH_LIST_COUNT}-1") ## the end of the list
+ foreach(i RANGE ${id} ${PLUGIN_PATH_LIST_COUNT})
+ list(GET PLUGIN_PATH_LIST ${i} out)
+ list(APPEND PLUGIN_REL_PATH_LIST ${out})
+ endforeach()
+ foreach(dir ${PLUGIN_REL_PATH_LIST})
+ set(PLUGIN_REL_PATH "${PLUGIN_REL_PATH}/${dir}")
+ endforeach()
+ endif()
+ set(PLUGIN_PATH_DEST ${PLUGIN_PATH_DEST}${PLUGIN_REL_PATH})
+ endif()
+
+ install(FILES ${PLUGIN_PATH_NAME} CONFIGURATIONS ${inst_run_CONFIG_TYPE} DESTINATION ${PLUGIN_PATH_DEST} COMPONENT ${pamf_COMPONENT})
+ get_filename_component(pluginName ${PLUGIN_PATH_NAME} NAME)
+ if(NOT IS_ABSOLUTE ${PLUGIN_PATH_DEST})
+ set(PLUGIN_PATH_DEST ${inst_run_INSTALL_FOLDER}/${PLUGIN_PATH_DEST})
+ endif()
+ list(APPEND pluginsList ${PLUGIN_PATH_DEST}/${pluginName})
+ endif()
+ else()
+ message(WARNING "You need to provide a valid PLUGIN_PATH_NAME")
+ set(pluginsList )
+ endif()
+ endforeach()
+ set(${results} ${pluginsList} PARENT_SCOPE)
+ endfunction()
+
+ if(inst_run_VERBOSE)
+ list(APPEND extra_flags_to_add VERBOSE)
+ endif()
+ list(APPEND extra_flags_to_add EXEC_PATH ${EXEC_PATH} COMPONENT ${inst_run_COMPONENT}) ## for internal use inside overloaded function
+ list(LENGTH inst_run_PLUGINS inst_run_PLUGINS_count)
+ if(${inst_run_PLUGINS_count} GREATER 0)
+ parse_arguments_multi(PLUGIN_PATH_NAME inst_run_PLUGINS ${inst_run_PLUGINS} ## see internal overload parse_arguments_multi_function for processing each sub-list
+ NEED_RESULTS ${inst_run_PLUGINS_count} ## this is used to check when we are in the first loop (in order to reset parse_arguments_multi_results)
+ EXTRAS_FLAGS ${extra_flags_to_add} ## this is used to allow catching additional internal flags of our overloaded function
+ )
+ endif()
+
+ #message(parse_arguments_multi_results = ${parse_arguments_multi_results})
+ list(APPEND pluginsList ${parse_arguments_multi_results})
+
+
+
+ ## Install rules for required system runtimes such as MSVCRxx.dll
+ set(CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS_SKIP ON)
+ include(InstallRequiredSystemLibraries)
+ if(CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS)
+ install(FILES ${CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS}
+ CONFIGURATIONS ${inst_run_CONFIG_TYPE}
+ DESTINATION ${EXEC_PATH}
+ COMPONENT ${inst_run_COMPONENT}
+ )
+ endif()
+
+ ## print what we are about to do
+ if(inst_run_VERBOSE)
+ message(STATUS "[install_runtime] On install target call, cmake will try to resolve dependencies for the given app:\n ${installedFilePathTargetAppToResolve} (with plausible postfix: ${inst_run_PLAUSIBLES_POSTFIX})")
+ if(pluginsList)
+ message(STATUS " and also for plugins :")
+ foreach(plugin ${pluginsList})
+ message(STATUS " ${plugin}")
+ endforeach()
+ endif()
+ message(STATUS " Looking for dependencies into:")
+ foreach(dir ${dirsToLookFor})
+ message(STATUS " ${dir}")
+ endforeach()
+ endif()
+
+ ## Install rules for required dependencies libs/plugins for the target app
+ ## will resolve all installed target files with config modes postfixes
+ string(TOUPPER ${inst_run_CONFIG_TYPE} inst_run_CONFIG_TYPE_UC)
+ get_target_property(postfix ${inst_run_TARGET} "${inst_run_CONFIG_TYPE_UC}_POSTFIX")
+ install(CODE "set(target \"${inst_run_TARGET}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE "set(inst_run_CONFIG_TYPE \"${inst_run_CONFIG_TYPE}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE "set(inst_run_INSTALL_FOLDER \"${inst_run_INSTALL_FOLDER}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE "set(app \"${EXEC_PATH}/${EXEC_NAME}${postfix}${CMAKE_EXECUTABLE_SUFFIX}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE "set(dirsToLookFor \"${dirsToLookFor}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE
+ [[
+ if("${CMAKE_INSTALL_CONFIG_NAME}" STREQUAL "${inst_run_CONFIG_TYPE}")
+ message(STATUS "Installing ${target} dependencies...")
+
+ file(GET_RUNTIME_DEPENDENCIES
+ EXECUTABLES ${app}
+ RESOLVED_DEPENDENCIES_VAR _r_deps
+ UNRESOLVED_DEPENDENCIES_VAR _u_deps
+ CONFLICTING_DEPENDENCIES_PREFIX _c_deps
+ DIRECTORIES ${dirsToLookFor}
+ PRE_EXCLUDE_REGEXES "api-ms-*"
+ POST_EXCLUDE_REGEXES ".*system32/.*\\.dll" ".*SysWOW64/.*\\.dll"
+ )
+
+ if(_u_deps)
+ message(WARNING "There were unresolved dependencies for executable ${app}: \"${_u_deps}\"!")
+ endif()
+ if(_c_deps_FILENAMES)
+ message(WARNING "There were conflicting dependencies for executable ${app}: \"${_c_deps_FILENAMES}\"!")
+ endif()
+
+ foreach(_file ${_r_deps})
+ file(INSTALL
+ DESTINATION "${inst_run_INSTALL_FOLDER}/bin"
+ TYPE SHARED_LIBRARY
+ FOLLOW_SYMLINK_CHAIN
+ FILES "${_file}"
+ )
+ endforeach()
+ endif()
+ ]]
+ COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE}
+ )
+
+endfunction()
+
+## High level macro to install resources in the correct folder
+##
+## EXECUTABLE: [opt] option to copy files as programs
+## RELATIVE : [opt] copy files relatively to current folder
+## TYPE : [opt] type and folder where to store the files
+## FOLDER : [opt] subfolder to use
+## FILES : [opt] contains a list of resources files to copy to install folder
+macro(ibr_install_rsc target)
+ cmake_parse_arguments(install_rsc_${target} "EXECUTABLE;RELATIVE" "TYPE;FOLDER" "FILES" ${ARGN})
+ set(rsc_target "${target}_${install_rsc_${target}_TYPE}")
+
+ if(install_rsc_${target}_FOLDER)
+ set(rsc_folder "${install_rsc_${target}_TYPE}/${install_rsc_${target}_FOLDER}")
+ else()
+ set(rsc_folder "${install_rsc_${target}_TYPE}")
+ endif()
+
+ add_custom_target(${rsc_target}
+ COMMENT "run the ${install_rsc_${target}_TYPE} installation only for ${target} (component ${rsc_target})"
+ VERBATIM)
+ foreach(scriptFile ${install_rsc_${target}_FILES})
+ if(install_rsc_${target}_RELATIVE)
+ file(RELATIVE_PATH relativeFilename ${CMAKE_CURRENT_SOURCE_DIR} ${scriptFile})
+ else()
+ get_filename_component(relativeFilename ${scriptFile} NAME)
+ endif()
+
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ add_custom_command(TARGET ${rsc_target} POST_BUILD
+ COMMAND ${CMAKE_COMMAND} -E
+ copy_if_different ${scriptFile} ${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/${rsc_folder}/${relativeFilename})
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ add_custom_command(TARGET ${rsc_target} POST_BUILD
+ COMMAND ${CMAKE_COMMAND} -E
+ copy_if_different ${scriptFile} ${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/${rsc_folder}/${relativeFilename})
+ endforeach()
+ endforeach()
+
+ get_target_property(INSTALL_RSC_BUILD_FOLDER ${target} FOLDER)
+ set_target_properties(${rsc_target} PROPERTIES FOLDER ${INSTALL_RSC_BUILD_FOLDER})
+
+ add_dependencies(${target} ${rsc_target})
+ add_dependencies(PREBUILD ${rsc_target})
+
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ resourceFile(ADD ${rsc_folder} CONFIG_TYPE ${CMAKE_BUILD_TYPE} FILE_PATH "${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/ibr_resources.ini")
+
+ if(install_rsc_${target}_EXECUTABLE)
+ install(
+ PROGRAMS ${install_rsc_${target}_FILES}
+ CONFIGURATIONS ${CMAKE_BUILD_TYPE}
+ DESTINATION "${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/${rsc_folder}"
+ )
+ else()
+ install(
+ FILES ${install_rsc_${target}_FILES}
+ CONFIGURATIONS ${CMAKE_BUILD_TYPE}
+ DESTINATION "${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/${rsc_folder}"
+ )
+ endif()
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ resourceFile(ADD ${rsc_folder} CONFIG_TYPE ${CONFIG_TYPES} FILE_PATH "${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/ibr_resources.ini")
+
+ if(install_rsc_${target}_EXECUTABLE)
+ install(
+ PROGRAMS ${install_rsc_${target}_FILES}
+ CONFIGURATIONS ${CONFIG_TYPES}
+ DESTINATION "${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/${rsc_folder}"
+ )
+ else()
+ install(
+ FILES ${install_rsc_${target}_FILES}
+ CONFIGURATIONS ${CONFIG_TYPES}
+ DESTINATION "${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/${rsc_folder}"
+ )
+ endif()
+ endforeach()
+endmacro()
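+
+## Example (hypothetical usage; target and file names are placeholders):
+## ibr_install_rsc(myApp TYPE "shaders" FOLDER "myApp" FILES "shaders/a.vert;shaders/a.frag")
+## ## creates the custom target 'myApp_shaders' copying the files into <install prefix>/shaders/myApp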
+
+
+## High level macro to install all our ibr targets in a homogeneous way (it uses some functions of this file)
+##
+## RSC_FILE_ADD : [opt] used to automatically write/append relative paths of target resources into a common file
+## INSTALL_PDB : [opt] used to automatically install the PDB file (when using MSVC, according to the target type)
+## STANDALONE : [opt] ON/OFF bool to call install_runtime or not (for bundle resolution)
+## DIRS : [opt] used if STANDALONE is set to ON, see the install_runtime doc
+## PLUGINS: [opt] used if STANDALONE is set to ON, see the install_runtime doc
+## MSVC_CMD : [opt] used to specify an absolute filePathName of an application to launch with the MSVC IDE Debugger associated to this target (project file)
+## MSVC_ARGS : [opt] load the MSVC debugger with correct settings (app path, args, working dir)
+##
+macro(ibr_install_target target)
+ cmake_parse_arguments(ibrInst${target} "VERBOSE;INSTALL_PDB" "COMPONENT;MSVC_ARGS;STANDALONE;RSC_FOLDER" "SHADERS;RESOURCES;SCRIPTS;DIRS;PLUGINS" ${ARGN})
+
+ if(ibrInst${target}_RSC_FOLDER)
+ set(rsc_folder "${ibrInst${target}_RSC_FOLDER}")
+ else()
+ set(rsc_folder "${target}")
+ endif()
+
+ if(ibrInst${target}_SHADERS)
+ ibr_install_rsc(${target} EXECUTABLE TYPE "shaders" FOLDER ${rsc_folder} FILES "${ibrInst${target}_SHADERS}")
+ endif()
+
+ if(ibrInst${target}_RESOURCES)
+ ibr_install_rsc(${target} TYPE "resources" FOLDER ${rsc_folder} FILES "${ibrInst${target}_RESOURCES}")
+ endif()
+
+ if(ibrInst${target}_SCRIPTS)
+ ibr_install_rsc(${target} EXECUTABLE TYPE "scripts" FOLDER ${rsc_folder} FILES "${ibrInst${target}_SCRIPTS}")
+ endif()
+
+ if(ibrInst${target}_COMPONENT)
+ set(installCompArg COMPONENT ${ibrInst${target}_COMPONENT})
+ ## Create a custom install target based on COMPONENT
+ installTargetProject(${target} ${ibrInst${target}_COMPONENT})
+ endif()
+
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ set_target_properties(${target} PROPERTIES ${CMAKE_BUILD_TYPE}_POSTFIX "${CMAKE_${CMAKE_BUILD_TYPE}_POSTFIX}")
+ get_target_property(CURRENT_TARGET_BUILD_TYPE_POSTFIX ${target} ${CMAKE_BUILD_TYPE}_POSTFIX)
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ set_target_properties(${target} PROPERTIES ${CONFIG_TYPES_UC}_POSTFIX "${CMAKE_${CONFIG_TYPES_UC}_POSTFIX}")
+ get_target_property(CURRENT_TARGET_BUILD_TYPE_POSTFIX ${target} ${CONFIG_TYPES_UC}_POSTFIX)
+ endforeach()
+
+ ## Specify default installation rules
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ install(TARGETS ${target}
+ CONFIGURATIONS ${CMAKE_BUILD_TYPE}
+ LIBRARY DESTINATION ${CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}} ${installCompArg}
+ ARCHIVE DESTINATION ${CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}} ${installCompArg}
+ RUNTIME DESTINATION ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}} ${installCompArg}
+ )
+ install(TARGETS ${target}
+ CONFIGURATIONS ${CMAKE_BUILD_TYPE}
+ LIBRARY DESTINATION ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}} ${installCompArg}
+ ARCHIVE DESTINATION ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}} ${installCompArg}
+ )
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ install(TARGETS ${target}
+ CONFIGURATIONS ${CONFIG_TYPES}
+ LIBRARY DESTINATION ${CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}} ${installCompArg}
+ ARCHIVE DESTINATION ${CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}} ${installCompArg}
+ RUNTIME DESTINATION ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}} ${installCompArg}
+ )
+ install(TARGETS ${target}
+ CONFIGURATIONS ${CONFIG_TYPES}
+ LIBRARY DESTINATION ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}} ${installCompArg}
+ ARCHIVE DESTINATION ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}} ${installCompArg}
+ )
+ endforeach()
+
+ if(ibrInst${target}_INSTALL_PDB)
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ installPDB(${target} ${CMAKE_BUILD_TYPE}
+ LIBRARY_DEST ${CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}}
+ ARCHIVE_DEST ${CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}}
+ RUNTIME_DEST ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}}
+ )
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ installPDB(${target} ${CONFIG_TYPES}
+ LIBRARY_DEST ${CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}}
+ ARCHIVE_DEST ${CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}}
+ RUNTIME_DEST ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}}
+ )
+ endforeach()
+ endif()
+
+ ## install the necessary dynamic dependencies
+ if(ibrInst${target}_STANDALONE)
+ get_target_property(type ${target} TYPE)
+ if(${type} MATCHES "EXECUTABLE")
+
+ if(ibrInst${target}_VERBOSE)
+ set(VERBOSE VERBOSE)
+ else()
+ set(VERBOSE )
+ endif()
+
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ install_runtime(bin/${target}${CMAKE_EXECUTABLE_SUFFIX} ## default relative to CMAKE_INSTALL_PREFIX
+ INSTALL_FOLDER "${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}"
+ CONFIG_TYPE ${CMAKE_BUILD_TYPE}
+ ${VERBOSE}
+ TARGET ${target}
+ ${installCompArg}
+ PLUGINS ## will be installed
+ ${ibrInst${target}_PLUGINS}
+ DIRS ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}}
+ ${ibrInst${target}_DIRS}
+ )
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ install_runtime(bin/${target}${CMAKE_EXECUTABLE_SUFFIX} ## default relative to CMAKE_INSTALL_PREFIX
+ INSTALL_FOLDER "${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}"
+ CONFIG_TYPE ${CONFIG_TYPES}
+ ${VERBOSE}
+ TARGET ${target}
+ ${installCompArg}
+ PLUGINS ## will be installed
+ ${ibrInst${target}_PLUGINS}
+ DIRS ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}}
+ ${ibrInst${target}_DIRS}
+ )
+ endforeach()
+ else()
+ message(WARNING "The STANDALONE option is only compatible with EXECUTABLE target types. Skipping the STANDALONE installation process.")
+ endif()
+ endif()
+
+ ## Provide a way to directly load the MSVC debugger with correct settings
+ if(MSVC)
+ if(ibrInst${target}_MSVC_CMD) ## the command's absolute filePathName is optional, as the default is to use the installed target application file
+ set(msvcCmdArg COMMAND ${ibrInst${target}_MSVC_CMD}) ## flag followed by its value (both passed to the MSVCsetUserCommand function)
+ endif()
+ if(ibrInst${target}_MSVC_ARGS) ## args (between quotes) are optional
+ set(msvcArgsArg ARGS ${ibrInst${target}_MSVC_ARGS}) ## flag followed by its value (both passed to the MSVCsetUserCommand function)
+ endif()
+ get_target_property(type ${target} TYPE)
+ if( (ibrInst${target}_MSVC_CMD OR ibrInst${target}_MSVC_ARGS) OR (${type} MATCHES "EXECUTABLE") )
+ include(MSVCsetUserCommand)
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ MSVCsetUserCommand( ${target}
+ PATH ${CMAKE_OUTPUT_BIN_${CMAKE_BUILD_TYPE}} ## the FILE option is not necessary since it is deduced from targetName
+ ARGS "${SIBR_PROGRAMARGS}"
+ ${msvcCmdArg}
+ #${msvcArgsArg}
+ WORKING_DIR ${CMAKE_OUTPUT_BIN_${CMAKE_BUILD_TYPE}}
+ )
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ MSVCsetUserCommand( ${target}
+ PATH ${CMAKE_OUTPUT_BIN_${CONFIG_TYPES_UC}} ## the FILE option is not necessary since it is deduced from targetName
+ ARGS "${SIBR_PROGRAMARGS}"
+ ${msvcCmdArg}
+ #${msvcArgsArg}
+ WORKING_DIR ${CMAKE_OUTPUT_BIN_${CONFIG_TYPES_UC}}
+ )
+ endforeach()
+ elseif(NOT ${type} MATCHES "EXECUTABLE")
+ #message("Cannot set MSVCsetUserCommand with target ${target} without COMMAND parameter as it is not an executable (skip it)")
+ endif()
+ endif()
+
+endmacro()
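+
+## Example (hypothetical usage; names and paths are placeholders, flags follow the cmake_parse_arguments call above):
+## ibr_install_target(myApp
+## 	COMPONENT myApp_install
+## 	INSTALL_PDB
+## 	STANDALONE ON
+## 	DIRS ${CMAKE_SOURCE_DIR}/extlibs/bin
+## 	SHADERS ${myApp_SHADERS}
+## 	RSC_FOLDER "myApp"
+## )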
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/parse_arguments_multi.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/parse_arguments_multi.cmake
new file mode 100644
index 0000000..4f19e41
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/parse_arguments_multi.cmake
@@ -0,0 +1,304 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+if(NOT WIN32 OR __parse_arguments_multi_cmake_INCLUDED__)
+ return()
+else()
+ set(__parse_arguments_multi_cmake_INCLUDED__ ON)
+endif()
+
+## This macro allow to process repeating multi value args from a given function which use cmake_parse_arguments module.
+##
+## cmake_parse_arguments multi args standard behavior:
+## function(foo)
+## cmake_parse_arguments(arg "" "" "MULTI" ${ARGN})
+## foreach(item IN LISTS arg_MULTI)
+## message(STATUS "${item}")
+## endforeach()
+## endfunction()
+## foo(MULTI x y MULTI z w)
+## The above code outputs only 'z' and 'w', while one might expect it to output all of 'x', 'y', 'z' and 'w'.
+##
+## Using this macro inside a function that wants to handle repeated multi-value args
+## will recursively iterate over the multi tags list to process each sub list.
+## It takes as 1st argument the subTag flag separating each sub list from the main multi list.
+## It takes as 2nd argument the name of the main multi list (the multiValuesArgs from cmake_parse_arguments: MULTI in the example above),
+## and that is why it has to be a macro and not a function (to get access to outer-scope variables).
+## Then you give the content of this list to be processed by the macro.
+##
+## The parse_arguments_multi macro calls a parse_arguments_multi_function which actually does the processing of the given sub-list.
+## By default this function only prints info about the variables you are trying to pass/process (verbose messages only),
+## but by overloading this cmake function you can externalize the processing of your multi argument list.
+##
+## Usage (inside a function):
+## parse_arguments_multi(<multiArgsSubTag> <multiArgsList> <${multiArgsList}>
+##      [NEED_RESULTS <multiArgsListCount>] [EXTRAS_FLAGS <flag1> <flag2> ...]
+## )
+##
+## Simple usage example [user point of view]:
+## foo(MULTI
+## SUB_MULTI x y
+## SUB_MULTI z w
+## )
+##
+## Simple usage example [inside a function]:
+## function(foo)
+## cmake_parse_arguments(arg "" "" "MULTI" ${ARGN})
+## include(parse_arguments_multi)
+## function(parse_arguments_multi_function )
+## #message("I'm an overloaded cmake function used by parse_arguments_multi")
+## #message("I'm processing first part of my sub list: ${ARGN}")
+## message("ARGV0=${ARGV0}")
+## message("ARGV1=${ARGV1}")
+## endfunction()
+## parse_arguments_multi(SUB_MULTI arg_MULTI ${arg_MULTI}) ## this will recursively process the items of each sub-list [by default it prints messages]
+## endfunction()
+##
+## Will print:
+## ARGV0=z
+## ARGV1=w
+## ARGV0=x
+## ARGV1=y
+##
+## WARNING : NEVER ADD EXTRA ARGUMENTS TO THE parse_arguments_multi MACRO CALL :
+## parse_arguments_multi(SUB_MULTI arg_MULTI ${arg_MULTI} EXTRAS foo bar SOMETHING) => will fail !!
+## use EXTRAS_FLAGS instead !!
+##
+## Advanced usage example [user point of view]:
+## bar(C:/prout/test.exe VERBOSE
+## PLUGINS
+## PLUGIN_PATH_NAME x PLUGIN_PATH_DEST w
+## PLUGIN_PATH_NAME a b PLUGIN_PATH_DEST y
+## PLUGIN_PATH_NAME c
+## )
+##
+## Advanced usage example [inside a function]:
+## function(bar execFilePathName)
+## cmake_parse_arguments(arg "VERBOSE" "" "PLUGINS" ${ARGN})
+##
+## include(parse_arguments_multi)
+## function(parse_arguments_multi_function results)
+## cmake_parse_arguments(pamf "VERBOSE" "PLUGIN_PATH_DEST;EXEC_PATH" "" ${ARGN}) ## EXEC_PATH is for internal use
+## message("")
+## message("I'm an overloaded cmake function used by parse_arguments_multi from install_runtime function")
+## message("I'm processing first part of my sub list: ${ARGN}")
+## message("PLUGIN_PATH_NAME = ${pamf_UNPARSED_ARGUMENTS}")
+## message(pamf_VERBOSE = ${pamf_VERBOSE})
+## message("pamf_PLUGIN_PATH_DEST = ${pamf_PLUGIN_PATH_DEST}")
+## message(pamf_EXEC_PATH = ${pamf_EXEC_PATH})
+## if(NOT ${pamf_PLUGIN_PATH_DEST})
+## set(pamf_PLUGIN_PATH_DEST ${pamf_EXEC_PATH})
+## endif()
+## foreach(plugin ${pamf_UNPARSED_ARGUMENTS})
+## get_filename_component(pluginName ${plugin} NAME)
+## list(APPEND pluginsList ${pamf_PLUGIN_PATH_DEST}/${pluginName})
+## endforeach()
+## set(${results} ${pluginsList} PARENT_SCOPE)
+## endfunction()
+##
+## if(arg_VERBOSE)
+## list(APPEND extra_flags_to_add VERBOSE) ## here we transmit the VERBOSE flag
+## endif()
+## get_filename_component(EXEC_PATH ${execFilePathName} PATH) ## will be the default value if PLUGIN_PATH_DEST option is not provided
+## list(APPEND extra_flags_to_add EXEC_PATH ${EXEC_PATH})
+## list(LENGTH arg_PLUGINS arg_PLUGINS_count)
+## parse_arguments_multi(PLUGIN_PATH_NAME arg_PLUGINS ${arg_PLUGINS}
+## NEED_RESULTS ${arg_PLUGINS_count} ## this is used to check when we are in the first loop (in order to reset parse_arguments_multi_results)
+## EXTRAS_FLAGS ${extra_flags_to_add} ## this is used to allow catching VERBOSE and PLUGIN_PATH_DEST flags of our overloaded function
+## )
+## endfunction()
+## message(parse_arguments_multi_results = ${parse_arguments_multi_results}) ## list of the whole pluginsList
+## #Will print w/x;a/y;b/y;C:/prout/c
+##
+## NOTE that here, since our overloaded function needs to provide a result list, we use the other parse_arguments_multi_function signature (the one with a results arg)
+##
+
+function(parse_arguments_multi_function_default) ## use this if you want to restore the default behavior of this function
+ message("[default function] parse_arguments_multi_function(ARGC=${ARGC} ARGV=${ARGV} ARGN=${ARGN})")
+ message("This function is used by parse_arguments_multi and has to be overloaded to process sub lists of multi-value args")
+endfunction()
+
+function(parse_arguments_multi_function ) ## => the function to overload
+ parse_arguments_multi_function_default(${ARGN})
+endfunction()
+
+## first default signature above
+##------------------------------
+## second results signature behind
+
+function(parse_arguments_multi_function_default result) ## use this if you want to restore the default behavior of this function
+ message("[default function] parse_arguments_multi_function(ARGC=${ARGC} ARGV=${ARGV} ARGN=${ARGN})")
+ message("This function is used by parse_arguments_multi and has to be overloaded to process sub lists of multi-value args")
+endfunction()
+
+function(parse_arguments_multi_function result) ## => the function to overload
+ parse_arguments_multi_function_default(result ${ARGN})
+endfunction()
+
+## => the macro to use inside your function which use cmake_parse_arguments
+# NOTE: entry point of parse_arguments_multi, which is called from win3rdPart)
+macro(parse_arguments_multi multiArgsSubTag multiArgsList #<${multiArgsList}> the content of the list
+)
+ # message (STATUS "")
+ # message(STATUS "calling parse_arguments_multi defined in parse_arguments_multi.cmake:141")
+ # message(STATUS "multiArgsSubTag = ${multiArgsSubTag}") # CHECK_CACHED_VAR
+ # message(STATUS "multiArgsList = ${multiArgsList}") # it contains the name of the variable which is holding the list i.e w3p_MULTI_SET
+ # message(STATUS "value of ${multiArgsList} = ${${multiArgsList}}") # a semicolon separated list of values passed to SET or MULTISET keyword in win3rdParty
+ # message(STATUS "actual values ARGN = ${ARGN}") # the same as ${${multiArgsList}}
+
+ ## INFO
+ ## starting from CMake 3.5 cmake_parse_arguments is not a module anymore and now is a native CMake command.
+ ## the behaviour is different though
+ ## In CMake 3.4, if you pass multiple times a multi_value_keyword, CMake returns the values of the LAST match
+ ## In CMake 3.5 and above, CMake returns the whole list of values that were following that multi_value_keyword
+ ## example:
+ ## cmake_parse_arguments(<prefix>
+ ## "" # options
+ ## "" # one value keywords
+ ## "MY_MULTI_VALUE_TAG" # multi value keywords
+ ## MY_MULTI_VALUE_TAG value1 value2
+ ## MY_MULTI_VALUE_TAG value3 value4
+ ## MY_MULTI_VALUE_TAG value5 value6
+ ## )
+ ## result in CMake 3.4
+ ## <prefix>_MY_MULTI_VALUE_TAG = "value5;value6"
+ ##
+ ## result in CMake 3.8
+ ## <prefix>_MY_MULTI_VALUE_TAG = "value1;value2;value3;value4;value5;value6"
+
+ #include(CMakeParseArguments) # the CMakeParseArguments module is obsolete since cmake 3.5 (cmake_parse_arguments is now a native command)
+ # cmake_parse_arguments(<prefix> <options> <one_value_keywords> <multi_value_keywords> args...)
+ # <options> : options (flags) passed to the macro
+ # <one_value_keywords> : options that need exactly one value
+ # <multi_value_keywords> : options that need more than one value
+ cmake_parse_arguments(_pam "" "NEED_RESULTS" "${multiArgsSubTag};EXTRAS_FLAGS" ${ARGN})
+
+ ## multiArgsList is the name of the list used by the multiValuesOption flag from the cmake_parse_arguments of the user function;
+ ## that's why we absolutely need to use a MACRO here (and also for passing parse_arguments_multi_results when the NEED_RESULTS flag is set)
+
+ ## for debugging
+ #message("")
+ #message("[parse_arguments_multi] => ARGN = ${ARGN}")
+ #message("_pam_NEED_RESULTS=${_pam_NEED_RESULTS}")
+ #message("_pam_EXTRAS_FLAGS=${_pam_EXTRAS_FLAGS}")
+ # foreach(var ${_pam_${multiArgsSubTag}})
+ # message("arg=${var}")
+ # endforeach()
+
+ if (${CMAKE_VERSION} VERSION_GREATER "3.4") ## i.e. 3.5 and above, where the behavior changed
+ # lets make ${_pam_${multiArgsSubTag}} behave as it did in version 3.4
+ # that means, cmake_parse_arguments should keep only the last set of values for a given multi-value keyword
+
+ # message("")
+ # message("values in multiArgsList")
+ # foreach(val ${${multiArgsList}})
+ # message(STATUS ${val})
+ # endforeach()
+ # message("end values in multiArgsList")
+
+
+ set(lastIndexFound OFF)
+ list(LENGTH ${multiArgsList} argnLength)
+ # message(${argnLength})
+ math(EXPR argnLength "${argnLength}-1") # make last index a valid one
+ set(recordIndex 0)
+ set(records "") # clear records list
+ set(record0 "") # clear first record list
+ foreach(iter RANGE ${argnLength})
+ list(GET ${multiArgsList} ${iter} value)
+ # message(STATUS "index=${iter} value=${value}")
+ if (${value} STREQUAL ${multiArgsSubTag})
+ if (lastIndexFound)
+ list(APPEND records ${recordIndex}) # records store the list NAMES
+ math(EXPR recordIndex "${recordIndex}+1")
+ set(record${recordIndex} "") # clear record list
+ else ()
+ set(lastIndexFound ON)
+ endif()
+
+ set(lastIndex ${iter})
+ else ()
+ if (lastIndexFound)
+ # message(${value})
+ list(APPEND record${recordIndex} ${value})
+ endif()
+ endif()
+ endforeach()
+
+ # save the last list of values
+ if (lastIndexFound)
+ list(APPEND records ${recordIndex}) # records store the list NAMES
+ endif()
+
+ # set multiArgsList to make it behave like CMake 3.4
+ # message("")
+ # message("using my records")
+ foreach(recordName ${records})
+ # message(${recordName})
+ # foreach(value ${record${recordName}})
+ # message(${value})
+ # endforeach()
+ # message("")
+ set(_pam_${multiArgsSubTag} ${record${recordName}})
+ endforeach()
+ # message(${_pam_${multiArgsSubTag}})
+
+ # message("")
+ # message("using argn")
+ # foreach(value ${ARGN})
+ # message(${value})
+ # endforeach()
+ endif() # end if cmake > 3.5
+
+ # message("values with pam ${_pam_${multiArgsSubTag}}")
+
+ ## check and init
+ list(LENGTH ${multiArgsList} globalListCount) # GLUT_TRACE: globalListCount=16 in CMake3.4 and CMake3.8
+ # message(STATUS "nr items in multiArgsList: ${globalListCount}")
+ math(EXPR globalListCount "${globalListCount}-1") ## because it will contain [multiArgsSubTag + ${multiArgsList}]
+ if(_pam_NEED_RESULTS)
+ if(${globalListCount} EQUAL ${_pam_NEED_RESULTS})
+ ## first time we enter into this macro (because we call it recursively)
+ unset(parse_arguments_multi_results)
+ endif()
+ endif()
+
+ ## process this part of the multi args list
+ ## ${ARGN} shouldn't be passed to the function in order to avoid a size mismatch between the ${multiArgsList} list and _pam_${multiArgsSubTag}
+ ## if you want to pass extra internal flags from your function to this callback, use EXTRAS_FLAGS
+ if(_pam_NEED_RESULTS)
+ parse_arguments_multi_function(parse_arguments_multi_function_result ${_pam_${multiArgsSubTag}} ${_pam_EXTRAS_FLAGS})
+ list(APPEND parse_arguments_multi_results ${parse_arguments_multi_function_result})
+ else()
+ # message(STATUS "about to call parse_arguments_multi_function in parse_arguments_multi.cmake:177 ${_pam_${multiArgsSubTag}} and extra flags ${_pam_EXTRAS_FLAGS}")
+ parse_arguments_multi_function(${_pam_${multiArgsSubTag}} ${_pam_EXTRAS_FLAGS})
+ endif()
+
+ ## remove just processed items from the main list to process (multiArgsList)
+ list(REVERSE ${multiArgsList})
+ list(LENGTH _pam_${multiArgsSubTag} subTagListCount)
+ unset(ids)
+ foreach(id RANGE ${subTagListCount})
+ list(APPEND ids ${id})
+ endforeach()
+ list(REMOVE_AT ${multiArgsList} ${ids})
+ list(REVERSE ${multiArgsList})
+
+ ## test whether sub multi lists remain to be processed (recursive call) or finish the process
+ list(LENGTH ${multiArgsList} mainTagListCount)
+ if(${mainTagListCount} GREATER 1)
+ ## do not pass ${ARGN} because it would re-pass the initial 2 input args, which we don't want since they were already consumed (in order to avoid conflicts)
+ # message(STATUS "about to call a parse_arguments_multi but without knowing where the definition is going to be taken from")
+ parse_arguments_multi(${multiArgsSubTag} ${multiArgsList} ${${multiArgsList}}
+ NEED_RESULTS ${_pam_NEED_RESULTS} EXTRAS_FLAGS ${_pam_EXTRAS_FLAGS}
+ )
+ endif()
+endmacro()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/sibr_library.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/sibr_library.cmake
new file mode 100644
index 0000000..08a30ad
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/linux/sibr_library.cmake
@@ -0,0 +1,174 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+# NOTE
+# This feature is used to easily download, store and link external dependencies. It
+# requires preparing pre-compiled libraries (to download). For now, packages have
+# only been prepared for Windows 64-bit with Visual Studio 2012. (You should re-build
+# everything if you want to use another version of Visual Studio / another compiler).
+
+# NOTE ABOUT UNIX SYSTEMS
+# There is no need for a "searching mechanism". This function is discarded and your
+# libraries should be installed in the standard folders, that are:
+#
+# /usr/include/
+# /usr/lib/
+# /usr/lib64/
+# for packages downloaded using apt-get/yum
+#
+# /usr/local/include/
+# /usr/local/lib/
+# /usr/local/lib64/
+# for packages manually installed ("make install")
+#
+# if you encounter problems when linking (e.g. lib not found even if it is installed),
+# please check these folders are in your search PATH environment variables.
+
+set(EXTLIBS_PACKAGE_FOLDER "${CMAKE_SOURCE_DIR}/extlibs")
+
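+## sibr_addlibrary: helper to declare a prebuilt external dependency (Windows only;
+## the function is a no-op on other platforms).
+## The usage sketch below is illustrative: the library name and archive URLs are
+## hypothetical placeholders, not real prebuilt packages.
+##
+## sibr_addlibrary(NAME libraw
+##     MSVC14 "https://example.org/prebuilt/libraw_msvc14.7z"   ## hypothetical URL
+##     MSVC17 "https://example.org/prebuilt/libraw_msvc17.7z"   ## hypothetical URL
+## )
+##
+## After the call, the include/ and lib[64]/bin/ subfolders found under
+## extlibs/libraw/* are appended to the include and link directories.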
+function(sibr_addlibrary)
+ if(NOT WIN32)
+ return()
+ endif()
+
+ file(MAKE_DIRECTORY ${EXTLIBS_PACKAGE_FOLDER})
+ cmake_parse_arguments(args "VCID" "VERBOSE;TIMEOUT;DEFAULT_USE;NAME;VERSION;MSVC11;MSVC12;MSVC14;MSVC17" "MULTI_SET;SET" ${ARGN})
+
+
+ if (NOT "${args_VERSION}" STREQUAL "") ## MATCHES "" is always true (empty regex), so use STREQUAL to detect a provided VERSION
+ message(WARNING "VERSION is not implemented yet")
+ endif()
+
+ set(lcname "")
+ set(ucname "")
+ string(TOLOWER "${args_NAME}" lcname)
+ string(TOUPPER "${args_NAME}" ucname)
+
+ set(LIB_PACKAGE_FOLDER "${EXTLIBS_PACKAGE_FOLDER}/${lcname}")
+ win3rdParty(${ucname}
+ $
+ VERBOSE ${args_VERBOSE}
+ TIMEOUT ${args_TIMEOUT}
+ DEFAULT_USE ${args_DEFAULT_USE}
+ MSVC11 "${LIB_PACKAGE_FOLDER}" "${args_MSVC11}"
+ MSVC12 "${LIB_PACKAGE_FOLDER}" "${args_MSVC12}"
+ MSVC14 "${LIB_PACKAGE_FOLDER}" "${args_MSVC14}" # TODO SV: make sure to build this library if required
+ MSVC17 "${LIB_PACKAGE_FOLDER}" "${args_MSVC17}"
+ SET ${args_SET}
+ MULTI_SET ${args_MULTI_SET}
+ )
+
+ # Add include/ directory
+ # and lib/ directories
+
+ # TODO SV: paths not matching with current hierarchy. example: libraw/libraw-0.17.1/include
+ # SR: The link directories will also be used to lookup for dependency DLLs to copy in the install directory.
+ # Some libraries put the DLLs in the bin/ directory, so we include those.
+ file(GLOB subdirs RELATIVE ${LIB_PACKAGE_FOLDER} ${LIB_PACKAGE_FOLDER}/*)
+ set(dirlist "")
+ foreach(dir ${subdirs})
+ if(IS_DIRECTORY ${LIB_PACKAGE_FOLDER}/${dir})
+ # message("adding ${LIB_PACKAGE_FOLDER}/${dir}/include/ to the include directories")
+ include_directories("${LIB_PACKAGE_FOLDER}/${dir}/include/")
+ # message("adding ${LIB_PACKAGE_FOLDER}/${dir}/lib[64] to the link directories")
+ link_directories("${LIB_PACKAGE_FOLDER}/${dir}/")
+ link_directories("${LIB_PACKAGE_FOLDER}/${dir}/lib/")
+ link_directories("${LIB_PACKAGE_FOLDER}/${dir}/lib64/")
+ link_directories("${LIB_PACKAGE_FOLDER}/${dir}/bin/")
+ endif()
+ endforeach()
+
+endfunction()
+
+include(FetchContent)
+include(git_describe)
+include(install_runtime)
+
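+## sibr_gitlibrary: fetch a dependency from git (via FetchContent), add it with
+## add_subdirectory and expose <target>_INCLUDE_DIRS.
+## Usage sketch (the repository URL and tag below are illustrative examples):
+##
+## sibr_gitlibrary(TARGET nanoflann
+##     GIT_REPOSITORY "https://github.com/jlblancoc/nanoflann.git"
+##     GIT_TAG "v1.4.3"
+##     INCLUDE_DIRS "include"
+## )
+##
+## This clones into extlibs/nanoflann/nanoflann, builds it as part of the project,
+## and appends the listed INCLUDE_DIRS to nanoflann_INCLUDE_DIRS.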
+function(sibr_gitlibrary)
+ cmake_parse_arguments(args "" "TARGET;GIT_REPOSITORY;GIT_TAG;ROOT_DIR;SOURCE_DIR" "INCLUDE_DIRS" ${ARGN})
+ if(NOT args_TARGET)
+ message(FATAL_ERROR "Error on sibr_gitlibrary : please define your target name.")
+ return()
+ endif()
+
+ if(NOT args_ROOT_DIR)
+ set(args_ROOT_DIR ${args_TARGET})
+ endif()
+
+ if(NOT args_SOURCE_DIR)
+ set(args_SOURCE_DIR ${args_TARGET})
+ endif()
+
+ if(args_GIT_REPOSITORY AND args_GIT_TAG)
+ if(EXISTS ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR}/.git)
+ git_describe(
+ PATH ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR}
+ GIT_URL SIBR_GITLIBRARY_URL
+ GIT_BRANCH SIBR_GITLIBRARY_BRANCH
+ GIT_COMMIT_HASH SIBR_GITLIBRARY_COMMIT_HASH
+ GIT_TAG SIBR_GITLIBRARY_TAG
+ )
+
+ if((SIBR_GITLIBRARY_URL STREQUAL args_GIT_REPOSITORY) AND
+ ((SIBR_GITLIBRARY_BRANCH STREQUAL args_GIT_TAG) OR
+ (SIBR_GITLIBRARY_TAG STREQUAL args_GIT_TAG) OR
+ (SIBR_GITLIBRARY_COMMIT_HASH STREQUAL args_GIT_TAG)))
+ message(STATUS "Library ${args_TARGET} already available, skipping.")
+ set(SIBR_GITLIBRARY_DECLARED ON)
+ else()
+ message(STATUS "Adding library ${args_TARGET} from git...")
+ endif()
+ endif()
+
+ FetchContent_Declare(${args_TARGET}
+ GIT_REPOSITORY ${args_GIT_REPOSITORY}
+ GIT_TAG ${args_GIT_TAG}
+ GIT_SHALLOW ON
+ SOURCE_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR}
+ SUBBUILD_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/subbuild
+ BINARY_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/build
+ )
+ FetchContent_GetProperties(${args_TARGET})
+ string(TOLOWER "${args_TARGET}" lcTargetName) ## FetchContent properties use the lowercased target name
+
+ if((NOT SIBR_GITLIBRARY_DECLARED) AND (NOT ${lcTargetName}_POPULATED))
+ message(STATUS "Populating library ${args_TARGET}...")
+ FetchContent_Populate(${args_TARGET} QUIET
+ GIT_REPOSITORY ${args_GIT_REPOSITORY}
+ GIT_TAG ${args_GIT_TAG}
+ SOURCE_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR}
+ SUBBUILD_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/subbuild
+ BINARY_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/build
+ )
+ endif()
+
+ add_subdirectory(${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR} ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/build)
+
+ get_target_property(type ${args_TARGET} TYPE)
+ if(NOT (type STREQUAL "INTERFACE_LIBRARY"))
+ set_target_properties(${args_TARGET} PROPERTIES FOLDER "extlibs")
+
+ ibr_install_target(${args_TARGET}
+ COMPONENT ${args_TARGET}_install ## will create custom target to install only this project
+ )
+ endif()
+
+ list(APPEND ${args_TARGET}_INCLUDE_DIRS ${EXTLIBS_PACKAGE_FOLDER}/${args_ROOT_DIR})
+ list(APPEND ${args_TARGET}_INCLUDE_DIRS ${EXTLIBS_PACKAGE_FOLDER}/${args_ROOT_DIR}/${args_SOURCE_DIR})
+
+ foreach(args_INCLUDE_DIR ${args_INCLUDE_DIRS})
+ list(APPEND ${args_TARGET}_INCLUDE_DIRS ${EXTLIBS_PACKAGE_FOLDER}/${args_ROOT_DIR}/${args_SOURCE_DIR}/${args_INCLUDE_DIR})
+ endforeach()
+
+ include_directories(${${args_TARGET}_INCLUDE_DIRS})
+ else()
+ message(FATAL_ERROR "Error on sibr_gitlibrary for target ${args_TARGET}: missing git tag or git url.")
+ endif()
+endfunction()
\ No newline at end of file
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/MSVCsetUserCommand.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/MSVCsetUserCommand.cmake
new file mode 100644
index 0000000..bc49770
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/MSVCsetUserCommand.cmake
@@ -0,0 +1,149 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+if(__MSVCsetUserCommand_cmake_INCLUDED__)
+ return()
+else()
+ set(__MSVCsetUserCommand_cmake_INCLUDED__ ON)
+endif()
+
+## Allows configuring the Debugger settings of Visual Studio
+## Note: using this command under linux doesn't affect anything
+## On "run Debug Windows local", Visual Studio will try to launch the specified COMMAND with ARGS in the provided WORKING_DIR
+##
+## usage:
+## MSVCsetUserCommand(<targetName>
+##		[COMMAND <externalAppPathName> | [PATH <dirPath> [FILE <fileName>]] ]
+##		ARGS <arguments>
+##		WORKING_DIR <workingDirPath>
+## )
+##
+## Warning 1 : all argument values must be passed in quotes
+## Warning 2 : the WORKING_DIR path argument has to end with a trailing slash '/'
+## Warning 3 : use COMMAND for an external app, OR the PATH (optionally with FILE) option(s) to point to your built/installed/moved target
+##
+## Example 1:
+## include(MSVCsetUserCommand)
+## MSVCsetUserCommand( UnityRenderingPlugin
+## COMMAND "C:/Program Files (x86)/Unity/Editor/Unity.exe"
+## ARGS "-force-opengl -projectPath \"${CMAKE_HOME_DIRECTORY}/UnityPlugins/RenderingPluginExample/UnityProject\""
+## WORKING_DIR "${CMAKE_HOME_DIRECTORY}/UnityPlugins/RenderingPluginExample/UnityProject"
+## VERBOSE
+## )
+##
+## Example 2:
+## include(MSVCsetUserCommand)
+## MSVCsetUserCommand( ibrApp
+## PATH "C:/Program Files (x86)/workspace/IBR/install"
+## FILE "ibrApp${CMAKE_EXECUTABLE_SUFFIX}" ## this option line is optional since the target name didn't change between build and install step
+## ARGS "-path \"${CMAKE_HOME_DIRECTORY}/dataset\""
+## WORKING_DIR "${CMAKE_HOME_DIRECTORY}"
+## VERBOSE
+## )
+##
+function(MSVCsetUserCommand targetName)
+ cmake_parse_arguments(MSVCsuc "VERBOSE" "PATH;FILE;COMMAND;ARGS;WORKING_DIR" "" ${ARGN} )
+
+ ## If no arguments are given, do not create an unnecessary .vcxproj.user file
+ set(MSVCsuc_DEFAULT ON) ## flipped OFF below as soon as any option is provided
+
+ if(MSVCsuc_PATH AND MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ endif()
+
+ if(MSVCsuc_FILE AND MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ endif()
+
+ if(NOT MSVCsuc_COMMAND)
+ if(MSVCsuc_PATH AND MSVCsuc_FILE)
+ set(MSVCsuc_COMMAND "${MSVCsuc_PATH}\\${MSVCsuc_FILE}")
+ elseif(MSVCsuc_PATH)
+ set(MSVCsuc_COMMAND "${MSVCsuc_PATH}\\$(TargetFileName)")
+ else()
+ set(MSVCsuc_COMMAND "$(TargetPath)") ## => $(TargetDir)\$(TargetName)$(TargetExt)
+ endif()
+ elseif(MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ endif()
+
+ # NOTE: there was a typo here. there is an else if written after else statement
+ # changing the order of the else if statement
+ if(MSVCsuc_WORKING_DIR)
+ file(TO_NATIVE_PATH ${MSVCsuc_WORKING_DIR} MSVCsuc_WORKING_DIR)
+ elseif(MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ else()
+ set(MSVCsuc_WORKING_DIR "$(ProjectDir)")
+ endif()
+
+ if(NOT MSVCsuc_ARGS)
+ set(MSVCsuc_ARGS "")
+ elseif(MSVCsuc_DEFAULT)
+ set(MSVCsuc_DEFAULT OFF)
+ endif()
+
+ if(MSVC10 OR (MSVC AND MSVC_VERSION GREATER 1600)) # 2010 or newer
+
+ if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(PLATEFORM_BITS x64)
+ else()
+ set(PLATEFORM_BITS Win32)
+ endif()
+
+ if(NOT MSVCsuc_DEFAULT AND PLATEFORM_BITS)
+
+ file(WRITE "${CMAKE_CURRENT_BINARY_DIR}/${targetName}.vcxproj.user"
+ "<?xml version=\"1.0\" encoding=\"utf-8\"?>
+<Project ToolsVersion=\"4.0\" xmlns=\"http://schemas.microsoft.com/developer/msbuild/2003\">
+  <PropertyGroup Condition=\"'$(Configuration)|$(Platform)'=='Debug|${PLATEFORM_BITS}'\">
+    <LocalDebuggerCommand>${MSVCsuc_COMMAND}</LocalDebuggerCommand>
+    <LocalDebuggerCommandArguments>${MSVCsuc_ARGS}</LocalDebuggerCommandArguments>
+    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
+    <LocalDebuggerWorkingDirectory>${MSVCsuc_WORKING_DIR}</LocalDebuggerWorkingDirectory>
+  </PropertyGroup>
+  <PropertyGroup Condition=\"'$(Configuration)|$(Platform)'=='Release|${PLATEFORM_BITS}'\">
+    <LocalDebuggerCommand>${MSVCsuc_COMMAND}</LocalDebuggerCommand>
+    <LocalDebuggerCommandArguments>${MSVCsuc_ARGS}</LocalDebuggerCommandArguments>
+    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
+    <LocalDebuggerWorkingDirectory>${MSVCsuc_WORKING_DIR}</LocalDebuggerWorkingDirectory>
+  </PropertyGroup>
+  <PropertyGroup Condition=\"'$(Configuration)|$(Platform)'=='MinSizeRel|${PLATEFORM_BITS}'\">
+    <LocalDebuggerCommand>${MSVCsuc_COMMAND}</LocalDebuggerCommand>
+    <LocalDebuggerCommandArguments>${MSVCsuc_ARGS}</LocalDebuggerCommandArguments>
+    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
+    <LocalDebuggerWorkingDirectory>${MSVCsuc_WORKING_DIR}</LocalDebuggerWorkingDirectory>
+  </PropertyGroup>
+  <PropertyGroup Condition=\"'$(Configuration)|$(Platform)'=='RelWithDebInfo|${PLATEFORM_BITS}'\">
+    <LocalDebuggerCommand>${MSVCsuc_COMMAND}</LocalDebuggerCommand>
+    <LocalDebuggerCommandArguments>${MSVCsuc_ARGS}</LocalDebuggerCommandArguments>
+    <DebuggerFlavor>WindowsLocalDebugger</DebuggerFlavor>
+    <LocalDebuggerWorkingDirectory>${MSVCsuc_WORKING_DIR}</LocalDebuggerWorkingDirectory>
+  </PropertyGroup>
+</Project>
+"
+ )
+ if(MSVCsuc_VERBOSE)
+ message(STATUS "[MSVCsetUserCommand] Write ${CMAKE_CURRENT_BINARY_DIR}/${targetName}.vcxproj.user file")
+ message(STATUS " to execute ${MSVCsuc_COMMAND} ${MSVCsuc_ARGS}")
+ message(STATUS " from directory ${MSVCsuc_WORKING_DIR}")
+ message(STATUS " on visual studio run debugger button")
+ endif()
+
+ else()
+ message(WARNING "PLATEFORM_BITS is undefined...")
+ endif()
+
+ else()
+ if(MSVCsuc_VERBOSE)
+ message(WARNING "MSVCsetUserCommand is disabled because the MSVC version used is too old (MSVC10 / VS2010 or newer is required)")
+ endif()
+ endif()
+
+endfunction()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindASSIMP.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindASSIMP.cmake
new file mode 100644
index 0000000..f92c8c0
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindASSIMP.cmake
@@ -0,0 +1,104 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## Try to find the ASSIMP library
+## Once done this will define
+##
+## ASSIMP_FOUND - system has ASSIMP
+## ASSIMP_INCLUDE_DIR - The ASSIMP include directory
+## ASSIMP_LIBRARIES - The libraries needed to use ASSIMP
+## ASSIMP_CMD - the full path of ASSIMP executable
+## ASSIMP_DYNAMIC_LIB - the Assimp dynamic lib (available only on windows as .dll file for the moment)
+##
+## Edited for using a bugfixed version of Assimp
+
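+## Typical usage (sketch; the install path and target name are hypothetical examples):
+##   set(ASSIMP_DIR "C:/SDKs/Assimp")   # or set the ASSIMP_DIR environment variable
+##   find_package(ASSIMP)
+##   if(ASSIMP_FOUND)
+##       include_directories(${ASSIMP_INCLUDE_DIR})
+##       target_link_libraries(myApp ${ASSIMP_LIBRARIES})   # myApp is a placeholder target
+##   endif()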
+if(NOT ASSIMP_DIR)
+ set(ASSIMP_DIR "$ENV{ASSIMP_DIR}" CACHE PATH "ASSIMP root directory")
+endif()
+if(ASSIMP_DIR)
+ file(TO_CMAKE_PATH ${ASSIMP_DIR} ASSIMP_DIR)
+endif()
+
+
+## set the LIB POSTFIX to find in a right directory according to what kind of compiler we use (32/64bits)
+if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(ASSIMP_SEARCH_LIB "lib64")
+ set(ASSIMP_SEARCH_BIN "bin64")
+ set(ASSIMP_SEARCH_LIB_PATHSUFFIXE "x64")
+else()
+ set(ASSIMP_SEARCH_LIB "lib32")
+ set(ASSIMP_SEARCH_BIN "bin32")
+ set(ASSIMP_SEARCH_LIB_PATHSUFFIXE "x86")
+endif()
+
+set(PROGRAMFILESx86 "PROGRAMFILES(x86)")
+
+
+FIND_PATH(ASSIMP_INCLUDE_DIR
+ NAMES assimp/config.h
+ PATHS
+ ${ASSIMP_DIR}
+ ## linux
+ /usr
+ /usr/local
+ /opt/local
+ ## windows
+ "$ENV{PROGRAMFILES}/Assimp"
+ "$ENV{${PROGRAMFILESx86}}/Assimp"
+ "$ENV{ProgramW6432}/Assimp"
+ PATH_SUFFIXES include
+)
+
+
+FIND_LIBRARY(ASSIMP_LIBRARY
+ NAMES assimp-vc140-mt
+ PATHS
+ ${ASSIMP_DIR}/${ASSIMP_SEARCH_LIB}
+ ${ASSIMP_DIR}/lib
+ ${ASSIMP_DIR}/lib64
+ ## linux
+ /usr/${ASSIMP_SEARCH_LIB}
+ /usr/local/${ASSIMP_SEARCH_LIB}
+ /opt/local/${ASSIMP_SEARCH_LIB}
+ /usr/lib
+ /usr/local/lib
+ /opt/local/lib
+ ## windows
+ "$ENV{PROGRAMFILES}/Assimp/${ASSIMP_SEARCH_LIB}"
+ "$ENV{${PROGRAMFILESx86}}/Assimp/${ASSIMP_SEARCH_LIB}"
+ "$ENV{ProgramW6432}/Assimp/${ASSIMP_SEARCH_LIB}"
+ "$ENV{PROGRAMFILES}/Assimp/lib"
+ "$ENV{${PROGRAMFILESx86}}/Assimp/lib"
+ "$ENV{ProgramW6432}/Assimp/lib"
+ PATH_SUFFIXES ${ASSIMP_SEARCH_LIB_PATHSUFFIXE}
+)
+set(ASSIMP_LIBRARIES ${ASSIMP_LIBRARY})
+
+
+if(ASSIMP_LIBRARY)
+ get_filename_component(ASSIMP_LIBRARY_DIR ${ASSIMP_LIBRARY} PATH)
+ file(GLOB ASSIMP_DYNAMIC_LIB "${ASSIMP_LIBRARY_DIR}/assimp*.dll")
+ if(NOT ASSIMP_DYNAMIC_LIB)
+ message("ASSIMP_DYNAMIC_LIB is missing... at ${ASSIMP_LIBRARY_DIR}")
+ endif()
+ set(ASSIMP_DYNAMIC_LIB ${ASSIMP_DYNAMIC_LIB} CACHE PATH "Windows dll location")
+endif()
+
+MARK_AS_ADVANCED(ASSIMP_DYNAMIC_LIB ASSIMP_INCLUDE_DIR ASSIMP_LIBRARIES)
+
+INCLUDE(FindPackageHandleStandardArgs)
+FIND_PACKAGE_HANDLE_STANDARD_ARGS(ASSIMP
+ REQUIRED_VARS ASSIMP_INCLUDE_DIR ASSIMP_LIBRARIES
+ FAIL_MESSAGE "ASSIMP wasn't found correctly. Set ASSIMP_DIR to the root SDK installation directory."
+)
+
+if(NOT ASSIMP_FOUND)
+ set(ASSIMP_DIR "" CACHE STRING "Path to ASSIMP install directory")
+endif()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindEmbree.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindEmbree.cmake
new file mode 100644
index 0000000..27908b5
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindEmbree.cmake
@@ -0,0 +1,95 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## Important Note:
+## This is not an official Find*cmake. It has been written for searching through
+## a custom path (EMBREE_DIR) before checking elsewhere.
+##
+## FindEMBREE.cmake
+## Find EMBREE's includes and library
+##
+## This module defines :
+## [in] EMBREE_DIR, The base directory to search for EMBREE (as cmake var or env var)
+## [out] EMBREE_INCLUDE_DIR where to find EMBREE.h
+## [out] EMBREE_LIBRARIES, EMBREE_LIBRARY, libraries to link against to use EMBREE
+## [out] EMBREE_FOUND, If false, do not try to use EMBREE.
+##
+
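+## Typical usage (sketch; the install path and target name are hypothetical examples):
+##   set(EMBREE_DIR "C:/SDKs/embree3")  # or set the EMBREE_DIR environment variable
+##   find_package(Embree)
+##   if(EMBREE_FOUND)
+##       include_directories(${EMBREE_INCLUDE_DIR})
+##       target_link_libraries(myApp ${EMBREE_LIBRARIES})   # myApp is a placeholder target
+##   endif()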
+
+if(NOT EMBREE_DIR)
+ set(EMBREE_DIR "$ENV{EMBREE_DIR}" CACHE PATH "EMBREE root directory")
+endif()
+if(EMBREE_DIR)
+ file(TO_CMAKE_PATH ${EMBREE_DIR} EMBREE_DIR)
+endif()
+
+
+## set the LIB POSTFIX to find in a right directory according to what kind of compiler we use (32/64bits)
+if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(EMBREE_SEARCH_LIB "lib64")
+ set(EMBREE_SEARCH_BIN "bin64")
+ set(EMBREE_SEARCH_LIB_PATHSUFFIXE "x64")
+else()
+ set(EMBREE_SEARCH_LIB "lib32")
+ set(EMBREE_SEARCH_BIN "bin32")
+ set(EMBREE_SEARCH_LIB_PATHSUFFIXE "x86")
+endif()
+
+set(PROGRAMFILESx86 "PROGRAMFILES(x86)")
+
+FIND_PATH(EMBREE_INCLUDE_DIR
+ NAMES embree3/rtcore_geometry.h
+ PATHS
+ ${EMBREE_DIR}
+ ## linux
+ /usr
+ /usr/local
+ /opt/local
+ ## windows
+ "$ENV{PROGRAMFILES}/EMBREE"
+ "$ENV{${PROGRAMFILESx86}}/EMBREE"
+ "$ENV{ProgramW6432}/EMBREE"
+ PATH_SUFFIXES include
+)
+
+FIND_LIBRARY(EMBREE_LIBRARY
+ NAMES embree3
+ PATHS
+ ${EMBREE_DIR}/${EMBREE_SEARCH_LIB}
+ ${EMBREE_DIR}/lib
+ ## linux
+ /usr/${EMBREE_SEARCH_LIB}
+ /usr/local/${EMBREE_SEARCH_LIB}
+ /opt/local/${EMBREE_SEARCH_LIB}
+ /usr/lib
+ /usr/local/lib
+ /opt/local/lib
+ ## windows
+ "$ENV{PROGRAMFILES}/EMBREE/${EMBREE_SEARCH_LIB}"
+ "$ENV{${PROGRAMFILESx86}}/EMBREE/${EMBREE_SEARCH_LIB}"
+ "$ENV{ProgramW6432}/EMBREE/${EMBREE_SEARCH_LIB}"
+ "$ENV{PROGRAMFILES}/EMBREE/lib"
+ "$ENV{${PROGRAMFILESx86}}/EMBREE/lib"
+ "$ENV{ProgramW6432}/EMBREE/lib"
+ PATH_SUFFIXES ${EMBREE_SEARCH_LIB_PATHSUFFIXE}
+)
+set(EMBREE_LIBRARIES ${EMBREE_LIBRARY})
+
+MARK_AS_ADVANCED(EMBREE_INCLUDE_DIR EMBREE_LIBRARIES)
+
+INCLUDE(FindPackageHandleStandardArgs)
+FIND_PACKAGE_HANDLE_STANDARD_ARGS(EMBREE
+ REQUIRED_VARS EMBREE_INCLUDE_DIR EMBREE_LIBRARIES
+ FAIL_MESSAGE "EMBREE wasn't found correctly. Set EMBREE_DIR to the root SDK installation directory."
+)
+
+if(NOT EMBREE_FOUND)
+ set(EMBREE_DIR "" CACHE STRING "Path to EMBREE install directory")
+endif()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindFFmpeg.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindFFmpeg.cmake
new file mode 100644
index 0000000..5b208b6
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Modules/FindFFmpeg.cmake
@@ -0,0 +1,104 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## Try to find the FFMPEG library
+## Once done this will define
+##
+## FFMPEG_FOUND - system has FFmpeg
+## FFMPEG_INCLUDE_DIR - The FFmpeg include directory
+## FFMPEG_LIBRARIES - The libraries needed to use FFmpeg
+## FFMPEG_DYNAMIC_LIBS - DLLs for windows
+
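+## Typical usage (sketch; the install path and target name are hypothetical examples):
+##   set(FFMPEG_DIR "C:/SDKs/ffmpeg")   # or set the FFMPEG_DIR environment variable
+##   find_package(FFmpeg)
+##   if(FFMPEG_FOUND)
+##       include_directories(${FFMPEG_INCLUDE_DIR})
+##       target_link_libraries(myApp ${FFMPEG_LIBRARIES})   # myApp is a placeholder target
+##   endif()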
+
+if(NOT FFMPEG_DIR)
+ set(FFMPEG_DIR "$ENV{FFMPEG_DIR}" CACHE PATH "FFMPEG_DIR root directory")
+endif()
+
+if(FFMPEG_DIR)
+ file(TO_CMAKE_PATH ${FFMPEG_DIR} FFMPEG_DIR)
+endif()
+
+MACRO(FFMPEG_FIND varname shortname headername)
+
+ # Path to include dirs
+ FIND_PATH(FFMPEG_${varname}_INCLUDE_DIRS
+ NAMES "lib${shortname}/${headername}"
+ PATHS
+ "${FFMPEG_DIR}/include" # modify this to adapt according to OS/compiler
+ )
+
+ #Add libraries
+ IF(${FFMPEG_${varname}_INCLUDE_DIRS} STREQUAL "FFMPEG_${varname}_INCLUDE_DIRS-NOTFOUND")
+ MESSAGE(STATUS "Can't find includes for ${shortname}...")
+ ELSE()
+ FIND_LIBRARY(FFMPEG_${varname}_LIBRARIES
+ NAMES ${shortname}
+ PATHS
+ ${FFMPEG_DIR}/lib
+ )
+
+ # set libraries and other variables
+ SET(FFMPEG_${varname}_FOUND 1)
+ SET(FFMPEG_${varname}_INCLUDE_DIR ${FFMPEG_${varname}_INCLUDE_DIRS}) ## also expose the singular *_INCLUDE_DIR name
+ SET(FFMPEG_${varname}_LIBS ${FFMPEG_${varname}_LIBRARIES})
+ ENDIF()
+ ENDMACRO(FFMPEG_FIND)
+
+# Calls to FFMPEG_FIND to get the libraries ------------------------------
+FFMPEG_FIND(LIBAVFORMAT avformat avformat.h)
+FFMPEG_FIND(LIBAVDEVICE avdevice avdevice.h)
+FFMPEG_FIND(LIBAVCODEC avcodec avcodec.h)
+FFMPEG_FIND(LIBAVUTIL avutil avutil.h)
+FFMPEG_FIND(LIBSWSCALE swscale swscale.h)
+
+# check if libs are found and set FFMPEG related variables
+#SET(FFMPEG_FOUND "NO")
+IF(FFMPEG_LIBAVFORMAT_FOUND
+ AND FFMPEG_LIBAVDEVICE_FOUND
+ AND FFMPEG_LIBAVCODEC_FOUND
+ AND FFMPEG_LIBAVUTIL_FOUND
+ AND FFMPEG_LIBSWSCALE_FOUND)
+
+ # All ffmpeg libs are here
+ SET(FFMPEG_FOUND "YES")
+ SET(FFMPEG_INCLUDE_DIR ${FFMPEG_LIBAVFORMAT_INCLUDE_DIRS})
+ SET(FFMPEG_LIBRARY_DIRS ${FFMPEG_LIBAVFORMAT_LIBRARY_DIRS})
+ SET(FFMPEG_LIBRARIES
+ ${FFMPEG_LIBAVFORMAT_LIBS}
+ ${FFMPEG_LIBAVDEVICE_LIBS}
+ ${FFMPEG_LIBAVCODEC_LIBS}
+ ${FFMPEG_LIBAVUTIL_LIBS}
+ ${FFMPEG_LIBSWSCALE_LIBS} )
+
+ # add dynamic libraries
+ if(WIN32)
+  file(GLOB FFMPEG_DYNAMIC_LIBS "${FFMPEG_DIR}/bin/*.dll")
+  if(NOT FFMPEG_DYNAMIC_LIBS)
+   message("FFMPEG_DYNAMIC_LIBS is missing...")
+  endif()
+  set(FFMPEG_DYNAMIC_LIBS ${FFMPEG_DYNAMIC_LIBS} CACHE PATH "Windows dll location")
+ endif()
+
+ mark_as_advanced(FFMPEG_INCLUDE_DIR FFMPEG_LIBRARY_DIRS FFMPEG_LIBRARIES FFMPEG_DYNAMIC_LIBS)
+ELSE ()
+ MESSAGE(STATUS "Could not find FFMPEG")
+ENDIF()
+
+
+INCLUDE(FindPackageHandleStandardArgs)
+FIND_PACKAGE_HANDLE_STANDARD_ARGS(FFMPEG
+ REQUIRED_VARS FFMPEG_INCLUDE_DIR FFMPEG_LIBRARIES
+ FAIL_MESSAGE "FFmpeg wasn't found correctly. Set FFMPEG_DIR to the root SDK installation directory."
+)
+
+if(NOT FFMPEG_FOUND)
+ set(FFMPEG_DIR "" CACHE STRING "Path to FFmpeg install directory")
+endif()
+
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Win3rdParty.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Win3rdParty.cmake
new file mode 100644
index 0000000..7e42fbb
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/Win3rdParty.cmake
@@ -0,0 +1,337 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## This file should be included and used only on WIN32 OS, and only once.
+## It allows auto checking/downloading and using preconfigured 3rdParty binaries from cmake.
+## It uses the downloadAndExtractZipFile cmake module to work.
+##
+if(__Win3rdParty_cmake_INCLUDED__)
+ return()
+else()
+ set(__Win3rdParty_cmake_INCLUDED__ ON)
+endif()
+
+
+##
+## To be sure to reset an empty cached variable but keep any other kind of variable
+##
+## Usage:
+## check_cached_var(<var> <resetedCachedValue> <cacheType> <cacheDoc> [FORCE])
+##
+## <var> is the cached cmake variable you need to reset
+## <resetedCachedValue> is the new default value of the reset cached cmake variable
+## <cacheType> is the kind of GUI cache input; can be : FILEPATH, PATH, STRING or BOOL
+## <cacheDoc> is the associated GUI cache input documentation displayed in the GUI
+## The FORCE option can be used to reset a cached variable even if it is not empty.
+##
+macro(check_cached_var var resetedCachedValue cacheType cacheDoc)
+ # message(STATUS "inside check_cached_var macro. argn=${ARGN}")
+ cmake_parse_arguments(ccv "FORCE" "" "" ${ARGN})
+
+ if(ccv_FORCE)
+ set(FORCE FORCE)
+ else()
+ set(FORCE )
+ endif()
+
+ if(NOT ${var} OR ccv_FORCE)
+ unset(${var} CACHE)
+ # message(STATUS "setting new cache value. var ${var} = ${resetedCachedValue}")
+ set(${var} "${resetedCachedValue}" CACHE ${cacheType} "${cacheDoc}" ${FORCE})
+ endif()
+endmacro()
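+
+## Example (hypothetical variable and path): reset an empty cached PATH, forcing the new value:
+# check_cached_var(MYLIB_DIR "C:/deps/mylib" PATH "Path to the MyLib SDK" FORCE)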
+
+
+##
+## The win3rdParty function allows specifying a directory which contains all necessary windows dependencies.
+## By uploading a 3rdParty directory (which contains the dependencies, *.lib, *.dll... for a specific compiler version) onto the Gforge file tab,
+## you get back a download URL you can give to this function together with a directory name. This way you can provide multiple 3rdParty versions of the same dependencies (MSVC11, MSVC12...).
+## By providing a prefix to this function, you can handle different kinds of 3rdParty sets through CMake OPTIONS, depending on what your framework needs.
+##
+## Usage 1:
+## win3rdParty(<prefix> MSVC<ver> <dirName> <url>
+##             [MSVC<ver> <dirName> <url>] [...]
+##             [VCID] [DEFAULT_USE <ON|OFF>] [VERBOSE] )
+##
+## * <prefix> identifies which 3rdParty you are processing (prefix name)
+## * the MSVC<ver> flag can be MSVC11, MSVC12... (any element of MSVC_VERSIONS_LIST) and introduces a 3rdParty build for that compiler with :
+##    * <dirName> : the local pathName of the downloaded 3rdParty, relative to CMAKE_BINARY_DIR
+##    * <url>     : the download link of the 3rdParty zip
+## * the VCID flag makes a cache variable ${prefix}_WIN3RDPARTY_VCID available
+## * the DEFAULT_USE flag [ON|OFF] may be used to set the default value of the cmake cached variable <prefix>_WIN3RDPARTY_USE [default: ON]
+##
+## WARNING:
+## This function defines CACHE variables you can use afterwards :
+## * ${prefix}_WIN3RDPARTY_USE : whether to check/download the win3rdParty dir (it will force the cached variables for this dependency folder, generally <lib>_DIR)
+## * ${prefix}_WIN3RDPARTY_DIR : where your local win3rdParty dir is (the PATH)
+## * ${prefix}_WIN3RDPARTY_VCID : [if the VCID flag is used] the MSVC id (commonly used to prefix/suffix library names, see boost or CGAL)
+##
+## If you want to add a win3rdParty version, please:
+## 1- build the dependencies on your side with the compiler you want
+## 2- build your own zip with the built dependencies
+## 3- upload it (onto the forge where the project is stored) and copy the link location in order to use it with this function
+## 4- if you just introduced a new MSVC version, add it to the MSVC_VERSIONS_LIST below
+##
+## In a second pass, you can also use this function to set the cmake cached variables needed to let cmake find the packages of these 3rdParty dirs.
+##
+## Usage 2:
+## win3rdParty(<prefix> [VERBOSE] MULTI_SET|SET
+##     CHECK_CACHED_VAR <var> <cacheType> [LIST] <value> [DOC <docString>]
+##   [ CHECK_CACHED_VAR <var> <cacheType> [LIST] <value> [DOC <docString>] ] [...]
+## )
+##
+## * MULTI_SET or SET tells cmake whether the following arguments repeat the CHECK_CACHED_VAR flag with different entries (SET means a single set of arguments, without repetition)
+## * CHECK_CACHED_VAR is the repeated flag which introduces the different entries
+## * <var> is the cmake variable you want to be cached for the project
+## * <cacheType> is the kind of cmake variable (can be: FILEPATH, PATH, STRING or BOOL) => see check_cached_var.
+## * the optional LIST flag can be used with CHECK_CACHED_VAR when <cacheType> = STRING. It allows handling a multi-STRING value list.
+## * <value> is the value of the variable (if FILEPATH, PATH or STRING: use quotes; if BOOL: use ON/OFF)
+## * the optional DOC flag provides a tooltip about this new cmake variable entry in the GUI (use quotes).
+##
+## Full example 1 :
+## win3rdParty(COMMON MSVC11 "win3rdParty-MSVC11" "https://path.to/an.archive.7z"
+## SET CHECK_CACHED_VAR SuiteSparse_DIR PATH "SuiteSparse-4.2.1" DOC "default empty doc"
+## )
+##
+## WARNING:
+## For the 2nd usage (with MULTI_SET), if you plan to set some CACHED_VARs using/composed of the ${prefix}_WIN3RDPARTY_* variables just set by this macro (usage 1),
+## then (because those vars do not exist yet) you will need to call this function twice :
+## once for the 1st usage (downloading of the current compiler 3rdParty),
+## once with the MULTI_SET flag, which will use the existing ${prefix}_WIN3RDPARTY_* cached vars.
+##
+## Full example 2 :
+## win3rdParty(COMMON MSVC11 "win3rdParty-MSVC11" "https://path.to/an.archive.7z")
+## win3rdParty(COMMON MULTI_SET
+## CHECK_CACHED_VAR CGAL_INCLUDE_DIR PATH "CGAL-4.3/include" DOC "default empty doc"
+## CHECK_CACHED_VAR CGAL_LIBRARIES STRING LIST "debug;CGAL-4.3/lib${LIB_POSTFIX}/CGAL-${WIN3RDPARTY_COMMON_VCID}-mt-gd-4.3.lib;optimized;CGAL-4.3/lib${LIB_POSTFIX}/CGAL-${WIN3RDPARTY_COMMON_VCID}-mt-4.3.lib"
+##
+##
+## WARNING: This function internally uses :
+## * downloadAndExtractZipFile.cmake
+## * parse_arguments_multi.cmake
+## * check_cached_var macro
+##
+function(win3rdParty prefix )
+
+ # ARGV: list of all arguments given to the macro/function
+ # ARGN: list of remaining arguments
+
+ if(NOT WIN32)
+ return()
+ endif()
+
+ ## set the handled versions of MSVC
+ ## if you plan to add a win3rdParty dir to download with a new MSVC version: build the win3rdParty dir and add the MSVC entry here.
+ set(MSVC_VERSIONS_LIST "MSVC17;MSVC11;MSVC12;MSVC14")
+
+ #include(CMakeParseArguments) # CMakeParseArguments is obsolete since cmake 3.5
+ # cmake_parse_arguments(<prefix> <options> <one_value_keywords> <multi_value_keywords> args)
+ # <options> : options (flags) passed to the macro
+ # <one_value_keywords> : options that need one value
+ # <multi_value_keywords> : options that need more than one value
+ cmake_parse_arguments(w3p "VCID" "VERBOSE;TIMEOUT;DEFAULT_USE" "${MSVC_VERSIONS_LIST};MULTI_SET;SET" ${ARGN})
+
+ # message(STATUS "value of w3p_VCID = ${w3p_VCID}")
+ # message(STATUS "value of w3p_VERBOSE = ${w3p_VERBOSE}")
+ # message(STATUS "value of w3p_TIMEOUT = ${w3p_TIMEOUT}")
+ # message(STATUS "value of w3p_DEFAULT_USE = ${w3p_DEFAULT_USE}")
+
+ # foreach (loop_var ${MSVC_VERSIONS_LIST})
+ # message(STATUS "value of w3p_${loop_var} = ${w3p_${loop_var}}")
+ # endforeach(loop_var)
+
+ # message(STATUS "value of w3p_MULTI_SET = ${w3p_MULTI_SET}")
+ # message(STATUS "value of w3p_SET = ${w3p_SET}")
+
+ # message("values for MSVC = ${w3p_MSVC14}")
+
+ if(NOT w3p_TIMEOUT)
+ set(w3p_TIMEOUT 300)
+ endif()
+
+ if(NOT DEFINED w3p_DEFAULT_USE)
+ set(w3p_DEFAULT_USE ON)
+ endif()
+
+
+ ## 1st use (check/update|download) :
+ set(${prefix}_WIN3RDPARTY_USE ${w3p_DEFAULT_USE} CACHE BOOL "Use required 3rdParty binaries from ${prefix}_WIN3RDPARTY_DIR or download it if not exist")
+
+
+ ## We want to test if each version of MSVC was filled by the function (see associated parameters)
+ ## As CMake is running only for one version of MSVC, if that MSVC version was filled, we get back associated parameters,
+ ## otherwise we can't use the downloadAndExtractZipFile with win3rdParty.
+ set(enableWin3rdParty OFF)
+
+ foreach(MSVC_VER ${MSVC_VERSIONS_LIST})
+ if((${MSVC_VER} AND w3p_${MSVC_VER}) OR (${MSVC_TOOLSET_VERSION} EQUAL 143 AND ${MSVC_VER} STREQUAL "MSVC17")) ## parenthesized: AND/OR chains in if() have no precedence and are evaluated left to right
+ list(LENGTH w3p_${MSVC_VER} count)
+ if("${count}" LESS "2")
+ #message(WARNING "You are using ${MSVC_VER} with ${prefix}_WIN3RDPARTY_USE=${${prefix}_WIN3RDPARTY_USE}, but win3rdParty function isn't filled for ${MSVC_VER}!")
+ else()
+ list(GET w3p_${MSVC_VER} 0 Win3rdPartyName)
+ list(GET w3p_${MSVC_VER} 1 Win3rdPartyUrl)
+ if(w3p_VCID)
+ ## try to get the VcId of MSVC. See also MSVC_VERSION cmake var in the doc.
+ string(REGEX REPLACE "MS([A-Za-z_0-9-]+)" "\\1" vcId ${MSVC_VER})
+ string(TOLOWER ${vcId} vcId)
+ set(${prefix}_WIN3RDPARTY_VCID "${vcId}0" CACHE STRING "the MSVC id (commonly used to prefix/suffix library name, see boost or CGAL)")
+ mark_as_advanced(${prefix}_WIN3RDPARTY_VCID)
+ endif()
+ set(enableWin3rdParty ON)
+ set(suffixCompilerID ${MSVC_VER})
+ break()
+ endif()
+ endif()
+ endforeach()
+ ## If the previous step succeeded in getting the MSVC dirname and URL for the current MSVC version, use them to auto download/update the win3rdParty dir
+ if(enableWin3rdParty AND ${prefix}_WIN3RDPARTY_USE)
+
+ if(NOT IS_ABSOLUTE "${Win3rdPartyName}")
+  set(Win3rdPartyName "${CMAKE_BINARY_DIR}/${Win3rdPartyName}")
+ endif()
+
+ if(NOT EXISTS "${Win3rdPartyName}")
+ file(MAKE_DIRECTORY ${Win3rdPartyName})
+ endif()
+
+ include(downloadAndExtractZipFile)
+ downloadAndExtractZipFile( "${Win3rdPartyUrl}" ## URL link location
+ "Win3rdParty-${prefix}-${suffixCompilerID}.7z" ## where to download it: relative path, so defaults to CMAKE_BINARY_DIR
+ "${Win3rdPartyName}" ## where to extract it : fullPath (default relative to CMAKE_BINARY_DIR)
+ CHECK_DIRTY_URL "${Win3rdPartyName}/Win3rdPartyUrl" ## last downloaded url file : fullPath (default relative to CMAKE_BINARY_DIR)
+ TIMEOUT ${w3p_TIMEOUT}
+ VERBOSE ${w3p_VERBOSE}
+ )
+ file(GLOB checkDl "${Win3rdPartyName}/*")
+ list(LENGTH checkDl checkDlCount)
+ if("${checkDlCount}" LESS "2")
+  message("The downloadAndExtractZipFile didn't work...?")
+  set(enableWin3rdParty OFF)
+ endif()
+ endif()
+
+ ## Try to auto set ${prefix}_WIN3RDPARTY_DIR or let user set it manually
+ set(${prefix}_WIN3RDPARTY_DIR "" CACHE PATH "windows ${Win3rdPartyName} dir to ${prefix} dependencies of the project")
+
+ if(NOT ${prefix}_WIN3RDPARTY_DIR AND ${prefix}_WIN3RDPARTY_USE)
+ if(EXISTS "${Win3rdPartyName}")
+ unset(${prefix}_WIN3RDPARTY_DIR CACHE)
+ set(${prefix}_WIN3RDPARTY_DIR "${Win3rdPartyName}" CACHE PATH "dir to ${prefix} dependencies of the project")
+ endif()
+ endif()
+
+ if(EXISTS ${${prefix}_WIN3RDPARTY_DIR})
+ message(STATUS "Found a 3rdParty ${prefix} dir : ${${prefix}_WIN3RDPARTY_DIR}.")
+ set(enableWin3rdParty ON)
+ elseif(${prefix}_WIN3RDPARTY_USE)
+ message(WARNING "${prefix}_WIN3RDPARTY_USE=${${prefix}_WIN3RDPARTY_USE} but ${prefix}_WIN3RDPARTY_DIR=${${prefix}_WIN3RDPARTY_DIR}.")
+ set(enableWin3rdParty OFF)
+ endif()
+
+ ## Final check
+ if(NOT enableWin3rdParty)
+ message("Disabling ${prefix}_WIN3RDPARTY_USE (the cmake cached vars will not be set), due to a win3rdParty problem.")
+ message("You can still set ${prefix}_WIN3RDPARTY_DIR to an already downloaded Win3rdParty directory location.")
+ set(${prefix}_WIN3RDPARTY_USE OFF CACHE BOOL "Use required 3rdParty binaries from ${prefix}_WIN3RDPARTY_DIR or download it if not exist" FORCE)
+ endif()
+
+ ## 2nd use : handle multi values args to set cached cmake variables in order to ease the next find_package call
+ if(${prefix}_WIN3RDPARTY_USE AND ${prefix}_WIN3RDPARTY_DIR)
+ if(w3p_VERBOSE)
+ message(STATUS "Try to set cmake cached variables for ${prefix} required libraries directly from : ${${prefix}_WIN3RDPARTY_DIR}.")
+ endif()
+
+ include(parse_arguments_multi)
+ # message (STATUS "before defining an override of parse_arguments_multi_function")
+ function(parse_arguments_multi_function ) ## overloaded function to handle all CHECK_CACHED_VAR values list (see: parse_arguments_multi)
+ # message(STATUS "inside overloaded parse_arguments_multi_function defined in Win3rdParty.cmake")
+ # message(STATUS ${ARGN})
+ ## we know the function takes 3 args : var cacheType resetedCachedValue (see check_cached_var)
+ cmake_parse_arguments(pamf "" "DOC" "LIST" ${ARGN})
+
+ ## var and cacheType are mandatory (with the resetedCachedValue)
+ set(var ${ARGV0})
+ set(cacheType ${ARGV1})
+ # message(STATUS "var=${var} and cacheType=${cacheType} list=${pamf_LIST}")
+ if(pamf_DOC)
+ set(cacheDoc ${pamf_DOC})
+ else()
+ set(cacheDoc "")
+ endif()
+ if(pamf_LIST)
+ set(value ${pamf_LIST})
+ else()
+ # message("USING ARGV2 with value ${ARGV2}")
+ set(value ${ARGV2})
+ endif()
+ # message("inside override function in Win3rdparty.cmake value+ ${value}")
+ if("${cacheType}" MATCHES "PATH" AND EXISTS "${${prefix}_WIN3RDPARTY_DIR}/${value}")
+ # message("math with path")
+ set(resetedCachedValue "${${prefix}_WIN3RDPARTY_DIR}/${value}") ## path relative to ${prefix}_WIN3RDPARTY_DIR
+ elseif ("${cacheType}" MATCHES "PATH" AND EXISTS "${${prefix}_WIN3RDPARTY_DIR}")
+ set(resetedCachedValue "${${prefix}_WIN3RDPARTY_DIR}") ## path relative to ${prefix}_WIN3RDPARTY_DIR
+ elseif("${cacheType}" MATCHES "STRING")
+ foreach(var IN LISTS value)
+ if(EXISTS "${${prefix}_WIN3RDPARTY_DIR}/${var}")
+ list(APPEND resetedCachedValue "${${prefix}_WIN3RDPARTY_DIR}/${var}") ## string item of the string list is a path => make relative to ${prefix}_WIN3RDPARTY_DIR
+ else()
+ list(APPEND resetedCachedValue ${var}) ## string item of the string list is not an existing path => simply use the item
+ endif()
+ endforeach()
+ else()
+ set(resetedCachedValue "${value}") ## could be a BOOL or a STRING
+ endif()
+
+ ## call our macro to reset cmake cache variable if empty
+ check_cached_var(${var} "${resetedCachedValue}" ${cacheType} "${cacheDoc}" FORCE)
+
+ endfunction()
+ # message (STATUS "after defining an override of parse_arguments_multi_function")
+
+ if(w3p_MULTI_SET)
+ parse_arguments_multi(CHECK_CACHED_VAR w3p_MULTI_SET ${w3p_MULTI_SET}) ## internally calls our overloaded parse_arguments_multi_function
+ elseif(w3p_SET)
+ # message("calling set version of parse_arguments_multi with w3p_set = ${w3p_SET}")
+ parse_arguments_multi(CHECK_CACHED_VAR w3p_SET ${w3p_SET})
+ endif()
+
+ endif()
+
+endfunction()
+
+## cmake cache introspection to globally activate/deactivate all ${prefix}_WIN3RDPARTY_USE variables.
+## This "one shot" call (it only applies to the next cmake configure) automatically resets the global variable WIN3RDPARTY_USE to UserDefined (do nothing) afterwards.
+## Use it (call it) before and after all your win3rdParty function calls.
+function(Win3rdPartyGlobalCacheAction )
+ set(WIN3RDPARTY_USE "UserDefined" CACHE STRING "Choose how to handle all cmake cached *_WIN3RDPARTY_USE for the next configure.\nCould be:\nUserDefined [default]\nActivateAll\nDesactivateAll" )
+ set_property(CACHE WIN3RDPARTY_USE PROPERTY STRINGS "UserDefined;ActivateAll;DesactivateAll" )
+ if(NOT ${WIN3RDPARTY_USE} MATCHES "UserDefined")
+ if(${WIN3RDPARTY_USE} MATCHES "ActivateAll")
+ set(win3rdPvalue ON)
+ elseif(${WIN3RDPARTY_USE} MATCHES "DesactivateAll")
+ set(win3rdPvalue OFF)
+ endif()
+ get_cmake_property(_variableNames CACHE_VARIABLES)
+ foreach (_variableName ${_variableNames})
+ string(REGEX MATCH "[A-Za-z_0-9-]+_WIN3RDPARTY_USE" win3rdpartyUseCacheVar ${_variableName})
+ if(win3rdpartyUseCacheVar)
+ string(REGEX REPLACE "([A-Za-z_0-9-]+_WIN3RDPARTY_USE)" "\\1" win3rdpartyUseCacheVar ${_variableName})
+ set(${win3rdpartyUseCacheVar} ${win3rdPvalue} CACHE BOOL "Use required 3rdParty binaries from ${prefix}_WIN3RDPARTY_DIR or download it if not exist" FORCE)
+ message(STATUS "${win3rdpartyUseCacheVar} cached variable set to ${win3rdPvalue}.")
+ endif()
+ endforeach()
+ set(WIN3RDPARTY_USE "UserDefined" CACHE STRING "Choose how to handle all cmake cached *_WIN3RDPARTY_USE for the next configure.\nCould be:\nUserDefined [default]\nActivateAll\nDesactivateAll" FORCE)
+ message(STATUS "reset WIN3RDPARTY_USE to UserDefined.")
+ endif()
+ mark_as_advanced(WIN3RDPARTY_USE)
+endfunction()
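+
+## Example (hypothetical prefixes and URLs): wrap your dependency declarations between two calls,
+## so a global ActivateAll/DesactivateAll choice is applied to every *_WIN3RDPARTY_USE cached variable:
+# Win3rdPartyGlobalCacheAction()
+# win3rdParty(FOO MSVC14 "win3rdParty-MSVC14" "https://example.org/foo.7z")
+# win3rdParty(BAR MSVC14 "win3rdParty-MSVC14" "https://example.org/bar.7z")
+# Win3rdPartyGlobalCacheAction()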
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/cmake_policies.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/cmake_policies.cmake
new file mode 100644
index 0000000..679fd84
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/cmake_policies.cmake
@@ -0,0 +1,19 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+if(__set_policies_INCLUDED__)
+ return()
+else()
+ set(__set_policies_INCLUDED__ ON)
+endif()
+
+macro(setPolicies)
+ # No more policies to enforce
+endmacro()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/dependencies.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/dependencies.cmake
new file mode 100644
index 0000000..d6d4e2d
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/dependencies.cmake
@@ -0,0 +1,292 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## Included once for all sub projects.
+## It contains all the cmake instructions needed to find the necessary common dependencies.
+## The 3rdParty binaries (provided by sibr_addlibrary / win3rdParty, or by external packages) are then available in the cmake sub projects.
+##
+## Do not include this file more than once, but you can modify it to fit your own project.
+## Please read it carefully, since you may reuse one of these dependencies for your project or append new ones.
+##
+## As it is included after the cmake options, you can use conditional if()/endif() to encapsulate your 3rdParty.
+##
+
+## The win3rdParty function allows auto checking/downloading/updating binary dependencies for the current windows compiler.
+## Please open that file in order to get more documentation and usage examples.
+include(Win3rdParty)
+
+include(sibr_library)
+
+Win3rdPartyGlobalCacheAction()
+
+find_package(OpenGL REQUIRED)
+
+############
+## Find GLEW
+############
+if (MSVC11 OR MSVC12)
+ set(glew_multiset_arguments
+ CHECK_CACHED_VAR GLEW_INCLUDE_DIR PATH "glew-1.10.0/include" DOC "default empty doc"
+ CHECK_CACHED_VAR GLEW_LIBRARIES STRING LIST "debug;glew-1.10.0/${LIB_BUILT_DIR}/glew32d.lib;optimized;glew-1.10.0/${LIB_BUILT_DIR}/glew32.lib" DOC "default empty doc"
+ )
+elseif (MSVC14)
+ set(glew_multiset_arguments
+ CHECK_CACHED_VAR GLEW_INCLUDE_DIR PATH "glew-2.0.0/include" DOC "default empty doc"
+ CHECK_CACHED_VAR GLEW_SHARED_LIBRARY_RELEASE PATH "glew-2.0.0/${LIB_BUILT_DIR}/glew32.lib"
+ CHECK_CACHED_VAR GLEW_STATIC_LIBRARY_RELEASE PATH "glew-2.0.0/${LIB_BUILT_DIR}/glew32s.lib"
+ CHECK_CACHED_VAR GLEW_SHARED_LIBRARY_DEBUG PATH "glew-2.0.0/${LIB_BUILT_DIR}/glew32d.lib"
+ CHECK_CACHED_VAR GLEW_STATIC_LIBRARY_DEBUG PATH "glew-2.0.0/${LIB_BUILT_DIR}/glew32sd.lib"
+ )
+else ()
+ message("There is no provided GLEW library for your version of MSVC")
+endif()
+sibr_addlibrary(NAME GLEW #VERBOSE ON
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/glew-1.10.0.7z"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/glew-1.10.0.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/glew-2.0.0.7z" # using recompiled version of glew
+ MULTI_SET ${glew_multiset_arguments}
+)
+set(GLEW_VERBOSE ON)
+FIND_PACKAGE(GLEW REQUIRED)
+IF(GLEW_FOUND)
+ INCLUDE_DIRECTORIES(${GLEW_INCLUDE_DIR})
+ELSE(GLEW_FOUND)
+ MESSAGE("GLEW not found. Set GLEW_DIR to base directory of GLEW.")
+ENDIF(GLEW_FOUND)
+
+
+##############
+## Find ASSIMP
+##############
+if (MSVC11 OR MSVC12)
+ set(assimp_set_arguments
+ CHECK_CACHED_VAR ASSIMP_DIR PATH "Assimp_3.1_fix"
+ )
+elseif (MSVC14)
+ set(assimp_set_arguments
+ CHECK_CACHED_VAR ASSIMP_DIR PATH "Assimp-4.1.0"
+ )
+else ()
+ message("There is no provided ASSIMP library for your version of MSVC")
+endif()
+
+sibr_addlibrary(NAME ASSIMP #VERBOSE ON
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/Assimp_3.1_fix.7z"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/Assimp_3.1_fix.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/Assimp-4.1.0.7z"
+ MULTI_SET
+ ${assimp_set_arguments}
+)
+
+find_package(ASSIMP REQUIRED)
+include_directories(${ASSIMP_INCLUDE_DIR})
+
+################
+## Find FFMPEG
+################
+sibr_addlibrary(NAME FFMPEG
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/ffmpeg.zip"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/ffmpeg.zip"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/ffmpeg-4.0.2-win64-win3rdParty.7z"
+ SET CHECK_CACHED_VAR FFMPEG_DIR PATH ${FFMPEG_WIN3RDPARTY_DIR}
+)
+find_package(FFMPEG QUIET)
+include_directories(${FFMPEG_INCLUDE_DIR})
+
+###################
+## Find embree3
+###################
+sibr_addlibrary(
+ NAME embree3
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/embree2.7.0.x64.windows.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/embree-3.6.1.x64.vc14.windows.7z" # TODO SV: provide a valid version if required
+)
+
+###################
+## Find eigen3
+###################
+sibr_addlibrary(
+ NAME eigen3
+ #MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/eigen-eigen-dc6cfdf9bcec.7z"
+ #MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/eigen-eigen-dc6cfdf9bcec.7z" # TODO SV: provide a valid version if required
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/eigen3.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/eigen3.7z"
+ SET CHECK_CACHED_VAR eigen3_DIR PATH "eigen/share/eigen3/cmake"
+)
+include_directories(/usr/include/eigen3)
+add_definitions(-DEIGEN_INITIALIZE_MATRICES_BY_ZERO)
+
+#############
+## Find Boost
+#############
+set(Boost_REQUIRED_COMPONENTS "system;chrono;filesystem;date_time" CACHE INTERNAL "Boost Required Components")
+
+if (WIN32)
+ # boost multiset arguments
+ if (MSVC11 OR MSVC12)
+ set(boost_multiset_arguments
+ CHECK_CACHED_VAR BOOST_ROOT PATH "boost_1_55_0"
+ CHECK_CACHED_VAR BOOST_INCLUDEDIR PATH "boost_1_55_0"
+ CHECK_CACHED_VAR BOOST_LIBRARYDIR PATH "boost_1_55_0/${LIB_BUILT_DIR}"
+ #CHECK_CACHED_VAR Boost_COMPILER STRING "-${Boost_WIN3RDPARTY_VCID}" DOC "vcid (eg: -vc110 for MSVC11)"
+ CHECK_CACHED_VAR Boost_COMPILER STRING "-vc110" DOC "vcid (eg: -vc110 for MSVC11)" # NOTE: if it doesnt work, uncomment this option and set the right value for VisualC id
+ )
+ elseif (MSVC14)
+ set(boost_multiset_arguments
+ CHECK_CACHED_VAR BOOST_ROOT PATH "boost-1.71"
+ CHECK_CACHED_VAR BOOST_INCLUDEDIR PATH "boost-1.71"
+ CHECK_CACHED_VAR BOOST_LIBRARYDIR PATH "boost-1.71/${LIB_BUILT_DIR}"
+ CHECK_CACHED_VAR Boost_COMPILER STRING "-vc141" DOC "vcid (eg: -vc110 for MSVC11)" # NOTE: if it doesnt work, uncomment this option and set the right value for VisualC id
+ )
+
+ option(BOOST_MINIMAL_VERSION "Only get minimal Boost dependencies" ON)
+
+ if(${BOOST_MINIMAL_VERSION})
+ set(BOOST_MSVC14_ZIP "boost-1.71-ibr-minimal.7z")
+ else()
+ set(BOOST_MSVC14_ZIP "boost-1.71.7z")
+ endif()
+ else ()
+ message("There is no provided Boost library for your version of MSVC")
+ endif()
+
+ sibr_addlibrary(NAME Boost VCID TIMEOUT 600 #VERBOSE ON
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/boost_1_55_0.7z"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC11-splitted%20version/boost_1_55_0.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/${BOOST_MSVC14_ZIP}" # boost compatible with msvc14
+ MULTI_SET ${boost_multiset_arguments}
+ CHECK_CACHED_VAR Boost_NO_SYSTEM_PATHS BOOL ON DOC "Set to ON to disable searching in locations not specified by these boost cached hint variables"
+ CHECK_CACHED_VAR Boost_NO_BOOST_CMAKE BOOL ON DOC "Set to ON to disable the search for boost-cmake (package cmake config file if boost was built with cmake)"
+ )
+ if(NOT Boost_COMPILER AND Boost_WIN3RDPARTY_USE)
+ message(WARNING "Boost_COMPILER is not set and it's needed.")
+ endif()
+endif()
+
+find_package(Boost 1.71.0 REQUIRED COMPONENTS ${Boost_REQUIRED_COMPONENTS})
+
+if(WIN32)
+ add_compile_options("$<$<COMPILE_LANGUAGE:CXX>:/EHsc>")
+ #add_definitions(/EHsc)
+endif()
+
+if(Boost_LIB_DIAGNOSTIC_DEFINITIONS)
+ add_definitions(${Boost_LIB_DIAGNOSTIC_DEFINITIONS})
+endif()
+
+#if(WIN32)
+ add_definitions(-DBOOST_ALL_DYN_LINK -DBOOST_ALL_NO_LIB)
+#endif()
+
+include_directories(${BOOST_INCLUDEDIR} ${Boost_INCLUDE_DIRS})
+link_directories(${BOOST_LIBRARYDIR} ${Boost_LIBRARY_DIRS})
+
+
+##############
+## Find OpenMP
+##############
+find_package(OpenMP)
+
+sibr_addlibrary(
+ NAME NativeFileDialog
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/nfd.7z"
+)
+
+##############
+## Find OpenCV
+##############
+if (WIN32)
+ if (${MSVC_TOOLSET_VERSION} EQUAL 143)
+ MESSAGE("SPECIAL OPENCV HANDLING")
+ set(opencv_set_arguments
+ CHECK_CACHED_VAR OpenCV_DIR PATH "install" ## see OpenCVConfig.cmake
+ )
+ elseif (MSVC11 OR MSVC12)
+ set(opencv_set_arguments
+ CHECK_CACHED_VAR OpenCV_DIR PATH "opencv/build" ## see OpenCVConfig.cmake
+ )
+ elseif (MSVC14)
+ set(opencv_set_arguments
+ CHECK_CACHED_VAR OpenCV_DIR PATH "opencv-4.5.0/build" ## see OpenCVConfig.cmake
+ )
+ else ()
+ message("There is no provided OpenCV library for your compiler, relying on find_package to find it")
+ endif()
+else()
+ message("There is no provided OpenCV library for your compiler, relying on find_package to find it")
+endif()
+
+sibr_addlibrary(NAME OpenCV #VERBOSE ON
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/opencv.7z"
+ MSVC12 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/opencv.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/opencv-4.5.0.7z" # opencv compatible with msvc14 and with contribs
+ MSVC17 "https://repo-sam.inria.fr/fungraph/dependencies/sibr/~0.9/opencv4-8.7z"
+ SET ${opencv_set_arguments}
+ )
+find_package(OpenCV REQUIRED) ## Use directly the OpenCVConfig.cmake provided
+
+ ##https://stackoverflow.com/questions/24262081/cmake-relwithdebinfo-links-to-debug-libs
+set_target_properties(${OpenCV_LIBS} PROPERTIES MAP_IMPORTED_CONFIG_RELWITHDEBINFO RELEASE)
+
+add_definitions(-DOPENCV_TRAITS_ENABLE_DEPRECATED)
+
+if(OpenCV_INCLUDE_DIRS)
+ foreach(inc ${OpenCV_INCLUDE_DIRS})
+ if(NOT EXISTS ${inc})
+ set(OpenCV_INCLUDE_DIR "" CACHE PATH "additional custom include DIR (in case of trouble to find it (fedora 17 opencv package))")
+ endif()
+ endforeach()
+ if(OpenCV_INCLUDE_DIR)
+ list(APPEND OpenCV_INCLUDE_DIRS ${OpenCV_INCLUDE_DIR})
+ include_directories(${OpenCV_INCLUDE_DIRS})
+ endif()
+endif()
+
+###################
+## Find GLFW
+###################
+sibr_addlibrary(
+ NAME GLFW
+ MSVC11 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/glfw-3.2.1.7z"
+ MSVC14 "https://repo-sam.inria.fr/fungraph/dependencies/ibr-common/win3rdParty-MSVC15-splitted%20version/glfw-3.2.1.7z" # TODO SV: provide a valid version if required
+)
+
+sibr_gitlibrary(TARGET imgui
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/imgui.git"
+ GIT_TAG "e7f0fa31b9fa3ee4ecd2620b9951f131b4e377c6"
+)
+
+sibr_gitlibrary(TARGET mrf
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/mrf.git"
+ GIT_TAG "564e5e0b395c788d2f8b2cf4f879fed2493faea7"
+)
+
+sibr_gitlibrary(TARGET nanoflann
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/nanoflann.git"
+ GIT_TAG "7a20a9ac0a1d34850fc3a9e398fc4a7618e8a69a"
+)
+
+sibr_gitlibrary(TARGET picojson
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/picojson.git"
+ GIT_TAG "7cf8feee93c8383dddbcb6b64cf40b04e007c49f"
+)
+
+sibr_gitlibrary(TARGET rapidxml
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/rapidxml.git"
+ GIT_TAG "069e87f5ec5ce1745253bd64d89644d6b894e516"
+)
+
+sibr_gitlibrary(TARGET xatlas
+ GIT_REPOSITORY "https://gitlab.inria.fr/sibr/libs/xatlas.git"
+ GIT_TAG "0fbe06a5368da13fcdc3ee48d4bdb2919ed2a249"
+ INCLUDE_DIRS "source/xatlas"
+)
+
+Win3rdPartyGlobalCacheAction()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/downloadAndExtractZipFile.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/downloadAndExtractZipFile.cmake
new file mode 100644
index 0000000..7f5fc2b
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/downloadAndExtractZipFile.cmake
@@ -0,0 +1,243 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## downloadAndExtractZipFile cmake function
+## Provides a way to download a zip file from a public internet ZIP_URL host
+## and to extract it into a specific EXCTRATED_ZIP_PATH destination.
+## This function uses the external 7-Zip tool to support as many archive formats as possible.
+## The archive will not be downloaded again if EXCTRATED_ZIP_PATH already exists and DL_FORCE is set to OFF.
+## If a zip file already exists in ZIP_DL_PATH, this function will try to unzip it directly.
+##
+## If EXCTRATED_ZIP_PATH and/or ZIP_DL_PATH are not absolute paths,
+## they are interpreted relative to CMAKE_BINARY_DIR.
+##
+## Usage example :
+## include(downloadAndExtractZipFile)
+## downloadAndExtractZipFile(
+## http://www.cs.cornell.edu/~snavely/bundler/distr/bundler-v0.4-source.zip
+## ${CMAKE_BINARY_DIR}/Bundler/bundler-v0.4-source.zip
+## ${CMAKE_BINARY_DIR}/Bundler
+## [DL_FORCE ON|OFF]
+## [TIMEOUT]
+## [CHECK_DIRTY_URL]
+## )
+##
+## option DL_FORCE will redownload the zip file [default to OFF]
+## option TIMEOUT will end the unzip process after this period of time [default to 600s]
+## option CHECK_DIRTY_URL will write the downloaded URL into the given file; on the next run,
+## if the URL has been updated, the change is detected via that file
+## and the latest version is downloaded. This avoids having to always set DL_FORCE to ON manually.
+##
+if(__downloadAndExtractZipFile_cmake_INCLUDED__)
+ return()
+else()
+ set(__downloadAndExtractZipFile_cmake_INCLUDED__ ON)
+endif()
+
+function(downloadAndExtractZipFile ZIP_URL ZIP_DL_PATH EXCTRATED_ZIP_PATH)
+
+ # message(STATUS "zipUrl=${ZIP_URL} zipDlPath=${ZIP_DL_PATH} extractedZipPath=${EXCTRATED_ZIP_PATH}")
+ cmake_parse_arguments(dwnlezf "" "VERBOSE;DL_FORCE;TIMEOUT;CHECK_DIRTY_URL" "" ${ARGN})
+
+ set(PROGRAMFILESx86 "PROGRAMFILES(x86)")
+
+ ## Check entries mandatory args
+ if(IS_ABSOLUTE "${ZIP_DL_PATH}")
+ else()
+ set(ZIP_DL_PATH "${CMAKE_BINARY_DIR}/${ZIP_DL_PATH}")
+ endif()
+ if(IS_ABSOLUTE "${EXCTRATED_ZIP_PATH}")
+ else()
+ set(EXCTRATED_ZIP_PATH "${CMAKE_BINARY_DIR}/${EXCTRATED_ZIP_PATH}")
+ endif()
+ if(NOT EXISTS "${EXCTRATED_ZIP_PATH}")
+ file(MAKE_DIRECTORY ${EXCTRATED_ZIP_PATH})
+ endif()
+
+ # SB: Once, one of the downloaded zips was corrupted by an error message coming from the server.
+ if(EXISTS "${ZIP_DL_PATH}")
+ # So we remove any such potentially corrupted file
+ message("Removing previous ${ZIP_DL_PATH} (might be corrupted)")
+ file(REMOVE "${ZIP_DL_PATH}")
+ if(EXISTS "${dwnlezf_CHECK_DIRTY_URL}")
+ # and remove the previously generated (possibly corrupted) 'Win3rdPartyUrl' file
+ file(REMOVE "${dwnlezf_CHECK_DIRTY_URL}")
+ endif()
+ endif()
+
+ ## Check entries optional args
+ macro(readDirtyUrl )
+ if(dwnlezf_CHECK_DIRTY_URL)
+ if(IS_ABSOLUTE "${dwnlezf_CHECK_DIRTY_URL}")
+ else()
+ set(dwnlezf_CHECK_DIRTY_URL "${CMAKE_BINARY_DIR}/${dwnlezf_CHECK_DIRTY_URL}")
+ endif()
+ get_filename_component(unzipDir ${EXCTRATED_ZIP_PATH} NAME)
+ get_filename_component(unzipPath ${EXCTRATED_ZIP_PATH} PATH)
+ message(STATUS "Checking ${unzipDir} [from ${unzipPath}]...")
+ if(EXISTS "${dwnlezf_CHECK_DIRTY_URL}")
+ get_filename_component(CHECK_DIRTY_URL_FILENAME ${dwnlezf_CHECK_DIRTY_URL} NAME)
+ file(STRINGS "${dwnlezf_CHECK_DIRTY_URL}" contents)
+ list(GET contents 0 downloadURL)
+ list(REMOVE_AT contents 0)
+ if("${downloadURL}" MATCHES "${ZIP_URL}")
+ if(dwnlezf_VERBOSE)
+ message(STATUS "Your downloaded version (URL) seems up to date. Checking that nothing is missing... (see ${dwnlezf_CHECK_DIRTY_URL}).")
+ endif()
+ file(GLOB PATHNAME_PATTERN_LIST "${EXCTRATED_ZIP_PATH}/*") ## is there something inside the downloaded destination ?
+ unset(NAME_PATTERN_LIST)
+ foreach(realPathPattern ${PATHNAME_PATTERN_LIST})
+ get_filename_component(itemName ${realPathPattern} NAME)
+ list(APPEND NAME_PATTERN_LIST ${itemName})
+ endforeach()
+ if(NAME_PATTERN_LIST)
+ foreach(item ${contents})
+ list(FIND NAME_PATTERN_LIST ${item} id)
+ if(${id} MATCHES "-1")
+ message(STATUS "${item} is missing; the content of your downloaded version changed, so it needs to be redownloaded.")
+ set(ZIP_DL_FORCE ON)
+ break()
+ else()
+ list(REMOVE_AT NAME_PATTERN_LIST ${id})
+ set(ZIP_DL_FORCE OFF)
+ endif()
+ endforeach()
+ if(NOT ZIP_DL_FORCE AND NAME_PATTERN_LIST)
+ message("Your copy seems up to date (according to ${CHECK_DIRTY_URL_FILENAME})!\nBut there are additional files/folders in your download destination (feel free to clean it up if you want).")
+ foreach(item ${NAME_PATTERN_LIST})
+ if(item)
+ message("${item}")
+ endif()
+ endforeach()
+ endif()
+ endif()
+ else()
+ set(ZIP_DL_FORCE ON)
+ message(STATUS "Your downloaded version is dirty (out of date).")
+ endif()
+ else()
+ file(GLOB PATHNAME_PATTERN_LIST "${EXCTRATED_ZIP_PATH}/*") ## is there something inside the downloaded destination ?
+ if(NOT PATHNAME_PATTERN_LIST)
+ message("Nothing was found in ${EXCTRATED_ZIP_PATH}; it will now be downloaded for you.")
+ endif()
+ set(ZIP_DL_FORCE ON)
+ endif()
+ endif()
+ endmacro()
+ readDirtyUrl()
+ if(NOT ZIP_DL_FORCE)
+ return() ## nothing more to do (we are up to date), so just exit the function
+ endif()
+
+ macro(writeDirtyUrl )
+ if(dwnlezf_CHECK_DIRTY_URL)
+ file(WRITE "${dwnlezf_CHECK_DIRTY_URL}" "${ZIP_URL}\n")
+ file(GLOB PATHNAME_PATTERN_LIST "${EXCTRATED_ZIP_PATH}/*") ## is there something inside the downloaded destination ?
+ unset(NAME_PATTERN_LIST)
+ foreach(realPathPattern ${PATHNAME_PATTERN_LIST})
+ get_filename_component(itemName ${realPathPattern} NAME)
+ list(APPEND NAME_PATTERN_LIST ${itemName})
+ endforeach()
+ if(NAME_PATTERN_LIST)
+ foreach(item ${NAME_PATTERN_LIST})
+ file(APPEND "${dwnlezf_CHECK_DIRTY_URL}" "${item}\n")
+ endforeach()
+ endif()
+ endif()
+ endmacro()
+
+ if(dwnlezf_DL_FORCE)
+ set(ZIP_DL_FORCE ON)
+ endif()
+
+ if(NOT dwnlezf_TIMEOUT)
+ set(dwnlezf_TIMEOUT 600)
+ endif()
+ math(EXPR dwnlezf_TIMEOUT_MIN "${dwnlezf_TIMEOUT}/60")
+
+ macro(unzip whichZipFile)
+ if(NOT SEVEN_ZIP_CMD)
+ find_program(SEVEN_ZIP_CMD NAMES 7z 7za p7zip DOC "7-zip executable" PATHS "$ENV{PROGRAMFILES}/7-Zip" "$ENV{${PROGRAMFILESx86}}/7-Zip" "$ENV{ProgramW6432}/7-Zip")
+ endif()
+ if(SEVEN_ZIP_CMD)
+ if(dwnlezf_VERBOSE)
+ message(STATUS "UNZIP: please WAIT UNTIL ${SEVEN_ZIP_CMD} finishes...\n(no more than ${dwnlezf_TIMEOUT_MIN} min)")
+ else()
+ message(STATUS "UNZIP...wait...")
+ endif()
+ execute_process( COMMAND ${SEVEN_ZIP_CMD} x ${whichZipFile} -y
+ WORKING_DIRECTORY ${EXCTRATED_ZIP_PATH} TIMEOUT ${dwnlezf_TIMEOUT}
+ RESULT_VARIABLE resVar OUTPUT_VARIABLE outVar ERROR_VARIABLE errVar
+ )
+ if(${resVar} MATCHES "0")
+ if(dwnlezf_VERBOSE)
+ message(STATUS "SUCCESS unzipping into ${EXCTRATED_ZIP_PATH}. Now removing the downloaded zip file.")
+ endif()
+ execute_process(COMMAND ${CMAKE_COMMAND} -E remove ${whichZipFile})
+ mark_as_advanced(SEVEN_ZIP_CMD)
+ else()
+ message(WARNING "Something went wrong in ${EXCTRATED_ZIP_PATH}\n with \"${SEVEN_ZIP_CMD} x ${whichZipFile} -y\"; retry, or try to unzip it yourself...")
+ message("unzip: resVar=${resVar}")
+ message("unzip: outVar=${outVar}")
+ message("unzip: errVar=${errVar}")
+ message("unzip: failed or canceled or timeout")
+ endif()
+ else()
+ message(WARNING "You need 7-Zip (http://www.7-zip.org/download.html) to unzip the downloaded archive.")
+ set(SEVEN_ZIP_CMD "" CACHE FILEPATH "7-zip executable")
+ mark_as_advanced(CLEAR SEVEN_ZIP_CMD)
+ endif()
+ endmacro()
+
+ if(dwnlezf_VERBOSE)
+ message(STATUS "Checking whether a zip file already exists at ${ZIP_DL_PATH}...")
+ endif()
+ if(EXISTS "${ZIP_DL_PATH}")
+
+ ## already downloaded, so just unzip it
+ unzip(${ZIP_DL_PATH})
+ writeDirtyUrl()
+
+ elseif(ZIP_DL_FORCE)
+
+ ## the download part (+ unzip)
+ message(STATUS "Trying to download the package for you: ${ZIP_URL}")
+ if(dwnlezf_VERBOSE)
+ message(STATUS "Downloading...\n SRC=${ZIP_URL}\n DEST=${ZIP_DL_PATH}.tmp\n INACTIVITY_TIMEOUT=360s")
+ endif()
+ file(DOWNLOAD ${ZIP_URL} ${ZIP_DL_PATH}.tmp INACTIVITY_TIMEOUT 360 STATUS status SHOW_PROGRESS)
+
+ list(GET status 0 numResult)
+ if(${numResult} MATCHES "0")
+
+ if(dwnlezf_VERBOSE)
+ message(STATUS "Download succeeded; renaming the tmp file before unzipping it")
+ endif()
+ execute_process(COMMAND ${CMAKE_COMMAND} -E rename ${ZIP_DL_PATH}.tmp ${ZIP_DL_PATH})
+ unzip(${ZIP_DL_PATH})
+ writeDirtyUrl()
+
+ else()
+
+ list(GET status 1 errMsg)
+ message(WARNING "DOWNLOAD ${ZIP_URL} to ${ZIP_DL_PATH} failed\n:${errMsg}")
+ message(WARNING "You need to download ${ZIP_URL} manually and put it at ${ZIP_DL_PATH}")
+ message("Take a look at the project website to check the available URLs.")
+
+ endif()
+
+ endif()
+
+ ## clean up the tmp downloaded file
+ if(EXISTS "${ZIP_DL_PATH}.tmp")
+ execute_process(COMMAND ${CMAKE_COMMAND} -E remove ${ZIP_DL_PATH}.tmp)
+ endif()
+
+endfunction()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/git_describe.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/git_describe.cmake
new file mode 100644
index 0000000..638d70b
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/git_describe.cmake
@@ -0,0 +1,114 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+if(__git_describe_INCLUDED__)
+ return()
+else()
+ set(__git_describe_INCLUDED__ ON)
+endif()
+
+find_package(Git)
+if(Git_FOUND)
+ message(STATUS "Git found: ${GIT_EXECUTABLE}")
+else()
+ message(FATAL_ERROR "Git not found. Aborting")
+endif()
+
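+## Usage example (a sketch; the MY_* output variable names are illustrative):
+## git_describe(
+##     PATH ${CMAKE_SOURCE_DIR}
+##     GIT_BRANCH MY_BRANCH
+##     GIT_COMMIT_HASH MY_COMMIT
+##     GIT_VERSION MY_VERSION
+## )
+## message(STATUS "Building ${MY_VERSION} (${MY_BRANCH}@${MY_COMMIT})")
+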
+macro(git_describe)
+ cmake_parse_arguments(GIT_DESCRIBE "" "GIT_URL;GIT_BRANCH;GIT_COMMIT_HASH;GIT_TAG;GIT_VERSION;PATH" "" ${ARGN})
+
+ if(NOT GIT_DESCRIBE_PATH)
+ set(GIT_DESCRIBE_PATH ${CMAKE_SOURCE_DIR})
+ endif()
+
+ if(GIT_DESCRIBE_GIT_URL)
+ # Get the current remote
+ execute_process(
+ COMMAND git remote
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE GIT_DESCRIBE_GIT_REMOTE
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+
+ # Get the URL of the current remote
+ execute_process(
+ COMMAND git remote get-url ${GIT_DESCRIBE_GIT_REMOTE}
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_URL}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ endif()
+
+ if(GIT_DESCRIBE_GIT_BRANCH)
+ # Get the current working branch
+ execute_process(
+ COMMAND git rev-parse --abbrev-ref HEAD
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_BRANCH}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ endif()
+
+ if(GIT_DESCRIBE_GIT_COMMIT_HASH)
+ # Get the latest commit hash of the working branch
+ execute_process(
+ COMMAND git rev-parse HEAD
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_COMMIT_HASH}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ endif()
+
+ if(GIT_DESCRIBE_GIT_TAG)
+ # Get the tag
+ execute_process(
+ COMMAND git describe --tags --exact-match
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_TAG}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ endif()
+
+ if(GIT_DESCRIBE_GIT_VERSION)
+ # Get the version from git describe
+ execute_process(
+ COMMAND git describe
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE ${GIT_DESCRIBE_GIT_VERSION}
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+
+ if(${GIT_DESCRIBE_GIT_VERSION} STREQUAL "")
+ execute_process(
+ COMMAND git rev-parse --abbrev-ref HEAD
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE GIT_DESCRIBE_GIT_VERSION_BRANCH
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+ execute_process(
+ COMMAND git log -1 --format=%h
+ WORKING_DIRECTORY ${GIT_DESCRIBE_PATH}
+ OUTPUT_VARIABLE GIT_DESCRIBE_GIT_VERSION_COMMIT
+ OUTPUT_STRIP_TRAILING_WHITESPACE
+ ERROR_QUIET
+ )
+
+ set(${GIT_DESCRIBE_GIT_VERSION} "${GIT_DESCRIBE_GIT_VERSION_BRANCH}-${GIT_DESCRIBE_GIT_VERSION_COMMIT}")
+ endif()
+ endif()
+
+endmacro()
\ No newline at end of file
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/include_once.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/include_once.cmake
new file mode 100644
index 0000000..d28b39c
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/include_once.cmake
@@ -0,0 +1,22 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
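+## Includes a cmake file only once per configure run, keyed on its real path
+## (stored as a global property). Usage sketch (the included path is illustrative):
+## include_once(${CMAKE_SOURCE_DIR}/cmake/git_describe.cmake)
+## include_once(${CMAKE_SOURCE_DIR}/cmake/git_describe.cmake) ## second call is a no-op
+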
+macro(include_once file)
+ get_filename_component(INCLUDE_ONCE_FILEPATH ${file} REALPATH)
+ string(REGEX REPLACE "(\\.|\\/+|\\:|\\\\+)" "_" INCLUDE_ONCE_FILEPATH ${INCLUDE_ONCE_FILEPATH})
+ get_property(INCLUDED_${INCLUDE_ONCE_FILEPATH}_LOCAL GLOBAL PROPERTY INCLUDED_${INCLUDE_ONCE_FILEPATH})
+ if (INCLUDED_${INCLUDE_ONCE_FILEPATH}_LOCAL)
+ return()
+ else()
+ set_property(GLOBAL PROPERTY INCLUDED_${INCLUDE_ONCE_FILEPATH} true)
+
+ include(${file})
+ endif()
+endmacro()
\ No newline at end of file
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/install_runtime.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/install_runtime.cmake
new file mode 100644
index 0000000..3d4b74e
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/install_runtime.cmake
@@ -0,0 +1,880 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+## This file is mainly used to allow runtime installation.
+## It provides some utility cmake functions to ease generic deployment (abstracting common cmake usage)...
+##
+## You cannot run your program automatically from your CMAKE_BINARY_DIR when you build,
+## as it will be missing all its dependencies and resource files...
+## You have to run the install target in order to test your program.
+##
+## The only function/macro you should use inside your sub-CMakeLists.txt (sub-project) is:
+## ******************
+## ibr_install_target macro => see documentation at the end of this file
+## ******************
+## It uses these utility cmake functions to abstract the installation in a uniform way for all sub-projects.
+##
+if(__install_runtime_cmake_INCLUDED__)
+ return()
+else()
+ set(__install_runtime_cmake_INCLUDED__ ON)
+endif()
+
+
+##
+## Writes a resource config file containing additional resource paths
+## (used by the IBR_Common Resource system to load shaders and potentially images, plugins and so on).
+##
+## The ADD option lists all the paths to add to the file (relative paths are interpreted relative to the working dir of the executable).
+## The INSTALL option specifies where we want to install this file.
+##
+## Example usage:
+## resourceFile(ADD "shaders" "${PROJECT_NAME}_rsc" INSTALL bin)
+##
+macro(resourceFile)
+ cmake_parse_arguments(rsc "" "INSTALL;FILE_PATH;CONFIG_TYPE" "ADD" ${ARGN}) ## both args are directory path
+
+ if(rsc_ADD)
+ unset(IBR_RSC_FILE_CONTENT_LIST)
+ if(EXISTS "${rsc_FILE_PATH}")
+ file(READ "${rsc_FILE_PATH}" IBR_RSC_FILE_CONTENT)
+ string(REGEX REPLACE "\n" ";" IBR_RSC_FILE_CONTENT_LIST "${IBR_RSC_FILE_CONTENT}")
+ endif()
+ list(APPEND IBR_RSC_FILE_CONTENT_LIST "${rsc_ADD}")
+ list(REMOVE_DUPLICATES IBR_RSC_FILE_CONTENT_LIST)
+ file(WRITE "${rsc_FILE_PATH}" "")
+ foreach(rscDir ${IBR_RSC_FILE_CONTENT_LIST})
+ file(APPEND "${rsc_FILE_PATH}" "${rscDir}\n")
+ endforeach()
+ unset(rsc_ADD)
+ endif()
+
+ if(rsc_INSTALL)
+ install(FILES ${rsc_FILE_PATH} CONFIGURATIONS ${rsc_CONFIG_TYPE} DESTINATION ${rsc_INSTALL})
+ unset(rsc_INSTALL)
+ endif()
+endmacro()
+
+
+##
+## Install the generated *.pdb file for the current cmake project,
+## assuming the output target name is the cmake project name.
+## This macro is useful for cross-platform multi-config mode.
+##
+## Usage Example:
+##
+## if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+## installPDB(${PROJECT_NAME} ${CMAKE_BUILD_TYPE} RUNTIME_DEST bin ARCHIVE_DEST lib LIBRARY_DEST lib)
+## endif()
+## foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+## installPDB(${PROJECT_NAME} ${CONFIG_TYPES} RUNTIME_DEST bin ARCHIVE_DEST lib LIBRARY_DEST lib)
+## endforeach()
+##
+macro(installPDB targetName configType)
+ cmake_parse_arguments(instpdb "" "COMPONENT" "ARCHIVE_DEST;LIBRARY_DEST;RUNTIME_DEST" ${ARGN}) ## both args are directory path
+
+ if(NOT MSVC)
+ return()
+ endif()
+
+ ## Check which DESTINATION is used according to the TYPE of the given target (see the install command doc for the correspondences)
+ get_target_property(type ${targetName} TYPE)
+ if(${type} MATCHES "EXECUTABLE" AND instpdb_RUNTIME_DEST)
+ set(pdb_DESTINATION ${instpdb_RUNTIME_DEST})
+ elseif(${type} MATCHES "STATIC_LIBRARY" AND instpdb_ARCHIVE_DEST)
+ set(pdb_DESTINATION ${instpdb_ARCHIVE_DEST})
+ elseif(${type} MATCHES "MODULE_LIBRARY" AND instpdb_LIBRARY_DEST)
+ set(pdb_DESTINATION ${instpdb_LIBRARY_DEST})
+ elseif(${type} MATCHES "SHARED_LIBRARY")
+ if(WIN32 AND instpdb_RUNTIME_DEST)
+ set(pdb_DESTINATION ${instpdb_RUNTIME_DEST})
+ else()
+ set(pdb_DESTINATION ${instpdb_LIBRARY_DEST})
+ endif()
+ endif()
+
+ if(NOT pdb_DESTINATION)
+ set(pdb_DESTINATION bin) ## default destination of the pdb file
+ endif()
+
+ if(NOT instpdb_COMPONENT)
+ set(instpdb_COMPONENT )
+ else()
+ set(instpdb_COMPONENT COMPONENT ${instpdb_COMPONENT})
+ endif()
+
+ string(TOUPPER ${configType} CONFIG_TYPES_UC)
+ get_target_property(PDB_PATH ${targetName} PDB_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC})
+
+ get_target_property(confModePostfix ${targetName} ${CONFIG_TYPES_UC}_POSTFIX)
+ if(NOT confModePostfix)
+ set(confModePostfix "")
+ endif()
+ set_target_properties(${targetName} PROPERTIES PDB_NAME_${CONFIG_TYPES_UC} ${targetName}${confModePostfix})
+ get_target_property(PDB_NAME ${targetName} PDB_NAME_${CONFIG_TYPES_UC})# if not set, this is empty
+
+ if(EXISTS "${PDB_PATH}/${PDB_NAME}.pdb")
+ install(FILES "${PDB_PATH}/${PDB_NAME}.pdb" CONFIGURATIONS ${configType} DESTINATION ${pdb_DESTINATION} ${instpdb_COMPONENT} OPTIONAL)
+ endif()
+endmacro()
+
+
+##
+## Add an additional target to install a project independently, based on its component.
+## configMode is used to prevent defaulting to Release installation (we also want to install other build/config types).
+##
+macro(installTargetProject targetOfProject targetOfInstallProject)
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ set(configMode ${CMAKE_BUILD_TYPE})
+ elseif(MSVC)
+ ## $(Configuration) will be one of the following : Debug, Release, MinSizeRel, RelWithDebInfo
+ set(configMode $(Configuration))
+ endif()
+ if(configMode)
+ get_target_property(srcFiles ${targetOfProject} SOURCES)
+ add_custom_target( ${targetOfInstallProject} #ALL
+ ${CMAKE_COMMAND} -DBUILD_TYPE=${configMode} -DCOMPONENT=${targetOfInstallProject} -P ${CMAKE_BINARY_DIR}/cmake_install.cmake
+ DEPENDS ${srcFiles}
+ COMMENT "run the installation only for ${targetOfProject}" VERBATIM
+ )
+ add_dependencies(${targetOfInstallProject} ${targetOfProject})
+
+ get_target_property(INSTALL_BUILD_FOLDER ${targetOfProject} FOLDER)
+ set_target_properties(${targetOfInstallProject} PROPERTIES FOLDER ${INSTALL_BUILD_FOLDER})
+ endif()
+endmacro()
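+
+## Usage sketch (illustrative target names): creates a 'myApp_install' target that runs
+## the install step only for the 'myApp' component in the active configuration:
+## installTargetProject(myApp myApp_install)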
+
+# Collect all currently added targets in all subdirectories
+#
+# Parameters:
+# - _result the list containing all found targets
+# - _dir root directory to start looking from
+function(get_all_targets _result _dir)
+ get_property(_subdirs DIRECTORY "${_dir}" PROPERTY SUBDIRECTORIES)
+ foreach(_subdir IN LISTS _subdirs)
+ get_all_targets(${_result} "${_subdir}")
+ endforeach()
+
+ get_directory_property(_sub_targets DIRECTORY "${_dir}" BUILDSYSTEM_TARGETS)
+ set(${_result} ${${_result}} ${_sub_targets} PARENT_SCOPE)
+endfunction()
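+
+## Usage sketch (illustrative directory): collect every target declared under src/projects
+## get_all_targets(ALL_PROJECT_TARGETS "${CMAKE_SOURCE_DIR}/src/projects")
+## message(STATUS "Targets found: ${ALL_PROJECT_TARGETS}")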
+
+##
+## Add targets for building and installing subdirectories
+macro(subdirectory_target target directory build_folder)
+ add_custom_target(${target}
+ COMMENT "run build for all projects in this directory" VERBATIM
+ )
+ get_all_targets(ALL_TARGETS ${directory})
+ add_dependencies(${target} ${ALL_TARGETS})
+ add_custom_target(${target}_install
+ ${CMAKE_COMMAND} -DBUILD_TYPE=$ -DCOMPONENT=${target}_install -P ${CMAKE_BINARY_DIR}/cmake_install.cmake
+ COMMENT "run install for all projects in this directory" VERBATIM
+ )
+ add_dependencies(${target}_install ${target})
+
+ set_target_properties(${target} PROPERTIES FOLDER ${build_folder})
+ set_target_properties(${target}_install PROPERTIES FOLDER ${build_folder})
+endmacro()
+
+
+## CMake-install all required dependencies for an application (including OS system files like msvc*.dll, for example)
+##
+## install_runtime(
+## [TARGET name]
+## [PLUGINS name [nameN ...] [PLUGIN_PATH_NAME currentPathName [FROM_REL_PATH matchDirFromCurrentPathName] [PLUGIN_PATH_DEST installDir] ]
+## [PLUGINS ...]
+## [DIRS path [pathN ...] ]
+## [TARGET_LIBRARIES filePath [filePathN ...] ]
+## [TARGET_PACKAGES packageName [packageNameN ...] ]
+## [COMPONENT installComponentName]
+## [PLAUSIBLES_POSTFIX Debug_postfix [MinSizeRel_postfix relWithDebInfo_postfix ...] ]
+## [VERBOSE]
+## )
+##
+## installedFilePathTargetAppToResolve : the final installed targetApp absolute full file path name you want to resolve
+##
+## TARGET : The target app we want to install. If given, it is used to look up link library paths (best choice; strongly advised to use it)
+##
+## PLUGINS : Some applications use/load plugins which can't be detected inside their binary,
+## so here you can specify which plugins the application uses/loads in order to install them
+## and also resolve their dependencies.
+## With PLUGINS, multiple flags:
+## PLUGIN_PATH_NAME : The full file path of the plugin we want to install
+## FROM_REL_PATH : [optional: by default only the file is kept] From which matching dir of the plugin path to install (keeps the directory structure)
+## PLUGIN_PATH_DEST : [optional: default is relative to the executable directory] Where (full path to the install directory) the plugin file (or file path) will be installed
+##
+## DIRS : A list of directories to looking for dependencies
+## TARGET_LIBRARIES : DEPRECATED (use TARGET flag instead) : The cmake content variables used for the target_link_libraries( ...)
+## TARGET_PACKAGES : DEPRECATED (use TARGET flag instead) : The cmake package names used for the findPackage(...) for your targetApp
+## ADVICE: This flag adds entries to the cache (like: <packageName>_DIR); it can be useful to fill in these variables!
+## COMPONENT : (defaults to runtime) The component name associated with the installation
+## It is used when you want to install some parts of your projects separately (see the cmake install doc)
+## VERBOSE : For debugging, or to get more information in the output console
+##
+## Usage:
+## install_runtime(${CMAKE_INSTALL_PREFIX}/${EXECUTABLE_NAME}${CMAKE_EXECUTABLE_SUFFIX}
+## VERBOSE
+## TARGET ${PROJECT_NAME}
+## PLAUSIBLES_POSTFIX _d
+## PLUGINS
+## PLUGIN_PATH_NAME ${PLUGIN_PATH_NAME}${CMAKE_SHARED_MODULE_SUFFIX} ## will be installed (default exec path if no PLUGINS_DEST) and then will be resolved
+## FROM_REL_PATH plugins ## optional, used especially for keeping qt plugins tree structure
+## PLUGIN_PATH_DEST ${CMAKE_INSTALL_PREFIX}/plugins ## (or relative path 'plugins' will be interpreted relative to installed executable)
+## DIRS ${CMAKE_CURRENT_BINARY_DIR} ${CMAKE_BINARY_DIR}
+## TARGET_LIBRARIES ${OPENGL_LIBRARIES} ## DEPRECATED (use TARGET flag instead)
+## ${GLEW_LIBRARIES}
+## ${GLUT_LIBRARIES}
+## ${Boost_LIBRARIES}
+## ${SuiteSparse_LIBRARIES}
+## ${CGAL_LIBRARIES}
+## TARGET_PACKAGES OPENGL ## DEPRECATED (use TARGET flag instead)
+## GLEW
+## GLUT
+## CGAL
+## Boost
+## SuiteSparse
+## )
+##
+## For the plugins part, it uses our internal parse_arguments_multi.cmake
+##
+function(install_runtime installedFilePathTargetAppToResolve)
+ set(optionsArgs "VERBOSE")
+ set(oneValueArgs "COMPONENT;INSTALL_FOLDER;CONFIG_TYPE")
+ set(multiValueArgs "DIRS;PLUGINS;TARGET_LIBRARIES;TARGET_PACKAGES;TARGET;PLAUSIBLES_POSTFIX")
+ cmake_parse_arguments(inst_run "${optionsArgs}" "${oneValueArgs}" "${multiValueArgs}" ${ARGN} )
+
+ if(IS_ABSOLUTE ${installedFilePathTargetAppToResolve})
+ else()
+ set(installedFilePathTargetAppToResolve ${inst_run_INSTALL_FOLDER}/${installedFilePathTargetAppToResolve})
+ endif()
+
+ get_filename_component(EXEC_NAME ${installedFilePathTargetAppToResolve} NAME_WE)
+ get_filename_component(EXEC_PATH ${installedFilePathTargetAppToResolve} PATH)
+
+ if(NOT inst_run_COMPONENT)
+ set(inst_run_COMPONENT runtime)
+ endif()
+
+
+ ## Try to append as many paths as possible in which to find dependencies (deprecated, since we can use target properties to get the paths back)
+ set(libPaths )
+ foreach(libraryFileName ${inst_run_TARGET_LIBRARIES})
+ if(IS_DIRECTORY "${libraryFileName}")
+ list(APPEND libPaths "${libraryFileName}")
+ else()
+ get_filename_component(libpath "${libraryFileName}" PATH)
+ if(EXISTS "${libpath}")
+ list(APPEND libPaths "${libpath}")
+ endif()
+ endif()
+ endforeach()
+
+ ## This macro is used internally to recursively get the paths of the LINK_LIBRARIES of each non-imported target.
+ ## Typically, if you have two internal dependencies between cmake targets, we want cmake to be able to get back the paths where these dependencies live.
+ macro(recurseDepList target)
+ get_target_property(linkLibs ${target} LINK_LIBRARIES)
+ foreach(lib ${linkLibs})
+ string(FIND ${lib} ">" strId) ## is cmake using a generator expression?
+ if(TARGET ${lib})
+ ## Skipping interface libraries as they're system ones
+ get_target_property(type ${lib} TYPE)
+ get_target_property(imported ${lib} IMPORTED)
+ if(type STREQUAL "INTERFACE_LIBRARY")
+ get_target_property(imp_loc ${lib} INTERFACE_IMPORTED_LOCATION)
+ if(imp_loc)
+ get_filename_component(imp_loc ${imp_loc} PATH)
+ list(APPEND targetLibPath ${imp_loc})
+ endif()
+ get_target_property(loc ${lib} INTERFACE_LOCATION)
+ if(loc)
+ get_filename_component(loc ${loc} PATH)
+ list(APPEND targetLibPath ${loc})
+ endif()
+ ## it's not a path but a single target name
+ ## for build targets which are part of the current cmake configuration: nothing to do, as cmake already knows the output path
+ ## for imported targets, we need to look up their imported location
+ elseif(imported)
+ get_target_property(imp_loc ${lib} IMPORTED_LOCATION)
+ if(imp_loc)
+ get_filename_component(imp_loc ${imp_loc} PATH)
+ list(APPEND targetLibPath ${imp_loc})
+ endif()
+ get_target_property(loc ${lib} LOCATION)
+ if(loc)
+ get_filename_component(loc ${loc} PATH)
+ list(APPEND targetLibPath ${loc})
+ endif()
+ else()
+ recurseDepList(${lib})
+ endif()
+ elseif(NOT ${strId} MATCHES -1) ## means cmake uses a generator expression (CMAKE VERSION > 3.0)
+ string(REGEX MATCH ">:[@A-Za-z_:/.0-9-]+" targetLibPath ${lib})
+ string(REGEX REPLACE ">:([@A-Za-z_:/.0-9-]+)" "\\1" targetLibPath ${targetLibPath})
+ get_filename_component(targetLibPath ${targetLibPath} PATH)
+ elseif(EXISTS ${lib})
+ set(targetLibPath ${lib})
+ get_filename_component(targetLibPath ${targetLibPath} PATH)
+ else()
+ #message(STATUS "[install_runtime] skip link library : ${lib} , of target ${target}")
+ endif()
+ if(targetLibPath)
+ list(APPEND targetLinkLibsPathList ${targetLibPath})
+ endif()
+ endforeach()
+ if(targetLinkLibsPathList)
+ list(REMOVE_DUPLICATES targetLinkLibsPathList)
+ endif()
+ endmacro()
+ if(inst_run_TARGET)
+ recurseDepList(${inst_run_TARGET})
+ if(targetLinkLibsPathList)
+ list(APPEND libPaths ${targetLinkLibsPathList})
+ endif()
+ endif()
+
+ if(libPaths)
+ list(REMOVE_DUPLICATES libPaths)
+ foreach(libPath ${libPaths})
+ get_filename_component(path ${libPath} PATH)
+ list(APPEND libPaths ${path})
+ endforeach()
+ endif()
+
+
+ ## possible special dir(s) according to the build system and OS
+ if(CMAKE_SIZEOF_VOID_P EQUAL 8)
+ set(BUILD_TYPES_FOR_DLL "x64")
+ if(WIN32)
+ list(APPEND BUILD_TYPES_FOR_DLL "Win64")
+ endif()
+ else()
+ set(BUILD_TYPES_FOR_DLL "x86")
+ if(WIN32)
+ list(APPEND BUILD_TYPES_FOR_DLL "Win32")
+ endif()
+ endif()
+
+
+ ## Try to append as many paths as possible in which to find dependencies (here, mainly for *.dll)
+ foreach(dir ${inst_run_DIRS} ${libPaths})
+ if(EXISTS "${dir}/bin")
+ list(APPEND inst_run_DIRS "${dir}/bin")
+ elseif(EXISTS "${dir}")
+ list(APPEND inst_run_DIRS "${dir}")
+ endif()
+ endforeach()
+ list(REMOVE_DUPLICATES inst_run_DIRS)
+ foreach(dir ${inst_run_DIRS})
+ if(EXISTS "${dir}")
+ list(APPEND argDirs ${dir})
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${dir}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND argDirs "${dir}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ foreach(OUTPUTCONFIG ${CMAKE_CONFIGURATION_TYPES}) ## for windows multi-generator (MSVC)
+ if(EXISTS "${dir}/${BUILD_TYPE_FOR_DLL}/${OUTPUTCONFIG}")
+ list(APPEND argDirs "${dir}/${BUILD_TYPE_FOR_DLL}/${OUTPUTCONFIG}")
+ endif()
+ endforeach()
+ if(CMAKE_BUILD_TYPE) ## for single generator (makefiles)
+ if(EXISTS "${dir}/${BUILD_TYPE_FOR_DLL}/${CMAKE_BUILD_TYPE}")
+ list(APPEND argDirs "${dir}/${BUILD_TYPE_FOR_DLL}/${CMAKE_BUILD_TYPE}")
+ endif()
+ endif()
+ endforeach()
+ foreach(OUTPUTCONFIG ${CMAKE_CONFIGURATION_TYPES}) ## for windows multi-generator (MSVC)
+ if(EXISTS "${dir}/${OUTPUTCONFIG}")
+ list(APPEND argDirs "${dir}/${OUTPUTCONFIG}")
+ endif()
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${dir}/${OUTPUTCONFIG}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND argDirs "${dir}/${OUTPUTCONFIG}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ endforeach()
+ endforeach()
+ if(CMAKE_BUILD_TYPE) ## for single generator (makefiles)
+ if(EXISTS "${dir}/${CMAKE_BUILD_TYPE}")
+ list(APPEND argDirs "${dir}/${CMAKE_BUILD_TYPE}")
+ endif()
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${dir}/${CMAKE_BUILD_TYPE}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND argDirs "${dir}/${CMAKE_BUILD_TYPE}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ endforeach()
+ endif()
+ endif()
+ endforeach()
+ if(argDirs)
+ list(REMOVE_DUPLICATES argDirs)
+ endif()
+
+
+ ## Try to append as many paths as possible in which to find dependencies (here, mainly for *.dll)
+ foreach(packageName ${inst_run_TARGET_PACKAGES})
+ if(EXISTS "${${packageName}_DIR}")
+ list(APPEND packageDirs ${${packageName}_DIR})
+ list(APPEND packageDirs ${${packageName}_DIR}/bin)
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ foreach(OUTPUTCONFIG ${CMAKE_CONFIGURATION_TYPES}) ## for windows multi-generator (MSVC)
+ if(EXISTS "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}/${OUTPUTCONFIG}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}/${OUTPUTCONFIG}")
+ endif()
+ endforeach()
+ if(CMAKE_BUILD_TYPE) ## for single generator (makefiles)
+ if(EXISTS "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}/${CMAKE_BUILD_TYPE}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${BUILD_TYPE_FOR_DLL}/${CMAKE_BUILD_TYPE}")
+ endif()
+ endif()
+ endforeach()
+ foreach(OUTPUTCONFIG ${CMAKE_CONFIGURATION_TYPES}) ## for windows multi-generator (MSVC)
+ if(EXISTS "${${packageName}_DIR}/bin/${OUTPUTCONFIG}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${OUTPUTCONFIG}")
+ endif()
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${${packageName}_DIR}/bin/${OUTPUTCONFIG}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${OUTPUTCONFIG}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ endforeach()
+ endforeach()
+ if(CMAKE_BUILD_TYPE) ## for single generator (makefiles)
+ if(EXISTS "${${packageName}_DIR}/bin/${CMAKE_BUILD_TYPE}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${CMAKE_BUILD_TYPE}")
+ endif()
+ foreach(BUILD_TYPE_FOR_DLL ${BUILD_TYPES_FOR_DLL})
+ if(EXISTS "${${packageName}_DIR}/bin/${CMAKE_BUILD_TYPE}/${BUILD_TYPE_FOR_DLL}")
+ list(APPEND packageDirs "${${packageName}_DIR}/bin/${CMAKE_BUILD_TYPE}/${BUILD_TYPE_FOR_DLL}")
+ endif()
+ endforeach()
+ endif()
+ else()
+ set(${packageName}_DIR "$ENV{${packageName}_DIR}" CACHE PATH "${packageName}_DIR root directory used to look for dirs containing *.dll files")
+ endif()
+ endforeach()
+ if(packageDirs)
+ list(REMOVE_DUPLICATES packageDirs)
+ endif()
+
+
+ set(dirsToLookFor "${EXEC_PATH}")
+ if(packageDirs)
+ list(APPEND dirsToLookFor ${packageDirs})
+ endif()
+ if(argDirs)
+ list(APPEND dirsToLookFor ${argDirs})
+ endif()
+ get_property(used_LINK_DIRECTORIES DIRECTORY PROPERTY LINK_DIRECTORIES)
+ if (used_LINK_DIRECTORIES)
+ list(APPEND dirsToLookFor ${used_LINK_DIRECTORIES})
+ list(REMOVE_DUPLICATES dirsToLookFor)
+ endif()
+
+
+ ## handle plugins
+ set(pluginsList "")
+ include(parse_arguments_multi) ## this function will process recursively items of the sub-list [default print messages]
+ function(parse_arguments_multi_function results)
+ cmake_parse_arguments(pamf "VERBOSE" "PLUGIN_PATH_DEST;FROM_REL_PATH;EXEC_PATH;COMPONENT" "" ${ARGN}) ## EXEC_PATH and COMPONENT are for exclusive internal use
+ list(REMOVE_DUPLICATES pamf_UNPARSED_ARGUMENTS)
+ foreach(PLUGIN_PATH_NAME ${pamf_UNPARSED_ARGUMENTS})
+ if(EXISTS ${PLUGIN_PATH_NAME})
+ if(IS_DIRECTORY ${PLUGIN_PATH_NAME})
+ if(pamf_VERBOSE)
+ message(WARNING "${PLUGIN_PATH_NAME} is a directory: a directory cannot be installed, please provide a file path")
+ endif()
+ else()
+ if(NOT pamf_PLUGIN_PATH_DEST)
+ set(PLUGIN_PATH_DEST ${pamf_EXEC_PATH}) ## the default dest value
+ else()
+ set(PLUGIN_PATH_DEST ${pamf_PLUGIN_PATH_DEST})
+ endif()
+
+ if(pamf_FROM_REL_PATH)
+ file(TO_CMAKE_PATH ${PLUGIN_PATH_NAME} PLUGIN_PATH_NAME)
+ get_filename_component(PLUGIN_PATH ${PLUGIN_PATH_NAME} PATH)
+ unset(PLUGIN_PATH_LIST)
+ unset(PLUGIN_PATH_LIST_COUNT)
+ unset(PLUGIN_REL_PATH_LIST)
+ unset(PLUGIN_REL_PATH)
+ string(REPLACE "/" ";" PLUGIN_PATH_LIST ${PLUGIN_PATH}) ## create a list of dir
+ list(FIND PLUGIN_PATH_LIST ${pamf_FROM_REL_PATH} id)
+ list(LENGTH PLUGIN_PATH_LIST PLUGIN_PATH_LIST_COUNT)
+ if(${id} GREATER 0)
+ math(EXPR id "${id}+1") ## do not include the matched relative path component itself
+ math(EXPR PLUGIN_PATH_LIST_COUNT "${PLUGIN_PATH_LIST_COUNT}-1") ## the end of the list
+ foreach(i RANGE ${id} ${PLUGIN_PATH_LIST_COUNT})
+ list(GET PLUGIN_PATH_LIST ${i} out)
+ list(APPEND PLUGIN_REL_PATH_LIST ${out})
+ endforeach()
+ foreach(dir ${PLUGIN_REL_PATH_LIST})
+ set(PLUGIN_REL_PATH "${PLUGIN_REL_PATH}/${dir}")
+ endforeach()
+ endif()
+ set(PLUGIN_PATH_DEST ${PLUGIN_PATH_DEST}${PLUGIN_REL_PATH})
+ endif()
+
+ install(FILES ${PLUGIN_PATH_NAME} CONFIGURATIONS ${inst_run_CONFIG_TYPE} DESTINATION ${PLUGIN_PATH_DEST} COMPONENT ${pamf_COMPONENT})
+ get_filename_component(pluginName ${PLUGIN_PATH_NAME} NAME)
+ if(NOT IS_ABSOLUTE ${PLUGIN_PATH_DEST})
+ set(PLUGIN_PATH_DEST ${inst_run_INSTALL_FOLDER}/${PLUGIN_PATH_DEST})
+ endif()
+ list(APPEND pluginsList ${PLUGIN_PATH_DEST}/${pluginName})
+ endif()
+ else()
+ message(WARNING "You need to provide a valid PLUGIN_PATH_NAME")
+ set(pluginsList )
+ endif()
+ endforeach()
+ set(${results} ${pluginsList} PARENT_SCOPE)
+ endfunction()
+
+ if(inst_run_VERBOSE)
+ list(APPEND extra_flags_to_add VERBOSE)
+ endif()
+ list(APPEND extra_flags_to_add EXEC_PATH ${EXEC_PATH} COMPONENT ${inst_run_COMPONENT}) ## for internal use inside overloaded function
+ list(LENGTH inst_run_PLUGINS inst_run_PLUGINS_count)
+ if(${inst_run_PLUGINS_count} GREATER 0)
+ parse_arguments_multi(PLUGIN_PATH_NAME inst_run_PLUGINS ${inst_run_PLUGINS} ## see internal overload parse_arguments_multi_function for processing each sub-list
+ NEED_RESULTS ${inst_run_PLUGINS_count} ## this is used to check when we are in the first loop (in order to reset parse_arguments_multi_results)
+ EXTRAS_FLAGS ${extra_flags_to_add} ## this is used to allow catching additional internal flags of our overloaded function
+ )
+ endif()
+
+ #message(parse_arguments_multi_results = ${parse_arguments_multi_results})
+ list(APPEND pluginsList ${parse_arguments_multi_results})
+
+
+
+ ## Install rules for required system runtimes such as MSVCRxx.dll
+ set(CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS_SKIP ON)
+ include(InstallRequiredSystemLibraries)
+ if(CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS)
+ install(FILES ${CMAKE_INSTALL_SYSTEM_RUNTIME_LIBS}
+ CONFIGURATIONS ${inst_run_CONFIG_TYPE}
+ DESTINATION ${EXEC_PATH}
+ COMPONENT ${inst_run_COMPONENT}
+ )
+ endif()
+
+ ## print what we are about to do
+ if(inst_run_VERBOSE)
+ message(STATUS "[install_runtime] On install target call, CMake will try to resolve dependencies for the given app:\n ${installedFilePathTargetAppToResolve} (with plausible postfix: ${inst_run_PLAUSIBLES_POSTFIX})")
+ if(pluginsList)
+ message(STATUS " and also for plugins:")
+ foreach(plugin ${pluginsList})
+ message(STATUS " ${plugin}")
+ endforeach()
+ endif()
+ message(STATUS " Looking for dependencies into:")
+ foreach(dir ${dirsToLookFor})
+ message(STATUS " ${dir}")
+ endforeach()
+ endif()
+
+ ## Install rules for required dependencies libs/plugins for the target app
+ ## will resolve all installed target files with config modes postfixes
+ string(TOUPPER ${inst_run_CONFIG_TYPE} inst_run_CONFIG_TYPE_UC)
+ get_target_property(postfix ${inst_run_TARGET} "${inst_run_CONFIG_TYPE_UC}_POSTFIX")
+ install(CODE "set(target \"${inst_run_TARGET}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE "set(inst_run_CONFIG_TYPE \"${inst_run_CONFIG_TYPE}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE "set(inst_run_INSTALL_FOLDER \"${inst_run_INSTALL_FOLDER}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE "set(app \"${EXEC_PATH}/${EXEC_NAME}${postfix}${CMAKE_EXECUTABLE_SUFFIX}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE "set(dirsToLookFor \"${dirsToLookFor}\")" COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE})
+ install(CODE
+ [[
+ if("${CMAKE_INSTALL_CONFIG_NAME}" STREQUAL "${inst_run_CONFIG_TYPE}")
+ message(STATUS "Installing ${target} dependencies...")
+
+ file(GET_RUNTIME_DEPENDENCIES
+ EXECUTABLES ${app}
+ RESOLVED_DEPENDENCIES_VAR _r_deps
+ UNRESOLVED_DEPENDENCIES_VAR _u_deps
+ CONFLICTING_DEPENDENCIES_PREFIX _c_deps
+ DIRECTORIES ${dirsToLookFor}
+ PRE_EXCLUDE_REGEXES "api-ms-*"
+ POST_EXCLUDE_REGEXES ".*system32/.*\\.dll" ".*SysWOW64/.*\\.dll"
+ )
+
+ if(_u_deps)
+ message(WARNING "There were unresolved dependencies for executable ${app}: \"${_u_deps}\"!")
+ endif()
+ if(_c_deps_FILENAMES)
+ message(WARNING "There were conflicting dependencies for executable ${app}: \"${_c_deps_FILENAMES}\"!")
+ endif()
+
+ foreach(_file ${_r_deps})
+ file(INSTALL
+ DESTINATION "${inst_run_INSTALL_FOLDER}/bin"
+ TYPE SHARED_LIBRARY
+ FOLLOW_SYMLINK_CHAIN
+ FILES "${_file}"
+ )
+ endforeach()
+ endif()
+ ]]
+ COMPONENT ${inst_run_COMPONENT} CONFIGURATIONS ${inst_run_CONFIG_TYPE}
+ )
+
+
+endfunction()
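+
+## Hypothetical usage sketch of install_runtime (target and paths are illustrative, not from this repo):
+## install_runtime(bin/myApp${CMAKE_EXECUTABLE_SUFFIX} ## relative to CMAKE_INSTALL_PREFIX
+## INSTALL_FOLDER "${CMAKE_INSTALL_PREFIX}"
+## CONFIG_TYPE Release
+## TARGET myApp
+## PLUGINS PLUGIN_PATH_NAME "${myPluginDll}" PLUGIN_PATH_DEST bin/plugins
+## DIRS "${CMAKE_RUNTIME_OUTPUT_DIRECTORY}"
+## )
+## On install, file(GET_RUNTIME_DEPENDENCIES) then resolves the app's DLLs against DIRS,
+## the TARGET_PACKAGES dirs and LINK_DIRECTORIES, and copies them into <INSTALL_FOLDER>/bin.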
+
+## High level macro to install resources in the correct folder
+##
+## EXECUTABLE: [opt] option to install files as programs (with executable permission)
+## RELATIVE : [opt] copy files relative to the current folder
+## TYPE : [opt] type and folder where to store the files
+## FOLDER : [opt] subfolder to use
+## FILES : [opt] a list of resource files to copy to the install folder
+macro(ibr_install_rsc target)
+ cmake_parse_arguments(install_rsc_${target} "EXECUTABLE;RELATIVE" "TYPE;FOLDER" "FILES" ${ARGN})
+ set(rsc_target "${target}_${install_rsc_${target}_TYPE}")
+
+ if(install_rsc_${target}_FOLDER)
+ set(rsc_folder "${install_rsc_${target}_TYPE}/${install_rsc_${target}_FOLDER}")
+ else()
+ set(rsc_folder "${install_rsc_${target}_TYPE}")
+ endif()
+
+ add_custom_target(${rsc_target}
+ COMMENT "run the ${install_rsc_${target}_TYPE} installation only for ${target} (component ${rsc_target})"
+ VERBATIM)
+ foreach(scriptFile ${install_rsc_${target}_FILES})
+ if(install_rsc_${target}_RELATIVE)
+ file(RELATIVE_PATH relativeFilename ${CMAKE_CURRENT_SOURCE_DIR} ${scriptFile})
+ else()
+ get_filename_component(relativeFilename ${scriptFile} NAME)
+ endif()
+
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ add_custom_command(TARGET ${rsc_target} POST_BUILD
+ COMMAND ${CMAKE_COMMAND} -E
+ copy_if_different ${scriptFile} ${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/${rsc_folder}/${relativeFilename})
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ add_custom_command(TARGET ${rsc_target} POST_BUILD
+ COMMAND ${CMAKE_COMMAND} -E
+ copy_if_different ${scriptFile} ${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/${rsc_folder}/${relativeFilename})
+ endforeach()
+ endforeach()
+
+ get_target_property(INSTALL_RSC_BUILD_FOLDER ${target} FOLDER)
+ set_target_properties(${rsc_target} PROPERTIES FOLDER ${INSTALL_RSC_BUILD_FOLDER})
+
+ add_dependencies(${target} ${rsc_target})
+ add_dependencies(PREBUILD ${rsc_target})
+
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ resourceFile(ADD ${rsc_folder} CONFIG_TYPE ${CMAKE_BUILD_TYPE} FILE_PATH "${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/ibr_resources.ini")
+
+ if(install_rsc_${target}_EXECUTABLE)
+ install(
+ PROGRAMS ${install_rsc_${target}_FILES}
+ CONFIGURATIONS ${CMAKE_BUILD_TYPE}
+ DESTINATION "${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/${rsc_folder}"
+ )
+ else()
+ install(
+ FILES ${install_rsc_${target}_FILES}
+ CONFIGURATIONS ${CMAKE_BUILD_TYPE}
+ DESTINATION "${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}/${rsc_folder}"
+ )
+ endif()
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ resourceFile(ADD ${rsc_folder} CONFIG_TYPE ${CONFIG_TYPES} FILE_PATH "${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/ibr_resources.ini")
+
+ if(install_rsc_${target}_EXECUTABLE)
+ install(
+ PROGRAMS ${install_rsc_${target}_FILES}
+ CONFIGURATIONS ${CONFIG_TYPES}
+ DESTINATION "${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/${rsc_folder}"
+ )
+ else()
+ install(
+ FILES ${install_rsc_${target}_FILES}
+ CONFIGURATIONS ${CONFIG_TYPES}
+ DESTINATION "${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}/${rsc_folder}"
+ )
+ endif()
+ endforeach()
+endmacro()
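+
+## Hypothetical usage sketch of ibr_install_rsc (names are illustrative, not from this repo):
+## ibr_install_rsc(myApp EXECUTABLE TYPE "scripts" FOLDER "myApp" FILES "${myScriptFiles}")
+## => creates a custom target myApp_scripts that copies each file of ${myScriptFiles} at post-build
+## and installs them as programs into <install_prefix>/scripts/myApp for each configuration.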
+
+
+## High level macro to install all our ibr targets in a homogeneous way (it uses some functions from this file)
+##
+## RSC_FILE_ADD : [opt] is used to auto write/append relative paths of target resources into a common file
+## INSTALL_PDB : [opt] is used to auto install the PDB file (when using MSVC, according to the target type)
+## STANDALONE : [opt] bool ON/OFF var to call install_runtime or not (for bundle resolution)
+## DIRS : [opt] used if STANDALONE set to ON, see install_runtime doc
+## PLUGINS: [opt] used if STANDALONE set to ON, see install_runtime doc
+## MSVC_CMD : [opt] used to specify an absolute filePathName application to launch with the MSVC IDE Debugger associated to this target (project file)
+## MSVC_ARGS : [opt] load the MSVC debugger with correct settings (app path, args, working dir)
+##
+macro(ibr_install_target target)
+ cmake_parse_arguments(ibrInst${target} "VERBOSE;INSTALL_PDB" "COMPONENT;MSVC_CMD;MSVC_ARGS;STANDALONE;RSC_FOLDER" "SHADERS;RESOURCES;SCRIPTS;DIRS;PLUGINS" ${ARGN})
+
+ if(ibrInst${target}_RSC_FOLDER)
+ set(rsc_folder "${ibrInst${target}_RSC_FOLDER}")
+ else()
+ set(rsc_folder "${target}")
+ endif()
+
+ if(ibrInst${target}_SHADERS)
+ ibr_install_rsc(${target} EXECUTABLE TYPE "shaders" FOLDER ${rsc_folder} FILES "${ibrInst${target}_SHADERS}")
+ endif()
+
+ if(ibrInst${target}_RESOURCES)
+ ibr_install_rsc(${target} TYPE "resources" FOLDER ${rsc_folder} FILES "${ibrInst${target}_RESOURCES}")
+ endif()
+
+ if(ibrInst${target}_SCRIPTS)
+ ibr_install_rsc(${target} EXECUTABLE TYPE "scripts" FOLDER ${rsc_folder} FILES "${ibrInst${target}_SCRIPTS}")
+ endif()
+
+ if(ibrInst${target}_COMPONENT)
+ set(installCompArg COMPONENT ${ibrInst${target}_COMPONENT})
+ ## Create a custom install target based on COMPONENT
+ installTargetProject(${target} ${ibrInst${target}_COMPONENT})
+ endif()
+
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ string(TOUPPER ${CMAKE_BUILD_TYPE} CONFIG_TYPES_UC)
+ set_target_properties(${target} PROPERTIES ${CONFIG_TYPES_UC}_POSTFIX "${CMAKE_${CONFIG_TYPES_UC}_POSTFIX}")
+ get_target_property(CURRENT_TARGET_BUILD_TYPE_POSTFIX ${target} ${CONFIG_TYPES_UC}_POSTFIX)
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ set_target_properties(${target} PROPERTIES ${CONFIG_TYPES_UC}_POSTFIX "${CMAKE_${CONFIG_TYPES_UC}_POSTFIX}")
+ get_target_property(CURRENT_TARGET_BUILD_TYPE_POSTFIX ${target} ${CONFIG_TYPES_UC}_POSTFIX)
+ endforeach()
+
+ ## Specify default installation rules
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ string(TOUPPER ${CMAKE_BUILD_TYPE} CONFIG_TYPES_UC)
+ install(TARGETS ${target}
+ CONFIGURATIONS ${CMAKE_BUILD_TYPE}
+ LIBRARY DESTINATION ${CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}} ${installCompArg}
+ ARCHIVE DESTINATION ${CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}} ${installCompArg}
+ RUNTIME DESTINATION ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}} ${installCompArg}
+ )
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ install(TARGETS ${target}
+ CONFIGURATIONS ${CONFIG_TYPES}
+ LIBRARY DESTINATION ${CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}} ${installCompArg}
+ ARCHIVE DESTINATION ${CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}} ${installCompArg}
+ RUNTIME DESTINATION ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}} ${installCompArg}
+ )
+ endforeach()
+
+ if(ibrInst${target}_INSTALL_PDB)
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ installPDB(${target} ${CMAKE_BUILD_TYPE}
+ LIBRARY_DEST ${CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}}
+ ARCHIVE_DEST ${CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}}
+ RUNTIME_DEST ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}}
+ )
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ installPDB(${target} ${CONFIG_TYPES}
+ LIBRARY_DEST ${CMAKE_LIBRARY_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}}
+ ARCHIVE_DEST ${CMAKE_ARCHIVE_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}}
+ RUNTIME_DEST ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}}
+ )
+ endforeach()
+ endif()
+
+ ## install dynamic necessary dependencies
+ if(ibrInst${target}_STANDALONE)
+ get_target_property(type ${target} TYPE)
+ if(${type} MATCHES "EXECUTABLE")
+
+ if(ibrInst${target}_VERBOSE)
+ set(VERBOSE VERBOSE)
+ else()
+ set(VERBOSE )
+ endif()
+
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ install_runtime(bin/${target}${CMAKE_EXECUTABLE_SUFFIX} ## default relative to CMAKE_INSTALL_PREFIX
+ INSTALL_FOLDER "${CMAKE_INSTALL_PREFIX_${CMAKE_BUILD_TYPE}}"
+ CONFIG_TYPE ${CMAKE_BUILD_TYPE}
+ ${VERBOSE}
+ TARGET ${target}
+ ${installCompArg}
+ PLUGINS ## will be installed
+ ${ibrInst${target}_PLUGINS}
+ DIRS ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CMAKE_BUILD_TYPE}}
+ ${ibrInst${target}_DIRS}
+ )
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ install_runtime(bin/${target}${CMAKE_EXECUTABLE_SUFFIX} ## default relative to CMAKE_INSTALL_PREFIX
+ INSTALL_FOLDER "${CMAKE_INSTALL_PREFIX_${CONFIG_TYPES_UC}}"
+ CONFIG_TYPE ${CONFIG_TYPES}
+ ${VERBOSE}
+ TARGET ${target}
+ ${installCompArg}
+ PLUGINS ## will be installed
+ ${ibrInst${target}_PLUGINS}
+ DIRS ${CMAKE_RUNTIME_OUTPUT_DIRECTORY_${CONFIG_TYPES_UC}}
+ ${ibrInst${target}_DIRS}
+ )
+ endforeach()
+ else()
+ message(WARNING "The STANDALONE option is only compatible with EXECUTABLE targets. Skipping the STANDALONE installation process.")
+ endif()
+ endif()
+
+ ## Provide a way to directly load the MSVC debugger with correct settings
+ if(MSVC)
+ if(ibrInst${target}_MSVC_CMD) ## the command's absolute filePathName is optional, as the default is to use the installed target application file
+ set(msvcCmdArg COMMAND ${ibrInst${target}_MSVC_CMD}) ## flag followed by its value (both passed to the MSVCsetUserCommand function)
+ endif()
+ if(ibrInst${target}_MSVC_ARGS) ## args (between quotes) are optional
+ set(msvcArgsArg ARGS ${ibrInst${target}_MSVC_ARGS}) ## flag followed by its value (both passed to the MSVCsetUserCommand function)
+ endif()
+ get_target_property(type ${target} TYPE)
+ if( (ibrInst${target}_MSVC_CMD OR ibrInst${target}_MSVC_ARGS) OR (${type} MATCHES "EXECUTABLE") )
+ include(MSVCsetUserCommand)
+ if(DEFINED CMAKE_BUILD_TYPE) ## for make/nmake based
+ MSVCsetUserCommand( ${target}
+ PATH ${CMAKE_OUTPUT_BIN_${CMAKE_BUILD_TYPE}} ## FILE option not necessary since it is deduced from the target name
+ ARGS "${SIBR_PROGRAMARGS}"
+ ${msvcCmdArg}
+ #${msvcArgsArg}
+ WORKING_DIR ${CMAKE_OUTPUT_BIN_${CMAKE_BUILD_TYPE}}
+ )
+ endif()
+ foreach(CONFIG_TYPES ${CMAKE_CONFIGURATION_TYPES}) ## for multi config types (MSVC based)
+ string(TOUPPER ${CONFIG_TYPES} CONFIG_TYPES_UC)
+ MSVCsetUserCommand( ${target}
+ PATH ${CMAKE_OUTPUT_BIN_${CONFIG_TYPES_UC}} ## FILE option not necessary since it is deduced from the target name
+ ARGS "${SIBR_PROGRAMARGS}"
+ ${msvcCmdArg}
+ #${msvcArgsArg}
+ WORKING_DIR ${CMAKE_OUTPUT_BIN_${CONFIG_TYPES_UC}}
+ )
+ endforeach()
+ elseif(NOT ${type} MATCHES "EXECUTABLE")
+ #message("Cannot set MSVCsetUserCommand with target ${target} without COMMAND parameter as it is not an executable (skip it)")
+ endif()
+ endif()
+
+endmacro()
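+
+## Hypothetical usage sketch of ibr_install_target (names are illustrative, not from this repo):
+## ibr_install_target(myApp
+## COMPONENT myApp_install ## also creates a custom install target myApp_install
+## INSTALL_PDB ## install PDB files alongside the binaries when using MSVC
+## RSC_FOLDER "myApp"
+## SHADERS "${myShaderFiles}"
+## STANDALONE ON ## resolve and install runtime dependencies via install_runtime
+## DIRS "${myExtraDllDirs}"
+## )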
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/parse_arguments_multi.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/parse_arguments_multi.cmake
new file mode 100644
index 0000000..4f19e41
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/parse_arguments_multi.cmake
@@ -0,0 +1,304 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+if(NOT WIN32 OR __parse_arguments_multi_cmake_INCLUDED__)
+ return()
+else()
+ set(__parse_arguments_multi_cmake_INCLUDED__ ON)
+endif()
+
+## This macro allows processing repeated multi-value args in a function that uses the cmake_parse_arguments module.
+##
+## cmake_parse_arguments multi args standard behavior:
+## function(foo)
+## cmake_parse_arguments(arg "" "" "MULTI" ${ARGN})
+## foreach(item IN LISTS arg_MULTI)
+## message(STATUS "${item}")
+## endforeach()
+## endfunction()
+## foo(MULTI x y MULTI z w)
+## The above code outputs only 'z' and 'w', whereas one would expect it to output all of 'x' 'y' 'z' 'w'.
+##
+## Using this macro inside a function that wants to handle repeated multi-arg values
+## will recursively iterate over the multi tags list to process each sub-list.
+## It takes as 1st argument the subTag flag used to separate each sub-list from the main multi list.
+## It takes as 2nd argument the name of the main multi list (the multiValuesArgs from cmake_parse_arguments: MULTI in the example above),
+## and that's why it is important that this be a macro and not a function (to get access to the caller's variables).
+## Then you give the content of this list to be processed by the macro.
+##
+## The parse_arguments_multi macro calls parse_arguments_multi_function, which actually processes the given sub-list.
+## By default this function only prints info about the variables you are trying to pass/process (verbose messages only),
+## but by overloading this cmake function you can externalize the processing of your multi argument list.
+##
+## Usage (into a function) :
+## parse_arguments_multi(<multiArgsSubTag> <multiArgsList> <${multiArgsList}>
+## [NEED_RESULTS <multiArgsListCount>] [EXTRAS_FLAGS <...> <...> ...]
+## )
+##
+## Simple usage example [user point of view]:
+## foo(MULTI
+## SUB_MULTI x y
+## SUB_MULTI z w
+## )
+##
+## Simple usage example [inside a function]:
+## function(foo)
+## cmake_parse_arguments(arg "" "" "MULTI" ${ARGN})
+## include(parse_arguments_multi)
+## function(parse_arguments_multi_function )
+## #message("I'm an overloaded cmake function used by parse_arguments_multi")
+## #message("I'm processing first part of my sub list: ${ARGN}")
+## message("ARGV0=${ARGV0}")
+## message("ARGV1=${ARGV1}")
+## endfunction()
+## parse_arguments_multi(SUB_MULTI arg_MULTI ${arg_MULTI}) ## this function will recursively process items of the sub-list [default prints messages]
+## endfunction()
+##
+## Will print:
+## ARGV0=z
+## ARGV1=w
+## ARGV0=x
+## ARGV1=y
+##
+## WARNING: NEVER ADD EXTRA ITEMS TO THE parse_arguments_multi MACRO CALL:
+## parse_arguments_multi(SUB_MULTI arg_MULTI ${arg_MULTI} EXTRAS foo bar SOMETHING) => will fail!
+## use EXTRAS_FLAGS instead!
+##
+## Advanced usage example [user point of view]:
+## bar(C:/prout/test.exe VERBOSE
+## PLUGINS
+## PLUGIN_PATH_NAME x PLUGIN_PATH_DEST w
+## PLUGIN_PATH_NAME a b PLUGIN_PATH_DEST y
+## PLUGIN_PATH_NAME c
+## )
+##
+## Advanced usage example [inside a function]:
+## function(bar execFilePathName)
+## cmake_parse_arguments(arg "VERBOSE" "" "PLUGINS" ${ARGN})
+##
+## include(parse_arguments_multi)
+## function(parse_arguments_multi_function results)
+## cmake_parse_arguments(pamf "VERBOSE" "PLUGIN_PATH_DEST;EXEC_PATH" "" ${ARGN}) ## EXEC_PATH is for internal use
+## message("")
+## message("I'm an overloaded cmake function used by parse_arguments_multi from install_runtime function")
+## message("I'm processing first part of my sub list: ${ARGN}")
+## message("PLUGIN_PATH_NAME = ${pamf_UNPARSED_ARGUMENTS}")
+## message(pamf_VERBOSE = ${pamf_VERBOSE})
+## message("pamf_PLUGIN_PATH_DEST = ${pamf_PLUGIN_PATH_DEST}")
+## message(pamf_EXEC_PATH = ${pamf_EXEC_PATH})
+## if(NOT ${pamf_PLUGIN_PATH_DEST})
+## set(pamf_PLUGIN_PATH_DEST ${pamf_EXEC_PATH})
+## endif()
+## foreach(plugin ${pamf_UNPARSED_ARGUMENTS})
+## get_filename_component(pluginName ${plugin} NAME)
+## list(APPEND pluginsList ${pamf_PLUGIN_PATH_DEST}/${pluginName})
+## endforeach()
+## set(${results} ${pluginsList} PARENT_SCOPE)
+## endfunction()
+##
+## if(arg_VERBOSE)
+## list(APPEND extra_flags_to_add VERBOSE) ## here we transmit the VERBOSE flag
+## endif()
+## get_filename_component(EXEC_PATH ${execFilePathName} PATH) ## will be the default value if PLUGIN_PATH_DEST option is not provided
+## list(APPEND extra_flags_to_add EXEC_PATH ${EXEC_PATH})
+## list(LENGTH arg_PLUGINS arg_PLUGINS_count)
+## parse_arguments_multi(PLUGIN_PATH_NAME arg_PLUGINS ${arg_PLUGINS}
+## NEED_RESULTS ${arg_PLUGINS_count} ## this is used to check when we are in the first loop (in order to reset parse_arguments_multi_results)
+## EXTRAS_FLAGS ${extra_flags_to_add} ## this is used to allow catching VERBOSE and PLUGIN_PATH_DEST flags of our overloaded function
+## )
+## endfunction()
+## message(parse_arguments_multi_results = ${parse_arguments_multi_results}) ## list of the whole pluginsList
+## #Will print w/x;a/y;b/y;C:/prout/c
+##
+## NOTE that here, since our overloaded function needs to provide a result list, we use the other parse_arguments_multi_function signature (the one with a results arg)
+##
+
+function(parse_arguments_multi_function_default) ## use this if you want to restore the default behavior of this function
+ message("[default function] parse_arguments_multi_function(ARGC=${ARGC} ARGV=${ARGV} ARGN=${ARGN})")
+ message("This function is used by parse_arguments_multi and has to be overloaded to process sub-lists of multi-value args")
+endfunction()
+
+function(parse_arguments_multi_function ) ## => the function to overload
+ parse_arguments_multi_function_default(${ARGN})
+endfunction()
+
+## first default signature above
+##------------------------------
+## second results signature behind
+
+function(parse_arguments_multi_function_default result) ## use this if you want to restore the default behavior of this function
+ message("[default function] parse_arguments_multi_function(ARGC=${ARGC} ARGV=${ARGV} ARGN=${ARGN})")
+ message("This function is used by parse_arguments_multi and has to be overloaded to process sub-lists of multi-value args")
+endfunction()
+
+function(parse_arguments_multi_function result) ## => the function to overload
+ parse_arguments_multi_function_default(result ${ARGN})
+endfunction()
+
+## => the macro to use inside your function which use cmake_parse_arguments
+# NOTE: entry point of parse_arguments_multi, which is called from win3rdParty
+macro(parse_arguments_multi multiArgsSubTag multiArgsList #<${multiArgsList}> the content of the list
+)
+ # message (STATUS "")
+ # message(STATUS "calling parse_arguments_multi defined in parse_arguments_multi.cmake:141")
+ # message(STATUS "multiArgsSubTag = ${multiArgsSubTag}") # CHECK_CACHED_VAR
+ # message(STATUS "multiArgsList = ${multiArgsList}") # it contains the name of the variable which is holding the list i.e w3p_MULTI_SET
+ # message(STATUS "value of ${multiArgsList} = ${${multiArgsList}}") # a semicolon separated list of values passed to SET or MULTISET keyword in win3rdParty
+ # message(STATUS "actual values ARGN = ${ARGN}") # the same as ${${multiArgsList}}
+
+ ## INFO
+ ## starting from CMake 3.5, cmake_parse_arguments is no longer a module but a native CMake command.
+ ## The behaviour is different though:
+ ## in CMake 3.4, if you pass a multi_value_keyword multiple times, CMake returns the values of the LAST match
+ ## in CMake 3.5 and above, CMake returns the whole list of values that followed that multi_value_keyword
+ ## example:
+ ## cmake_parse_arguments(""
+ ## "" # options
+ ## "" # one value keywords
+ ## "MY_MULTI_VALUE_TAG" # multi value keywords
+ ## MY_MULTI_VALUE_TAG value1 value2
+ ## MY_MULTI_VALUE_TAG value3 value4
+ ## MY_MULTI_VALUE_TAG value5 value6
+ ## )
+ ## result in CMake 3.4
+ ## _MY_MULTI_VALUE_TAG = "value5;value6"
+ ##
+ ## result in CMake 3.8
+ ## _MY_MULTI_VALUE_TAG = "value1;value2;value3;value4;value5;value6"
+
+ #include(CMakeParseArguments) #module CMakeParseArguments is obsolete since cmake 3.5
+ # cmake_parse_arguments(<prefix> <options> <one_value_keywords> <multi_value_keywords> args)
+ # <options> : options (flags) passed to the macro
+ # <one_value_keywords> : options that need a value
+ # <multi_value_keywords> : options that need more than one value
+ cmake_parse_arguments(_pam "" "NEED_RESULTS" "${multiArgsSubTag};EXTRAS_FLAGS" ${ARGN})
+
+ ## multiArgsList is the name of the list used by the multiValuesOption flag from the cmake_parse_arguments of the user function
+ ## that's why we absolutly need to use MACRO here (and also for passing parse_arguments_multi_results when NEED_RESULTS flag is set)
+
+ ## for debugging
+ #message("")
+ #message("[parse_arguments_multi] => ARGN = ${ARGN}")
+ #message("_pam_NEED_RESULTS=${_pam_NEED_RESULTS}")
+ #message("_pam_EXTRAS_FLAGS=${_pam_EXTRAS_FLAGS}")
+ # foreach(var ${_pam_${multiArgsSubTag}})
+ # message("arg=${var}")
+ # endforeach()
+
+ if (${CMAKE_VERSION} VERSION_GREATER "3.5")
+ # let's make ${_pam_${multiArgsSubTag}} behave as in version 3.4,
+ # i.e. cmake_parse_arguments should keep only the last set of values for a given multi-value keyword
+
+ # message("")
+ # message("values in multiArgsList")
+ # foreach(val ${${multiArgsList}})
+ # message(STATUS ${val})
+ # endforeach()
+ # message("end values in multiArgsList")
+
+
+ set(lastIndexFound OFF)
+ list(LENGTH ${multiArgsList} argnLength)
+ # message(${argnLength})
+ math(EXPR argnLength "${argnLength}-1") # make last index a valid one
+ set(recordIndex 0)
+ set(records "") # clear records list
+ set(record0 "") # clear first record list
+ foreach(iter RANGE ${argnLength})
+ list(GET ${multiArgsList} ${iter} value)
+ # message(STATUS "index=${iter} value=${value}")
+ if (${value} STREQUAL ${multiArgsSubTag})
+ if (lastIndexFound)
+ list(APPEND records ${recordIndex}) # records store the list NAMES
+ math(EXPR recordIndex "${recordIndex}+1")
+ set(record${recordIndex} "") # clear record list
+ else ()
+ set(lastIndexFound ON)
+ endif()
+
+ set(lastIndex ${iter})
+ else ()
+ if (lastIndexFound)
+ # message(${value})
+ list(APPEND record${recordIndex} ${value})
+ endif()
+ endif()
+ endforeach()
+
+ # save the last list of values
+ if (lastIndexFound)
+ list(APPEND records ${recordIndex}) # records store the list NAMES
+ endif()
+
+ # set multiArgsList to make it behave like CMake 3.4
+ # message("")
+ # message("using my records")
+ foreach(recordName ${records})
+ # message(${recordName})
+ # foreach(value ${record${recordName}})
+ # message(${value})
+ # endforeach()
+ # message("")
+ set(_pam_${multiArgsSubTag} ${record${recordName}})
+ endforeach()
+ # message(${_pam_${multiArgsSubTag}})
+
+ # message("")
+ # message("using argn")
+ # foreach(value ${ARGN})
+ # message(${value})
+ # endforeach()
+ endif() # end if cmake > 3.5
+
+ # message("values with pam ${_pam_${multiArgsSubTag}}")
+
+ ## check and init
+ list(LENGTH ${multiArgsList} globalListCount) # GLUT_TRACE: globalListCount=16 in CMake3.4 and CMake3.8
+ # message(STATUS "nr items in multiArgsList: ${globalListCount}")
+ math(EXPR globalListCount "${globalListCount}-1") ## because it will contain [multiArgsSubTag + ${multiArgsList}]
+ if(_pam_NEED_RESULTS)
+ if(${globalListCount} EQUAL ${_pam_NEED_RESULTS})
+ ## first time we enter into this macro (because we call it recursively)
+ unset(parse_arguments_multi_results)
+ endif()
+ endif()
+
+ ## process this part of the multi args list
+ ## ${ARGN} shouldn't be passed to the function, in order to avoid a size mismatch between ${multiArgsList} and _pam_${multiArgsSubTag}
+ ## if you want to pass extra internal flags from your function to this callback, use EXTRAS_FLAGS
+ if(_pam_NEED_RESULTS)
+ parse_arguments_multi_function(parse_arguments_multi_function_result ${_pam_${multiArgsSubTag}} ${_pam_EXTRAS_FLAGS})
+ list(APPEND parse_arguments_multi_results ${parse_arguments_multi_function_result})
+ else()
+ # message(STATUS "about to call parse_arguments_multi_function in parse_arguments_multi.cmake:177 ${_pam_${multiArgsSubTag}} and extra flags ${_pam_EXTRAS_FLAGS}")
+ parse_arguments_multi_function(${_pam_${multiArgsSubTag}} ${_pam_EXTRAS_FLAGS})
+ endif()
+
+ ## remove the just-processed items from the main list (multiArgsList)
+ list(REVERSE ${multiArgsList})
+ list(LENGTH _pam_${multiArgsSubTag} subTagListCount)
+ unset(ids)
+ foreach(id RANGE ${subTagListCount})
+ list(APPEND ids ${id})
+ endforeach()
+ list(REMOVE_AT ${multiArgsList} ${ids})
+ list(REVERSE ${multiArgsList})
+
+ ## test whether sub multi lists remain to be processed (recursive call) or finish the process
+ list(LENGTH ${multiArgsList} mainTagListCount)
+ if(${mainTagListCount} GREATER 1)
+ ## do not pass ${ARGN} because it would re-pass the initial 2 input args, which we don't want as they were already consumed (avoids conflicts)
+ # message(STATUS "about to call a parse_arguments_multi but without knowing where the definition is going to be taken from")
+ parse_arguments_multi(${multiArgsSubTag} ${multiArgsList} ${${multiArgsList}}
+ NEED_RESULTS ${_pam_NEED_RESULTS} EXTRAS_FLAGS ${_pam_EXTRAS_FLAGS}
+ )
+ endif()
+endmacro()
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/sibr_library.cmake b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/sibr_library.cmake
new file mode 100644
index 0000000..61f4219
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/cmake/windows/sibr_library.cmake
@@ -0,0 +1,169 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+# NOTE
+# This feature is used to easily download, store and link external dependencies. It
+# requires pre-compiled library packages (to be downloaded). For now, packages have
+# only been prepared for Windows 64-bit with Visual Studio 2012. (You should re-build
+# everything if you want to use another version of Visual Studio / another compiler.)
+
+# NOTE ABOUT UNIX SYSTEMS
+# There is no need for a "searching mechanism". This function is discarded and your
+# libraries should be installed in the standard folders, which are:
+#
+# /usr/include/
+# /usr/lib/
+# /usr/lib64/
+# for packages downloaded using apt-get/yum
+#
+# /usr/local/include/
+# /usr/local/lib/
+# /usr/local/lib64/
+# for packages manually installed ("make install")
+#
+# if you encounter problems when linking (e.g. lib not found even if it is installed),
+# please check these folders are in your search PATH environment variables.
+
+set(EXTLIBS_PACKAGE_FOLDER "${CMAKE_SOURCE_DIR}/extlibs")
+
+function(sibr_addlibrary)
+ if(NOT WIN32)
+ return()
+ endif()
+
+ file(MAKE_DIRECTORY ${EXTLIBS_PACKAGE_FOLDER})
+ cmake_parse_arguments(args "VCID" "VERBOSE;TIMEOUT;DEFAULT_USE;NAME;VERSION;MSVC11;MSVC12;MSVC14;MSVC17" "MULTI_SET;SET" ${ARGN})
+
+
+ if (NOT "${args_VERSION}" STREQUAL "")
+ message(WARNING "VERSION is not implemented yet")
+ endif()
+
+ set(lcname "")
+ set(ucname "")
+ string(TOLOWER "${args_NAME}" lcname)
+ string(TOUPPER "${args_NAME}" ucname)
+
+ set(LIB_PACKAGE_FOLDER "${EXTLIBS_PACKAGE_FOLDER}/${lcname}")
+ win3rdParty(${ucname}
+ $
+ VERBOSE ${args_VERBOSE}
+ TIMEOUT ${args_TIMEOUT}
+ DEFAULT_USE ${args_DEFAULT_USE}
+ MSVC11 "${LIB_PACKAGE_FOLDER}" "${args_MSVC11}"
+ MSVC12 "${LIB_PACKAGE_FOLDER}" "${args_MSVC12}"
+ MSVC14 "${LIB_PACKAGE_FOLDER}" "${args_MSVC14}" # TODO SV: make sure to build this library if required
+ MSVC17 "${LIB_PACKAGE_FOLDER}" "${args_MSVC17}"
+ SET ${args_SET}
+ MULTI_SET ${args_MULTI_SET}
+ )
+
+ # Add include/ directory
+ # and lib/ directories
+
+ # TODO SV: paths not matching with current hierarchy. example: libraw/libraw-0.17.1/include
+ # SR: The link directories will also be used to look up dependency DLLs to copy into the install directory.
+ # Some libraries put the DLLs in the bin/ directory, so we include those as well.
+ file(GLOB subdirs RELATIVE ${LIB_PACKAGE_FOLDER} ${LIB_PACKAGE_FOLDER}/*)
+ set(dirlist "")
+ foreach(dir ${subdirs})
+ if(IS_DIRECTORY ${LIB_PACKAGE_FOLDER}/${dir})
+ # message("adding ${LIB_PACKAGE_FOLDER}/${dir}/include/ to the include directories")
+ include_directories("${LIB_PACKAGE_FOLDER}/${dir}/include/")
+ # message("adding ${LIB_PACKAGE_FOLDER}/${dir}/lib[64] to the link directories")
+ link_directories("${LIB_PACKAGE_FOLDER}/${dir}/")
+ link_directories("${LIB_PACKAGE_FOLDER}/${dir}/lib/")
+ link_directories("${LIB_PACKAGE_FOLDER}/${dir}/lib64/")
+ link_directories("${LIB_PACKAGE_FOLDER}/${dir}/bin/")
+ endif()
+ endforeach()
+
+endfunction()
+
+include(FetchContent)
+include(git_describe)
+
+function(sibr_gitlibrary)
+ cmake_parse_arguments(args "" "TARGET;GIT_REPOSITORY;GIT_TAG;ROOT_DIR;SOURCE_DIR" "INCLUDE_DIRS" ${ARGN})
+ if(NOT args_TARGET)
+ message(FATAL_ERROR "Error in sibr_gitlibrary: please define your target name.")
+ return()
+ endif()
+
+ if(NOT args_ROOT_DIR)
+ set(args_ROOT_DIR ${args_TARGET})
+ endif()
+
+ if(NOT args_SOURCE_DIR)
+ set(args_SOURCE_DIR ${args_TARGET})
+ endif()
+
+ if(args_GIT_REPOSITORY AND args_GIT_TAG)
+ if(EXISTS ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR}/.git)
+ git_describe(
+ PATH ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR}
+ GIT_URL SIBR_GITLIBRARY_URL
+ GIT_BRANCH SIBR_GITLIBRARY_BRANCH
+ GIT_COMMIT_HASH SIBR_GITLIBRARY_COMMIT_HASH
+ GIT_TAG SIBR_GITLIBRARY_TAG
+ )
+
+ if((SIBR_GITLIBRARY_URL STREQUAL args_GIT_REPOSITORY) AND
+ ((SIBR_GITLIBRARY_BRANCH STREQUAL args_GIT_TAG) OR
+ (SIBR_GITLIBRARY_TAG STREQUAL args_GIT_TAG) OR
+ (SIBR_GITLIBRARY_COMMIT_HASH STREQUAL args_GIT_TAG)))
+ message(STATUS "Library ${args_TARGET} already available, skipping.")
+ set(SIBR_GITLIBRARY_DECLARED ON)
+ else()
+ message(STATUS "Adding library ${args_TARGET} from git...")
+ endif()
+ endif()
+
+ FetchContent_Declare(${args_TARGET}
+ GIT_REPOSITORY ${args_GIT_REPOSITORY}
+ GIT_TAG ${args_GIT_TAG}
+ GIT_SHALLOW ON
+ SOURCE_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR}
+ SUBBUILD_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/subbuild
+ BINARY_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/build
+ )
+ FetchContent_GetProperties(${args_TARGET})
+ string(TOLOWER "${args_TARGET}" lcTargetName)
+
+ if((NOT SIBR_GITLIBRARY_DECLARED) AND (NOT ${lcTargetName}_POPULATED))
+ message(STATUS "Populating library ${args_TARGET}...")
+ FetchContent_Populate(${args_TARGET} QUIET
+ GIT_REPOSITORY ${args_GIT_REPOSITORY}
+ GIT_TAG ${args_GIT_TAG}
+ SOURCE_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR}
+ SUBBUILD_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/subbuild
+ BINARY_DIR ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/build
+ )
+ endif()
+
+ add_subdirectory(${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/${args_SOURCE_DIR} ${CMAKE_SOURCE_DIR}/extlibs/${args_ROOT_DIR}/build)
+
+ get_target_property(type ${args_TARGET} TYPE)
+ if(NOT (type STREQUAL "INTERFACE_LIBRARY"))
+ set_target_properties(${args_TARGET} PROPERTIES FOLDER "extlibs")
+ endif()
+
+ list(APPEND ${args_TARGET}_INCLUDE_DIRS ${EXTLIBS_PACKAGE_FOLDER}/${args_ROOT_DIR})
+ list(APPEND ${args_TARGET}_INCLUDE_DIRS ${EXTLIBS_PACKAGE_FOLDER}/${args_ROOT_DIR}/${args_SOURCE_DIR})
+
+ foreach(args_INCLUDE_DIR ${args_INCLUDE_DIRS})
+ list(APPEND ${args_TARGET}_INCLUDE_DIRS ${EXTLIBS_PACKAGE_FOLDER}/${args_ROOT_DIR}/${args_SOURCE_DIR}/${args_INCLUDE_DIR})
+ endforeach()
+
+ include_directories(${${args_TARGET}_INCLUDE_DIRS})
+ else()
+ message(FATAL_ERROR "Error in sibr_gitlibrary for target ${args_TARGET}: missing git tag or git url.")
+ endif()
+endfunction()
\ No newline at end of file
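For orientation, here is a hedged sketch of how the two helpers defined above might be invoked. The library names, package URL, repository URL and tag below are placeholders for illustration only, not values taken from this repository:

```cmake
# Hypothetical usage of the helpers above; all names and URLs are placeholders.

# Windows-only: registers a prebuilt package under extlibs/<name>/ and adds its
# include/, lib/, lib64/ and bin/ subdirectories to the include/link paths.
sibr_addlibrary(NAME SomeLib
    MSVC14 "https://example.com/packages/somelib-msvc14.7z"  # placeholder URL
)

# Cross-platform: fetches the repository into extlibs/<root>/<source>, adds it
# via add_subdirectory(), and exposes the listed include directories.
sibr_gitlibrary(TARGET somelib
    GIT_REPOSITORY "https://example.com/somelib.git"  # placeholder URL
    GIT_TAG "master"
    INCLUDE_DIRS "include"
)
```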
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/docs/CMakeLists.txt b/extern/sugar/gaussian_splatting/SIBR_viewers/docs/CMakeLists.txt
new file mode 100644
index 0000000..25180dc
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/docs/CMakeLists.txt
@@ -0,0 +1,63 @@
+# Copyright (C) 2020, Inria
+# GRAPHDECO research group, https://team.inria.fr/graphdeco
+# All rights reserved.
+#
+# This software is free for non-commercial, research and evaluation use
+# under the terms of the LICENSE.md file.
+#
+# For inquiries contact sibr@inria.fr and/or George.Drettakis@inria.fr
+
+
+#########################################################
+# Include doxygen documentation target
+#########################################################
+option(BUILD_DOCUMENTATION "build doxygen documentation ('Build' DOCUMENTATION target, and find the compiled docs in install/docs/index.html)" OFF)
+if(BUILD_DOCUMENTATION)
+ set(DOXYGEN_REQUIRED_VERSION "1.8.17")
+ find_package(Doxygen)
+ if(NOT DOXYGEN_FOUND)
+ message(FATAL_ERROR "Doxygen not found, unable to generate documentation.")
+ elseif(DOXYGEN_VERSION VERSION_LESS DOXYGEN_REQUIRED_VERSION)
+ message(FATAL_ERROR "Doxygen version is less than ${DOXYGEN_REQUIRED_VERSION} (Current version is ${DOXYGEN_VERSION}).")
+ else()
+ set(DOXY_DOC_DEST_DIR ${CMAKE_INSTALL_ROOT}/docs) ## used in the doxyfile.in
+
+ set(DOXY_DOC_INPUT_ROOT_DIRS "${CMAKE_HOME_DIRECTORY}/src ${CMAKE_HOME_DIRECTORY}/docs ${CMAKE_CURRENT_BINARY_DIR}/generated") ## used in the doxyfile.in
+ set(DOXY_DOC_EXCLUDE_PATTERNS_DIRS "${DOXY_DOC_EXCLUDE_PATTERNS_DIRS}") ## used in the doxyfile.in
+ set(DOXY_DOC_COMMON_IMG_PATH "${CMAKE_CURRENT_SOURCE_DIR}/img ${CMAKE_HOME_DIRECTORY}/src/projects")
+ set(DOXY_DOC_PAGES_DIR "${CMAKE_CURRENT_SOURCE_DIR}/pages")
+ set(DOXY_DOC_GENERATED_DOC_DIR "${CMAKE_CURRENT_BINARY_DIR}/generated")
+
+ string(REPLACE "\\" "\\\\" SIBR_PROJECTS_SAMPLES_SUBPAGE_REF "${SIBR_PROJECTS_SAMPLES_SUBPAGE_REF}")
+ string(REPLACE "\\" "\\\\" SIBR_PROJECTS_OURS_SUBPAGE_REF "${SIBR_PROJECTS_OURS_SUBPAGE_REF}")
+ string(REPLACE "\\" "\\\\" SIBR_PROJECTS_TOOLBOX_SUBPAGE_REF "${SIBR_PROJECTS_TOOLBOX_SUBPAGE_REF}")
+ string(REPLACE "\\" "\\\\" SIBR_PROJECTS_OTHERS_SUBPAGE_REF "${SIBR_PROJECTS_OTHERS_SUBPAGE_REF}")
+ string(REPLACE "\\" "\\\\" SIBR_PROJECTS_SAMPLES_REF_REF "${SIBR_PROJECTS_SAMPLES_REF_REF}")
+ string(REPLACE "\\" "\\\\" SIBR_PROJECTS_OURS_REF_REF "${SIBR_PROJECTS_OURS_REF_REF}")
+ string(REPLACE "\\" "\\\\" SIBR_PROJECTS_TOOLBOX_REF_REF "${SIBR_PROJECTS_TOOLBOX_REF_REF}")
+ string(REPLACE "\\" "\\\\" SIBR_PROJECTS_OTHERS_REF_REF "${SIBR_PROJECTS_OTHERS_REF_REF}")
+
+ string(REPLACE "\n" "\\n" SIBR_PROJECTS_SAMPLES_SUBPAGE_REF "${SIBR_PROJECTS_SAMPLES_SUBPAGE_REF}")
+ string(REPLACE "\n" "\\n" SIBR_PROJECTS_OURS_SUBPAGE_REF "${SIBR_PROJECTS_OURS_SUBPAGE_REF}")
+ string(REPLACE "\n" "\\n" SIBR_PROJECTS_TOOLBOX_SUBPAGE_REF "${SIBR_PROJECTS_TOOLBOX_SUBPAGE_REF}")
+ string(REPLACE "\n" "\\n" SIBR_PROJECTS_OTHERS_SUBPAGE_REF "${SIBR_PROJECTS_OTHERS_SUBPAGE_REF}")
+ string(REPLACE "\n" "\\n" SIBR_PROJECTS_SAMPLES_REF_REF "${SIBR_PROJECTS_SAMPLES_REF_REF}")
+ string(REPLACE "\n" "\\n" SIBR_PROJECTS_OURS_REF_REF "${SIBR_PROJECTS_OURS_REF_REF}")
+ string(REPLACE "\n" "\\n" SIBR_PROJECTS_TOOLBOX_REF_REF "${SIBR_PROJECTS_TOOLBOX_REF_REF}")
+ string(REPLACE "\n" "\\n" SIBR_PROJECTS_OTHERS_REF_REF "${SIBR_PROJECTS_OTHERS_REF_REF}")
+
+ file(GLOB doxygen_config_files "*.in")
+ foreach(filename ${doxygen_config_files})
+ message(STATUS "Generating ${filename}...")
+ get_filename_component(output_filename ${filename} NAME_WLE)
+ message(STATUS "Output in ${CMAKE_CURRENT_BINARY_DIR}/${output_filename}...")
+ configure_file(${filename} ${CMAKE_CURRENT_BINARY_DIR}/${output_filename} @ONLY)
+ endforeach()
+
+ add_custom_target(DOCUMENTATION ${CMAKE_COMMAND} -P ${CMAKE_CURRENT_BINARY_DIR}/doxygen_prebuild.cmake
+ COMMAND ${DOXYGEN_EXECUTABLE} "${CMAKE_CURRENT_BINARY_DIR}/doxyfile"
+ WORKING_DIRECTORY ${CMAKE_HOME_DIRECTORY}
+ COMMENT "Building user's documentation into ${DOXY_DOC_DEST_DIR} dir..."
+ )
+ endif()
+endif()
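The CMakeLists.txt above follows the common configure-then-run-Doxygen pattern: substitute `@VAR@` placeholders in the `*.in` templates, then drive Doxygen from a custom target. A minimal sketch of that pattern, reduced to its essentials (the `docs` target name here is illustrative; `DOXYGEN_EXECUTABLE` is provided by `find_package(Doxygen)`):

```cmake
# Minimal sketch of the configure-then-run-Doxygen pattern used above.
find_package(Doxygen REQUIRED)

# Substitute @VAR@ placeholders from the template into the build tree.
configure_file(${CMAKE_CURRENT_SOURCE_DIR}/doxyfile.in
               ${CMAKE_CURRENT_BINARY_DIR}/doxyfile @ONLY)

# A custom target so the docs are only built on demand:
#   cmake --build . --target docs
add_custom_target(docs
    COMMAND ${DOXYGEN_EXECUTABLE} ${CMAKE_CURRENT_BINARY_DIR}/doxyfile
    WORKING_DIRECTORY ${CMAKE_CURRENT_SOURCE_DIR}
    COMMENT "Generating Doxygen documentation..."
)
```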
diff --git a/extern/sugar/gaussian_splatting/SIBR_viewers/docs/doxyfile.in b/extern/sugar/gaussian_splatting/SIBR_viewers/docs/doxyfile.in
new file mode 100644
index 0000000..86d08ab
--- /dev/null
+++ b/extern/sugar/gaussian_splatting/SIBR_viewers/docs/doxyfile.in
@@ -0,0 +1,2571 @@
+# Doxyfile 1.8.20
+
+# This file describes the settings to be used by the documentation system
+# doxygen (www.doxygen.org) for a project.
+#
+# All text after a double hash (##) is considered a comment and is placed in
+# front of the TAG it is preceding.
+#
+# All text after a single hash (#) is considered a comment and will be ignored.
+# The format is:
+# TAG = value [value, ...]
+# For lists, items can also be appended using:
+# TAG += value [value, ...]
+# Values that contain spaces should be placed between quotes (\" \").
+
+#---------------------------------------------------------------------------
+# Project related configuration options
+#---------------------------------------------------------------------------
+
+# This tag specifies the encoding used for all characters in the configuration
+# file that follow. The default is UTF-8 which is also the encoding used for all
+# text before the first occurrence of this tag. Doxygen uses libiconv (or the
+# iconv built into libc) for the transcoding. See
+# https://www.gnu.org/software/libiconv/ for the list of possible encodings.
+# The default value is: UTF-8.
+
+DOXYFILE_ENCODING = UTF-8
+
+# The PROJECT_NAME tag is a single word (or a sequence of words surrounded by
+# double-quotes, unless you are using Doxywizard) that should identify the
+# project for which the documentation is generated. This name is used in the
+# title of most generated pages and in a few other places.
+# The default value is: My Project.
+
+PROJECT_NAME = SIBR
+
+# The PROJECT_NUMBER tag can be used to enter a project or revision number. This
+# could be handy for archiving the generated documentation or if some version
+# control system is used.
+
+PROJECT_NUMBER = @SIBR_CORE_VERSION@
+
+# Using the PROJECT_BRIEF tag one can provide an optional one line description
+# for a project that appears at the top of each page and should give viewer a
+# quick idea about the purpose of the project. Keep the description short.
+
+PROJECT_BRIEF =
+
+# With the PROJECT_LOGO tag one can specify a logo or an icon that is included
+# in the documentation. The maximum height of the logo should not exceed 55
+# pixels and the maximum width should not exceed 200 pixels. Doxygen will copy
+# the logo to the output directory.
+
+PROJECT_LOGO =
+
+# The OUTPUT_DIRECTORY tag is used to specify the (relative or absolute) path
+# into which the generated documentation will be written. If a relative path is
+# entered, it will be relative to the location where doxygen was started. If
+# left blank the current directory will be used.
+
+OUTPUT_DIRECTORY = @DOXY_DOC_DEST_DIR@
+
+# If the CREATE_SUBDIRS tag is set to YES then doxygen will create 4096 sub-
+# directories (in 2 levels) under the output directory of each output format and
+# will distribute the generated files over these directories. Enabling this
+# option can be useful when feeding doxygen a huge amount of source files, where
+# putting all generated files in the same directory would otherwise cause
+# performance problems for the file system.
+# The default value is: NO.
+
+CREATE_SUBDIRS = NO
+
+# If the ALLOW_UNICODE_NAMES tag is set to YES, doxygen will allow non-ASCII
+# characters to appear in the names of generated files. If set to NO, non-ASCII
+# characters will be escaped, for example _xE3_x81_x84 will be used for Unicode
+# U+3044.
+# The default value is: NO.
+
+ALLOW_UNICODE_NAMES = NO
+
+# The OUTPUT_LANGUAGE tag is used to specify the language in which all
+# documentation generated by doxygen is written. Doxygen will use this
+# information to generate all constant output in the proper language.
+# Possible values are: Afrikaans, Arabic, Armenian, Brazilian, Catalan, Chinese,
+# Chinese-Traditional, Croatian, Czech, Danish, Dutch, English (United States),
+# Esperanto, Farsi (Persian), Finnish, French, German, Greek, Hungarian,
+# Indonesian, Italian, Japanese, Japanese-en (Japanese with English messages),
+# Korean, Korean-en (Korean with English messages), Latvian, Lithuanian,
+# Macedonian, Norwegian, Persian (Farsi), Polish, Portuguese, Romanian, Russian,
+# Serbian, Serbian-Cyrillic, Slovak, Slovene, Spanish, Swedish, Turkish,
+# Ukrainian and Vietnamese.
+# The default value is: English.
+
+OUTPUT_LANGUAGE = English
+
+# The OUTPUT_TEXT_DIRECTION tag is used to specify the direction in which all
+# documentation generated by doxygen is written. Doxygen will use this
+# information to generate all generated output in the proper direction.
+# Possible values are: None, LTR, RTL and Context.
+# The default value is: None.
+
+OUTPUT_TEXT_DIRECTION = None
+
+# If the BRIEF_MEMBER_DESC tag is set to YES, doxygen will include brief member
+# descriptions after the members that are listed in the file and class
+# documentation (similar to Javadoc). Set to NO to disable this.
+# The default value is: YES.
+
+BRIEF_MEMBER_DESC = YES
+
+# If the REPEAT_BRIEF tag is set to YES, doxygen will prepend the brief
+# description of a member or function before the detailed description
+#
+# Note: If both HIDE_UNDOC_MEMBERS and BRIEF_MEMBER_DESC are set to NO, the
+# brief descriptions will be completely suppressed.
+# The default value is: YES.
+
+REPEAT_BRIEF = YES
+
+# This tag implements a quasi-intelligent brief description abbreviator that is
+# used to form the text in various listings. Each string in this list, if found
+# as the leading text of the brief description, will be stripped from the text
+# and the result, after processing the whole list, is used as the annotated
+# text. Otherwise, the brief description is used as-is. If left blank, the
+# following values are used ($name is automatically replaced with the name of
+# the entity):The $name class, The $name widget, The $name file, is, provides,
+# specifies, contains, represents, a, an and the.
+
+ABBREVIATE_BRIEF =
+
+# If the ALWAYS_DETAILED_SEC and REPEAT_BRIEF tags are both set to YES then
+# doxygen will generate a detailed section even if there is only a brief
+# description.
+# The default value is: NO.
+
+ALWAYS_DETAILED_SEC = NO
+
+# If the INLINE_INHERITED_MEMB tag is set to YES, doxygen will show all
+# inherited members of a class in the documentation of that class as if those
+# members were ordinary class members. Constructors, destructors and assignment
+# operators of the base classes will not be shown.
+# The default value is: NO.
+
+INLINE_INHERITED_MEMB = NO
+
+# If the FULL_PATH_NAMES tag is set to YES, doxygen will prepend the full path
+# before files name in the file list and in the header files. If set to NO the
+# shortest path that makes the file name unique will be used
+# The default value is: YES.
+
+FULL_PATH_NAMES = YES
+
+# The STRIP_FROM_PATH tag can be used to strip a user-defined part of the path.
+# Stripping is only done if one of the specified strings matches the left-hand
+# part of the path. The tag can be used to show relative paths in the file list.
+# If left blank the directory from which doxygen is run is used as the path to
+# strip.
+#
+# Note that you can specify absolute paths here, but also relative paths, which
+# will be relative from the directory where doxygen is started.
+# This tag requires that the tag FULL_PATH_NAMES is set to YES.
+
+STRIP_FROM_PATH =
+
+# The STRIP_FROM_INC_PATH tag can be used to strip a user-defined part of the
+# path mentioned in the documentation of a class, which tells the reader which
+# header file to include in order to use a class. If left blank only the name of
+# the header file containing the class definition is used. Otherwise one should
+# specify the list of include paths that are normally passed to the compiler
+# using the -I flag.
+
+STRIP_FROM_INC_PATH =
+
+# If the SHORT_NAMES tag is set to YES, doxygen will generate much shorter (but
+# less readable) file names. This can be useful if your file system doesn't
+# support long names like on DOS, Mac, or CD-ROM.
+# The default value is: NO.
+
+SHORT_NAMES = NO
+
+# If the JAVADOC_AUTOBRIEF tag is set to YES then doxygen will interpret the
+# first line (until the first dot) of a Javadoc-style comment as the brief
+# description. If set to NO, the Javadoc-style will behave just like regular Qt-
+# style comments (thus requiring an explicit @brief command for a brief
+# description.)
+# The default value is: NO.
+
+JAVADOC_AUTOBRIEF = YES
+
+# If the JAVADOC_BANNER tag is set to YES then doxygen will interpret a line
+# such as
+# /***************
+# as being the beginning of a Javadoc-style comment "banner". If set to NO, the
+# Javadoc-style will behave just like regular comments and it will not be
+# interpreted by doxygen.
+# The default value is: NO.
+
+JAVADOC_BANNER = NO
+
+# If the QT_AUTOBRIEF tag is set to YES then doxygen will interpret the first
+# line (until the first dot) of a Qt-style comment as the brief description. If
+# set to NO, the Qt-style will behave just like regular Qt-style comments (thus
+# requiring an explicit \brief command for a brief description.)
+# The default value is: NO.
+
+QT_AUTOBRIEF = YES
+
+# The MULTILINE_CPP_IS_BRIEF tag can be set to YES to make doxygen treat a
+# multi-line C++ special comment block (i.e. a block of //! or /// comments) as
+# a brief description. This used to be the default behavior. The new default is
+# to treat a multi-line C++ comment block as a detailed description. Set this
+# tag to YES if you prefer the old behavior instead.
+#
+# Note that setting this tag to YES also means that rational rose comments are
+# not recognized any more.
+# The default value is: NO.
+
+MULTILINE_CPP_IS_BRIEF = NO
+
+# By default Python docstrings are displayed as preformatted text and doxygen's
+# special commands cannot be used. By setting PYTHON_DOCSTRING to NO the
+# doxygen's special commands can be used and the contents of the docstring
+# documentation blocks is shown as doxygen documentation.
+# The default value is: YES.
+
+PYTHON_DOCSTRING = YES
+
+# If the INHERIT_DOCS tag is set to YES then an undocumented member inherits the
+# documentation from any documented member that it re-implements.
+# The default value is: YES.
+
+INHERIT_DOCS = YES
+
+# If the SEPARATE_MEMBER_PAGES tag is set to YES then doxygen will produce a new
+# page for each member. If set to NO, the documentation of a member will be part
+# of the file/class/namespace that contains it.
+# The default value is: NO.
+
+SEPARATE_MEMBER_PAGES = NO
+
+# The TAB_SIZE tag can be used to set the number of spaces in a tab. Doxygen
+# uses this value to replace tabs by spaces in code fragments.
+# Minimum value: 1, maximum value: 16, default value: 4.
+
+TAB_SIZE = 4
+
+# This tag can be used to specify a number of aliases that act as commands in
+# the documentation. An alias has the form:
+# name=value
+# For example adding
+# "sideeffect=@par Side Effects:\n"
+# will allow you to put the command \sideeffect (or @sideeffect) in the
+# documentation, which will result in a user-defined paragraph with heading
+# "Side Effects:". You can put \n's in the value part of an alias to insert
+# newlines (in the resulting output). You can put ^^ in the value part of an
+# alias to insert a newline as if a physical newline was in the original file.
+# When you need a literal { or } or , in the value part of an alias you have to
+# escape them by means of a backslash (\), this can lead to conflicts with the
+# commands \{ and \} for these it is advised to use the version @{ and @} or use
+# a double escape (\\{ and \\})
+
+ALIASES =
+
+# Set the OPTIMIZE_OUTPUT_FOR_C tag to YES if your project consists of C sources
+# only. Doxygen will then generate output that is more tailored for C. For
+# instance, some of the names that are used will be different. The list of all
+# members will be omitted, etc.
+# The default value is: NO.
+
+OPTIMIZE_OUTPUT_FOR_C = NO
+
+# Set the OPTIMIZE_OUTPUT_JAVA tag to YES if your project consists of Java or
+# Python sources only. Doxygen will then generate output that is more tailored
+# for that language. For instance, namespaces will be presented as packages,
+# qualified scopes will look different, etc.
+# The default value is: NO.
+
+OPTIMIZE_OUTPUT_JAVA = NO
+
+# Set the OPTIMIZE_FOR_FORTRAN tag to YES if your project consists of Fortran
+# sources. Doxygen will then generate output that is tailored for Fortran.
+# The default value is: NO.
+
+OPTIMIZE_FOR_FORTRAN = NO
+
+# Set the OPTIMIZE_OUTPUT_VHDL tag to YES if your project consists of VHDL
+# sources. Doxygen will then generate output that is tailored for VHDL.
+# The default value is: NO.
+
+OPTIMIZE_OUTPUT_VHDL = NO
+
+# Set the OPTIMIZE_OUTPUT_SLICE tag to YES if your project consists of Slice
+# sources only. Doxygen will then generate output that is more tailored for that
+# language. For instance, namespaces will be presented as modules, types will be
+# separated into more groups, etc.
+# The default value is: NO.
+
+OPTIMIZE_OUTPUT_SLICE = NO
+
+# Doxygen selects the parser to use depending on the extension of the files it
+# parses. With this tag you can assign which parser to use for a given
+# extension. Doxygen has a built-in mapping, but you can override or extend it
+# using this tag. The format is ext=language, where ext is a file extension, and
+# language is one of the parsers supported by doxygen: IDL, Java, JavaScript,
+# Csharp (C#), C, C++, D, PHP, md (Markdown), Objective-C, Python, Slice, VHDL,
+# Fortran (fixed format Fortran: FortranFixed, free formatted Fortran:
+# FortranFree, unknown formatted Fortran: Fortran. In the latter case the parser
+# tries to guess whether the code is fixed or free formatted code, this is the
+# default for Fortran type files). For instance to make doxygen treat .inc files
+# as Fortran files (default is PHP), and .f files as C (default is Fortran),
+# use: inc=Fortran f=C.
+#
+# Note: For files without extension you can use no_extension as a placeholder.
+#
+# Note that for custom extensions you also need to set FILE_PATTERNS otherwise
+# the files are not read by doxygen.
+
+EXTENSION_MAPPING =
+
+# If the MARKDOWN_SUPPORT tag is enabled then doxygen pre-processes all comments
+# according to the Markdown format, which allows for more readable
+# documentation. See https://daringfireball.net/projects/markdown/ for details.
+# The output of markdown processing is further processed by doxygen, so you can
+# mix doxygen, HTML, and XML commands with Markdown formatting. Disable only in
+# case of backward compatibilities issues.
+# The default value is: YES.
+
+MARKDOWN_SUPPORT = YES
+
+# When the TOC_INCLUDE_HEADINGS tag is set to a non-zero value, all headings up
+# to that level are automatically included in the table of contents, even if
+# they do not have an id attribute.
+# Note: This feature currently applies only to Markdown headings.
+# Minimum value: 0, maximum value: 99, default value: 5.
+# This tag requires that the tag MARKDOWN_SUPPORT is set to YES.
+
+TOC_INCLUDE_HEADINGS = 0
+
+# When enabled doxygen tries to link words that correspond to documented
+# classes, or namespaces to their corresponding documentation. Such a link can
+# be prevented in individual cases by putting a % sign in front of the word or
+# globally by setting AUTOLINK_SUPPORT to NO.
+# The default value is: YES.
+
+AUTOLINK_SUPPORT = YES
+
+# If you use STL classes (i.e. std::string, std::vector, etc.) but do not want
+# to include (a tag file for) the STL sources as input, then you should set this
+# tag to YES in order to let doxygen match functions declarations and
+# definitions whose arguments contain STL classes (e.g. func(std::string);
+# versus func(std::string) {}). This also makes the inheritance and collaboration
+# diagrams that involve STL classes more complete and accurate.
+# The default value is: NO.
+
+BUILTIN_STL_SUPPORT = YES
+
+# If you use Microsoft's C++/CLI language, you should set this option to YES to
+# enable parsing support.
+# The default value is: NO.
+
+CPP_CLI_SUPPORT = NO
+
+# Set the SIP_SUPPORT tag to YES if your project consists of sip (see:
+# https://www.riverbankcomputing.com/software/sip/intro) sources only. Doxygen
+# will parse them like normal C++ but will assume all classes use public instead
+# of private inheritance when no explicit protection keyword is present.
+# The default value is: NO.
+
+SIP_SUPPORT = NO
+
+# For Microsoft's IDL there are propget and propput attributes to indicate
+# getter and setter methods for a property. Setting this option to YES will make
+# doxygen to replace the get and set methods by a property in the documentation.
+# This will only work if the methods are indeed getting or setting a simple
+# type. If this is not the case, or you want to show the methods anyway, you
+# should set this option to NO.
+# The default value is: YES.
+
+IDL_PROPERTY_SUPPORT = YES
+
+# If member grouping is used in the documentation and the DISTRIBUTE_GROUP_DOC
+# tag is set to YES then doxygen will reuse the documentation of the first
+# member in the group (if any) for the other members of the group. By default
+# all members of a group must be documented explicitly.
+# The default value is: NO.
+
+DISTRIBUTE_GROUP_DOC = YES
+
+# If one adds a struct or class to a group and this option is enabled, then also
+# any nested class or struct is added to the same group. By default this option
+# is disabled and one has to add nested compounds explicitly via \ingroup.
+# The default value is: NO.
+
+GROUP_NESTED_COMPOUNDS = NO
+
+# Set the SUBGROUPING tag to YES to allow class member groups of the same type
+# (for instance a group of public functions) to be put as a subgroup of that
+# type (e.g. under the Public Functions section). Set it to NO to prevent
+# subgrouping. Alternatively, this can be done per class using the
+# \nosubgrouping command.
+# The default value is: YES.
+
+SUBGROUPING = YES
+
+# When the INLINE_GROUPED_CLASSES tag is set to YES, classes, structs and unions
+# are shown inside the group in which they are included (e.g. using \ingroup)
+# instead of on a separate page (for HTML and Man pages) or section (for LaTeX
+# and RTF).
+#
+# Note that this feature does not work in combination with
+# SEPARATE_MEMBER_PAGES.
+# The default value is: NO.
+
+INLINE_GROUPED_CLASSES = NO
+
+# When the INLINE_SIMPLE_STRUCTS tag is set to YES, structs, classes, and unions
+# with only public data fields or simple typedef fields will be shown inline in
+# the documentation of the scope in which they are defined (i.e. file,
+# namespace, or group documentation), provided this scope is documented. If set
+# to NO, structs, classes, and unions are shown on a separate page (for HTML and
+# Man pages) or section (for LaTeX and RTF).
+# The default value is: NO.
+
+INLINE_SIMPLE_STRUCTS = NO
+
+# When TYPEDEF_HIDES_STRUCT tag is enabled, a typedef of a struct, union, or
+# enum is documented as struct, union, or enum with the name of the typedef. So
+# typedef struct TypeS {} TypeT, will appear in the documentation as a struct
+# with name TypeT. When disabled the typedef will appear as a member of a file,
+# namespace, or class. And the struct will be named TypeS. This can typically be
+# useful for C code in case the coding convention dictates that all compound
+# types are typedef'ed and only the typedef is referenced, never the tag name.
+# The default value is: NO.
+
+TYPEDEF_HIDES_STRUCT = NO
+
+# The size of the symbol lookup cache can be set using LOOKUP_CACHE_SIZE. This
+# cache is used to resolve symbols given their name and scope. Since this can be
+# an expensive process and often the same symbol appears multiple times in the
+# code, doxygen keeps a cache of pre-resolved symbols. If the cache is too small
+# doxygen will become slower. If the cache is too large, memory is wasted. The
+# cache size is given by this formula: 2^(16+LOOKUP_CACHE_SIZE). The valid range
+# is 0..9, the default is 0, corresponding to a cache size of 2^16=65536
+# symbols. At the end of a run doxygen will report the cache usage and suggest
+# the optimal cache size from a speed point of view.
+# Minimum value: 0, maximum value: 9, default value: 0.
+
+LOOKUP_CACHE_SIZE = 0
+
+# The NUM_PROC_THREADS specifies the number of threads doxygen is allowed to use
+# during processing. When set to 0 doxygen will base this on the number of
+# cores available in the system. You can set it explicitly to a value larger
+# than 0 to get more control over the balance between CPU load and processing
+# speed. At this moment only the input processing can be done using multiple
+# threads. Since this is still an experimental feature the default is set to 1,
+# which effectively disables parallel processing. Please report any issues you
+# encounter. Generating dot graphs in parallel is controlled by the
+# DOT_NUM_THREADS setting.
+# Minimum value: 0, maximum value: 32, default value: 1.
+
+NUM_PROC_THREADS = 1
+
+#---------------------------------------------------------------------------
+# Build related configuration options
+#---------------------------------------------------------------------------
+
+# If the EXTRACT_ALL tag is set to YES, doxygen will assume all entities in
+# documentation are documented, even if no documentation was available. Private
+# class members and static file members will be hidden unless the
+# EXTRACT_PRIVATE respectively EXTRACT_STATIC tags are set to YES.
+# Note: This will also disable the warnings about undocumented members that are
+# normally produced when WARNINGS is set to YES.
+# The default value is: NO.
+
+EXTRACT_ALL = YES
+
+# If the EXTRACT_PRIVATE tag is set to YES, all private members of a class will
+# be included in the documentation.
+# The default value is: NO.
+
+EXTRACT_PRIVATE = YES
+
+# If the EXTRACT_PRIV_VIRTUAL tag is set to YES, documented private virtual
+# methods of a class will be included in the documentation.
+# The default value is: NO.
+
+EXTRACT_PRIV_VIRTUAL = NO
+
+# If the EXTRACT_PACKAGE tag is set to YES, all members with package or internal
+# scope will be included in the documentation.
+# The default value is: NO.
+
+EXTRACT_PACKAGE = YES
+
+# If the EXTRACT_STATIC tag is set to YES, all static members of a file will be
+# included in the documentation.
+# The default value is: NO.
+
+EXTRACT_STATIC = YES
+
+# If the EXTRACT_LOCAL_CLASSES tag is set to YES, classes (and structs) defined
+# locally in source files will be included in the documentation. If set to NO,
+# only classes defined in header files are included. Does not have any effect
+# for Java sources.
+# The default value is: YES.
+
+EXTRACT_LOCAL_CLASSES = YES
+
+# This flag is only useful for Objective-C code. If set to YES, local methods,
+# which are defined in the implementation section but not in the interface are
+# included in the documentation. If set to NO, only methods in the interface are
+# included.
+# The default value is: NO.
+
+EXTRACT_LOCAL_METHODS = NO
+
+# If this flag is set to YES, the members of anonymous namespaces will be
+# extracted and appear in the documentation as a namespace called
+# 'anonymous_namespace{file}', where file will be replaced with the base name of
+# the file that contains the anonymous namespace. By default anonymous
+# namespaces are hidden.
+# The default value is: NO.
+
+EXTRACT_ANON_NSPACES = NO
+
+# If the HIDE_UNDOC_MEMBERS tag is set to YES, doxygen will hide all
+# undocumented members inside documented classes or files. If set to NO these
+# members will be included in the various overviews, but no documentation
+# section is generated. This option has no effect if EXTRACT_ALL is enabled.
+# The default value is: NO.
+
+HIDE_UNDOC_MEMBERS = NO
+
+# If the HIDE_UNDOC_CLASSES tag is set to YES, doxygen will hide all
+# undocumented classes that are normally visible in the class hierarchy. If set
+# to NO, these classes will be included in the various overviews. This option
+# has no effect if EXTRACT_ALL is enabled.
+# The default value is: NO.
+
+HIDE_UNDOC_CLASSES = NO
+
+# If the HIDE_FRIEND_COMPOUNDS tag is set to YES, doxygen will hide all friend
+# declarations. If set to NO, these declarations will be included in the
+# documentation.
+# The default value is: NO.
+
+HIDE_FRIEND_COMPOUNDS = NO
+
+# If the HIDE_IN_BODY_DOCS tag is set to YES, doxygen will hide any
+# documentation blocks found inside the body of a function. If set to NO, these
+# blocks will be appended to the function's detailed documentation block.
+# The default value is: NO.
+
+HIDE_IN_BODY_DOCS = YES
+
+# The INTERNAL_DOCS tag determines if documentation that is typed after a
+# \internal command is included. If the tag is set to NO then the documentation
+# will be excluded. Set it to YES to include the internal documentation.
+# The default value is: NO.
+
+INTERNAL_DOCS = NO
+
+# If the CASE_SENSE_NAMES tag is set to NO then doxygen will only generate file
+# names in lower-case letters. If set to YES, upper-case letters are also
+# allowed. This is useful if you have classes or files whose names only differ
+# in case and if your file system supports case sensitive file names. Windows
+# (including Cygwin) and Mac users are advised to set this option to NO.
+# The default value is: system dependent.
+
+CASE_SENSE_NAMES = YES
+
+# If the HIDE_SCOPE_NAMES tag is set to NO then doxygen will show members with
+# their full class and namespace scopes in the documentation. If set to YES, the
+# scope will be hidden.
+# The default value is: NO.
+
+HIDE_SCOPE_NAMES = NO
+
+# If the HIDE_COMPOUND_REFERENCE tag is set to NO (default) then doxygen will
+# append additional text to a page's title, such as Class Reference. If set to
+# YES the compound reference will be hidden.
+# The default value is: NO.
+
+HIDE_COMPOUND_REFERENCE= NO
+
+# If the SHOW_INCLUDE_FILES tag is set to YES then doxygen will put a list of
+# the files that are included by a file in the documentation of that file.
+# The default value is: YES.
+
+SHOW_INCLUDE_FILES = YES
+
+# If the SHOW_GROUPED_MEMB_INC tag is set to YES then Doxygen will add for each
+# grouped member an include statement to the documentation, telling the reader
+# which file to include in order to use the member.
+# The default value is: NO.
+
+SHOW_GROUPED_MEMB_INC = NO
+
+# If the FORCE_LOCAL_INCLUDES tag is set to YES then doxygen will list include
+# files with double quotes in the documentation rather than with sharp brackets.
+# The default value is: NO.
+
+FORCE_LOCAL_INCLUDES = NO
+
+# If the INLINE_INFO tag is set to YES then a tag [inline] is inserted in the
+# documentation for inline members.
+# The default value is: YES.
+
+INLINE_INFO = YES
+
+# If the SORT_MEMBER_DOCS tag is set to YES then doxygen will sort the
+# (detailed) documentation of file and class members alphabetically by member
+# name. If set to NO, the members will appear in declaration order.
+# The default value is: YES.
+
+SORT_MEMBER_DOCS = YES
+
+# If the SORT_BRIEF_DOCS tag is set to YES then doxygen will sort the brief
+# descriptions of file, namespace and class members alphabetically by member
+# name. If set to NO, the members will appear in declaration order. Note that
+# this will also influence the order of the classes in the class list.
+# The default value is: NO.
+
+SORT_BRIEF_DOCS = NO
+
+# If the SORT_MEMBERS_CTORS_1ST tag is set to YES then doxygen will sort the
+# (brief and detailed) documentation of class members so that constructors and
+# destructors are listed first. If set to NO the constructors will appear in the
+# respective orders defined by SORT_BRIEF_DOCS and SORT_MEMBER_DOCS.
+# Note: If SORT_BRIEF_DOCS is set to NO this option is ignored for sorting brief
+# member documentation.
+# Note: If SORT_MEMBER_DOCS is set to NO this option is ignored for sorting
+# detailed member documentation.
+# The default value is: NO.
+
+SORT_MEMBERS_CTORS_1ST = NO
+
+# If the SORT_GROUP_NAMES tag is set to YES then doxygen will sort the hierarchy
+# of group names into alphabetical order. If set to NO the group names will
+# appear in their defined order.
+# The default value is: NO.
+
+SORT_GROUP_NAMES = NO
+
+# If the SORT_BY_SCOPE_NAME tag is set to YES, the class list will be sorted by
+# fully-qualified names, including namespaces. If set to NO, the class list will
+# be sorted only by class name, not including the namespace part.
+# Note: This option is not very useful if HIDE_SCOPE_NAMES is set to YES.
+# Note: This option applies only to the class list, not to the alphabetical
+# list.
+# The default value is: NO.
+
+SORT_BY_SCOPE_NAME = YES
+
+# If the STRICT_PROTO_MATCHING option is enabled and doxygen fails to do proper
+# type resolution of all parameters of a function it will reject a match between
+# the prototype and the implementation of a member function even if there is
+# only one candidate or it is obvious which candidate to choose by doing a
+# simple string match. By disabling STRICT_PROTO_MATCHING doxygen will still
+# accept a match between prototype and implementation in such cases.
+# The default value is: NO.
+
+STRICT_PROTO_MATCHING = NO
+
+# The GENERATE_TODOLIST tag can be used to enable (YES) or disable (NO) the todo
+# list. This list is created by putting \todo commands in the documentation.
+# The default value is: YES.
+
+GENERATE_TODOLIST = YES
+
+# The GENERATE_TESTLIST tag can be used to enable (YES) or disable (NO) the test
+# list. This list is created by putting \test commands in the documentation.
+# The default value is: YES.
+
+GENERATE_TESTLIST = YES
+
+# The GENERATE_BUGLIST tag can be used to enable (YES) or disable (NO) the bug
+# list. This list is created by putting \bug commands in the documentation.
+# The default value is: YES.
+
+GENERATE_BUGLIST = YES
+
+# The GENERATE_DEPRECATEDLIST tag can be used to enable (YES) or disable (NO)
+# the deprecated list. This list is created by putting \deprecated commands in
+# the documentation.
+# The default value is: YES.
+
+GENERATE_DEPRECATEDLIST= YES
+
+# The ENABLED_SECTIONS tag can be used to enable conditional documentation
+# sections, marked by \if ... \endif and \cond
+# ... \endcond blocks.
+
+ENABLED_SECTIONS =
+
+# The MAX_INITIALIZER_LINES tag determines the maximum number of lines that the
+# initial value of a variable or macro / define can have for it to appear in the
+# documentation. If the initializer consists of more lines than specified here
+# it will be hidden. Use a value of 0 to hide initializers completely. The
+# appearance of the value of individual variables and macros / defines can be
+# controlled using \showinitializer or \hideinitializer command in the
+# documentation regardless of this setting.
+# Minimum value: 0, maximum value: 10000, default value: 30.
+
+MAX_INITIALIZER_LINES = 30
+
+# Set the SHOW_USED_FILES tag to NO to disable the list of files generated at
+# the bottom of the documentation of classes and structs. If set to YES, the
+# list will mention the files that were used to generate the documentation.
+# The default value is: YES.
+
+SHOW_USED_FILES = YES
+
+# Set the SHOW_FILES tag to NO to disable the generation of the Files page. This
+# will remove the Files entry from the Quick Index and from the Folder Tree View
+# (if specified).
+# The default value is: YES.
+
+SHOW_FILES = YES
+
+# Set the SHOW_NAMESPACES tag to NO to disable the generation of the Namespaces
+# page. This will remove the Namespaces entry from the Quick Index and from the
+# Folder Tree View (if specified).
+# The default value is: YES.
+
+SHOW_NAMESPACES = YES
+
+# The FILE_VERSION_FILTER tag can be used to specify a program or script that
+# doxygen should invoke to get the current version for each file (typically from
+# the version control system). Doxygen will invoke the program by executing (via
+# popen()) the command <command> <input-file>, where <command> is the value of
+# the FILE_VERSION_FILTER tag, and <input-file> is the name of an input file
+# provided
+# by doxygen. Whatever the program writes to standard output is used as the file
+# version. For an example see the documentation.
+
+FILE_VERSION_FILTER =
+
+# The LAYOUT_FILE tag can be used to specify a layout file which will be parsed
+# by doxygen. The layout file controls the global structure of the generated
+# output files in an output format independent way. To create the layout file
+# that represents doxygen's defaults, run doxygen with the -l option. You can
+# optionally specify a file name after the option, if omitted DoxygenLayout.xml
+# will be used as the name of the layout file.
+#
+# Note that if you run doxygen from a directory containing a file called
+# DoxygenLayout.xml, doxygen will parse it automatically even if the LAYOUT_FILE
+# tag is left empty.
+
+LAYOUT_FILE = @CMAKE_CURRENT_BINARY_DIR@/layout.xml
+
+# The CITE_BIB_FILES tag can be used to specify one or more bib files containing
+# the reference definitions. This must be a list of .bib files. The .bib
+# extension is automatically appended if omitted. This requires the bibtex tool
+# to be installed. See also https://en.wikipedia.org/wiki/BibTeX for more info.
+# For LaTeX the style of the bibliography can be controlled using
+# LATEX_BIB_STYLE. To use this feature you need bibtex and perl available in the
+# search path. See also \cite for info how to create references.
+
+CITE_BIB_FILES =
+
+#---------------------------------------------------------------------------
+# Configuration options related to warning and progress messages
+#---------------------------------------------------------------------------
+
+# The QUIET tag can be used to turn on/off the messages that are generated to
+# standard output by doxygen. If QUIET is set to YES this implies that the
+# messages are off.
+# The default value is: NO.
+
+QUIET = NO
+
+# The WARNINGS tag can be used to turn on/off the warning messages that are
+# generated to standard error (stderr) by doxygen. If WARNINGS is set to YES
+# this implies that the warnings are on.
+#
+# Tip: Turn warnings on while writing the documentation.
+# The default value is: YES.
+
+WARNINGS = YES
+
+# If the WARN_IF_UNDOCUMENTED tag is set to YES then doxygen will generate
+# warnings for undocumented members. If EXTRACT_ALL is set to YES then this flag
+# will automatically be disabled.
+# The default value is: YES.
+
+WARN_IF_UNDOCUMENTED = NO
+
+# If the WARN_IF_DOC_ERROR tag is set to YES, doxygen will generate warnings for
+# potential errors in the documentation, such as not documenting some parameters
+# in a documented function, or documenting parameters that don't exist or using
+# markup commands wrongly.
+# The default value is: YES.
+
+WARN_IF_DOC_ERROR = YES
+
+# This WARN_NO_PARAMDOC option can be enabled to get warnings for functions that
+# are documented, but have no documentation for their parameters or return
+# value. If set to NO, doxygen will only warn about wrong or incomplete
+# parameter documentation, but not about the absence of documentation. If
+# EXTRACT_ALL is set to YES then this flag will automatically be disabled.
+# The default value is: NO.
+
+WARN_NO_PARAMDOC = NO
+
+# If the WARN_AS_ERROR tag is set to YES then doxygen will immediately stop when
+# a warning is encountered.
+# The default value is: NO.
+
+WARN_AS_ERROR = NO
+
+# The WARN_FORMAT tag determines the format of the warning messages that doxygen
+# can produce. The string should contain the $file, $line, and $text tags, which
+# will be replaced by the file and line number from which the warning originated
+# and the warning text. Optionally the format may contain $version, which will
+# be replaced by the version of the file (if it could be obtained via
+# FILE_VERSION_FILTER)
+# The default value is: $file:$line: $text.
+
+WARN_FORMAT = "$file:$line: $text"
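+
+# As an illustration (adapt to your tooling), an IDE that expects Visual Studio
+# style messages could use a format such as:
+# WARN_FORMAT = "$file($line): $text"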
+
+# The WARN_LOGFILE tag can be used to specify a file to which warning and error
+# messages should be written. If left blank the output is written to standard
+# error (stderr).
+
+WARN_LOGFILE =
+
+#---------------------------------------------------------------------------
+# Configuration options related to the input files
+#---------------------------------------------------------------------------
+
+# The INPUT tag is used to specify the files and/or directories that contain
+# documented source files. You may enter file names like myfile.cpp or
+# directories like /usr/src/myproject. Separate the files or directories with
+# spaces. See also FILE_PATTERNS and EXTENSION_MAPPING
+# Note: If this tag is empty the current directory is searched.
+
+INPUT = @DOXY_DOC_INPUT_ROOT_DIRS@
+
+# This tag can be used to specify the character encoding of the source files
+# that doxygen parses. Internally doxygen uses the UTF-8 encoding. Doxygen uses
+# libiconv (or the iconv built into libc) for the transcoding. See the libiconv
+# documentation (see: https://www.gnu.org/software/libiconv/) for the list of
+# possible encodings.
+# The default value is: UTF-8.
+
+INPUT_ENCODING = UTF-8
+
+# If the value of the INPUT tag contains directories, you can use the
+# FILE_PATTERNS tag to specify one or more wildcard patterns (like *.cpp and
+# *.h) to filter out the source-files in the directories.
+#
+# Note that for custom extensions or not directly supported extensions you also
+# need to set EXTENSION_MAPPING for the extension otherwise the files are not
+# read by doxygen.
+#
+# If left blank the following patterns are tested: *.c, *.cc, *.cxx, *.cpp,
+# *.c++, *.java, *.ii, *.ixx, *.ipp, *.i++, *.inl, *.idl, *.ddl, *.odl, *.h,
+# *.hh, *.hxx, *.hpp, *.h++, *.cs, *.d, *.php, *.php4, *.php5, *.phtml, *.inc,
+# *.m, *.markdown, *.md, *.mm, *.dox (to be provided as doxygen C comment),
+# *.doc (to be provided as doxygen C comment), *.txt (to be provided as doxygen
+# C comment), *.py, *.pyw, *.f90, *.f95, *.f03, *.f08, *.f18, *.f, *.for, *.vhd,
+# *.vhdl, *.ucf, *.qsf and *.ice.
+
+FILE_PATTERNS = *.h \
+ *.hh \
+ *.hpp \
+ *.hxx \
+ *.cpp \
+ *.cxx \
+ *.cc \
+ *.fp \
+ *.vp \
+ *.gp \
+ *.vs \
+ *.fs \
+ *.gs \
+ *.vert \
+ *.frag \
+ *.geom \
+ *.md \
+ *.dox \
+ *.py
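+
+# Note: the shader extensions above (e.g. .vert, .frag, .geom, .fp, .vp, .gp)
+# are not among doxygen's built-in file types, so per the note above they would
+# typically need an EXTENSION_MAPPING entry to be parsed at all, for example (a
+# sketch, assuming the shader sources use C-like syntax):
+# EXTENSION_MAPPING = vert=C++ frag=C++ geom=C++ fp=C++ vp=C++ gp=C++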
+
+# The RECURSIVE tag can be used to specify whether or not subdirectories should
+# be searched for input files as well.
+# The default value is: NO.
+
+RECURSIVE = YES
+
+# The EXCLUDE tag can be used to specify files and/or directories that should be
+# excluded from the INPUT source files. This way you can easily exclude a
+# subdirectory from a directory tree whose root is specified with the INPUT tag.
+#
+# Note that relative paths are relative to the directory from which doxygen is
+# run.
+
+EXCLUDE = @DOXY_DOC_EXCLUDE_PATTERNS_DIRS@
+
+# The EXCLUDE_SYMLINKS tag can be used to select whether or not files or
+# directories that are symbolic links (a Unix file system feature) are excluded
+# from the input.
+# The default value is: NO.
+
+EXCLUDE_SYMLINKS = NO
+
+# If the value of the INPUT tag contains directories, you can use the
+# EXCLUDE_PATTERNS tag to specify one or more wildcard patterns to exclude
+# certain files from those directories.
+#
+# Note that the wildcards are matched against the file with absolute path, so to
+# exclude all test directories for example use the pattern */test/*
+
+EXCLUDE_PATTERNS =
+
+# The EXCLUDE_SYMBOLS tag can be used to specify one or more symbol names
+# (namespaces, classes, functions, etc.) that should be excluded from the
+# output. The symbol name can be a fully qualified name, a word, or if the
+# wildcard * is used, a substring. Examples: ANamespace, AClass,
+# AClass::ANamespace, ANamespace::*Test
+#
+# Note that the wildcards are matched against the file with absolute path, so to
+# exclude all test directories use the pattern */test/*
+
+EXCLUDE_SYMBOLS =
+
+# The EXAMPLE_PATH tag can be used to specify one or more files or directories
+# that contain example code fragments that are included (see the \include
+# command).
+
+EXAMPLE_PATH =
+
+# If the value of the EXAMPLE_PATH tag contains directories, you can use the
+# EXAMPLE_PATTERNS tag to specify one or more wildcard pattern (like *.cpp and
+# *.h) to filter out the source-files in the directories. If left blank all
+# files are included.
+
+EXAMPLE_PATTERNS =
+
+# If the EXAMPLE_RECURSIVE tag is set to YES then subdirectories will be
+# searched for input files to be used with the \include or \dontinclude commands
+# irrespective of the value of the RECURSIVE tag.
+# The default value is: NO.
+
+EXAMPLE_RECURSIVE = NO
+
+# The IMAGE_PATH tag can be used to specify one or more files or directories
+# that contain images that are to be included in the documentation (see the
+# \image command).
+
+IMAGE_PATH = @DOXY_DOC_COMMON_IMG_PATH@ \
+ @DOXY_APP_SPECIFIC_IMG_PATH@
+
+# The INPUT_FILTER tag can be used to specify a program that doxygen should
+# invoke to filter for each input file. Doxygen will invoke the filter program
+# by executing (via popen()) the command:
+#
+# <filter> <input-file>
+#
+# where <filter> is the value of the INPUT_FILTER tag, and <input-file> is the
+# name of an input file. Doxygen will then use the output that the filter
+# program writes to standard output. If FILTER_PATTERNS is specified, this tag
+# will be ignored.
+#
+# Note that the filter must not add or remove lines; it is applied before the
+# code is scanned, but not when the output code is generated. If lines are added
+# or removed, the anchors will not be placed correctly.
+#
+# Note that for custom extensions or not directly supported extensions you also
+# need to set EXTENSION_MAPPING for the extension otherwise the files are not
+# properly processed by doxygen.
+
+INPUT_FILTER =
+
+# The FILTER_PATTERNS tag can be used to specify filters on a per file pattern
+# basis. Doxygen will compare the file name with each pattern and apply the
+# filter if there is a match. The filters are a list of the form: pattern=filter
+# (like *.cpp=my_cpp_filter). See INPUT_FILTER for further information on how
+# filters are used. If the FILTER_PATTERNS tag is empty or if none of the
+# patterns match the file name, INPUT_FILTER is applied.
+#
+# Note that for custom extensions or not directly supported extensions you also
+# need to set EXTENSION_MAPPING for the extension otherwise the files are not
+# properly processed by doxygen.
+
+FILTER_PATTERNS =
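+
+# For instance (hypothetical filter name), to pipe Python sources through a
+# custom filter script while leaving all other files untouched:
+# FILTER_PATTERNS = *.py=my_py_filter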
+
+# If the FILTER_SOURCE_FILES tag is set to YES, the input filter (if set using
+# INPUT_FILTER) will also be used to filter the input files that are used for
+# producing the source files to browse (i.e. when SOURCE_BROWSER is set to YES).
+# The default value is: NO.
+
+FILTER_SOURCE_FILES = NO
+
+# The FILTER_SOURCE_PATTERNS tag can be used to specify source filters per file
+# pattern. A pattern will override the setting for FILTER_PATTERN (if any) and
+# it is also possible to disable source filtering for a specific pattern using
+# *.ext= (so without naming a filter).
+# This tag requires that the tag FILTER_SOURCE_FILES is set to YES.
+
+FILTER_SOURCE_PATTERNS =
+
+# If the USE_MDFILE_AS_MAINPAGE tag refers to the name of a markdown file that
+# is part of the input, its contents will be placed on the main page
+# (index.html). This can be useful if you have a project on for instance GitHub
+# and want to reuse the introduction page also for the doxygen output.
+
+USE_MDFILE_AS_MAINPAGE =
+
+#---------------------------------------------------------------------------
+# Configuration options related to source browsing
+#---------------------------------------------------------------------------
+
+# If the SOURCE_BROWSER tag is set to YES then a list of source files will be
+# generated. Documented entities will be cross-referenced with these sources.
+#
+# Note: To get rid of all source code in the generated output, make sure that
+# also VERBATIM_HEADERS is set to NO.
+# The default value is: NO.
+
+SOURCE_BROWSER = YES
+
+# Setting the INLINE_SOURCES tag to YES will include the body of functions,
+# classes and enums directly into the documentation.
+# The default value is: NO.
+
+INLINE_SOURCES = NO
+
+# Setting the STRIP_CODE_COMMENTS tag to YES will instruct doxygen to hide any
+# special comment blocks from generated source code fragments. Normal C, C++ and
+# Fortran comments will always remain visible.
+# The default value is: YES.
+
+STRIP_CODE_COMMENTS = YES
+
+# If the REFERENCED_BY_RELATION tag is set to YES then for each documented
+# entity all documented functions referencing it will be listed.
+# The default value is: NO.
+
+REFERENCED_BY_RELATION = YES
+
+# If the REFERENCES_RELATION tag is set to YES then for each documented function
+# all documented entities called/used by that function will be listed.
+# The default value is: NO.
+
+REFERENCES_RELATION = YES
+
+# If the REFERENCES_LINK_SOURCE tag is set to YES and SOURCE_BROWSER tag is set
+# to YES then the hyperlinks from functions in REFERENCES_RELATION and
+# REFERENCED_BY_RELATION lists will link to the source code. Otherwise they will
+# link to the documentation.
+# The default value is: YES.
+
+REFERENCES_LINK_SOURCE = YES
+
+# If SOURCE_TOOLTIPS is enabled (the default) then hovering a hyperlink in the
+# source code will show a tooltip with additional information such as prototype,
+# brief description and links to the definition and documentation. Since this
+# will make the HTML file larger and loading of large files a bit slower, you
+# can opt to disable this feature.
+# The default value is: YES.
+# This tag requires that the tag SOURCE_BROWSER is set to YES.
+
+SOURCE_TOOLTIPS = YES
+
+# If the USE_HTAGS tag is set to YES then the references to source code will
+# point to the HTML generated by the htags(1) tool instead of doxygen built-in
+# source browser. The htags tool is part of GNU's global source tagging system
+# (see https://www.gnu.org/software/global/global.html). You will need version
+# 4.8.6 or higher.
+#
+# To use it do the following:
+# - Install the latest version of global
+# - Enable SOURCE_BROWSER and USE_HTAGS in the configuration file
+# - Make sure the INPUT points to the root of the source tree
+# - Run doxygen as normal
+#
+# Doxygen will invoke htags (and that will in turn invoke gtags), so these
+# tools must be available from the command line (i.e. in the search path).
+#
+# The result: instead of the source browser generated by doxygen, the links to
+# source code will now point to the output of htags.
+# The default value is: NO.
+# This tag requires that the tag SOURCE_BROWSER is set to YES.
+
+USE_HTAGS = NO
+
+# If the VERBATIM_HEADERS tag is set to YES then doxygen will generate a
+# verbatim copy of the header file for each class for which an include is
+# specified. Set to NO to disable this.
+# See also: Section \class.
+# The default value is: YES.
+
+VERBATIM_HEADERS = YES
+
+# If the CLANG_ASSISTED_PARSING tag is set to YES then doxygen will use the
+# clang parser (see: http://clang.llvm.org/) for more accurate parsing at the
+# cost of reduced performance. This can be particularly helpful with template
+# rich C++ code for which doxygen's built-in parser lacks the necessary type
+# information.
+# Note: The availability of this option depends on whether or not doxygen was
+# generated with the -Duse_libclang=ON option for CMake.
+# The default value is: NO.
+
+CLANG_ASSISTED_PARSING = NO
+
+# If clang assisted parsing is enabled you can provide the compiler with command
+# line options that you would normally use when invoking the compiler. Note that
+# the include paths will already be set by doxygen for the files and directories
+# specified with INPUT and INCLUDE_PATH.
+# This tag requires that the tag CLANG_ASSISTED_PARSING is set to YES.
+
+CLANG_OPTIONS =
+
+# If clang assisted parsing is enabled you can provide the clang parser with the
+# path to the directory containing a file called compile_commands.json. This
+# file is the compilation database (see:
+# http://clang.llvm.org/docs/HowToSetupToolingForLLVM.html) containing the
+# options used when the source files were built. This is equivalent to
+# specifying the "-p" option to a clang tool, such as clang-check. These options
+# will then be passed to the parser. Any options specified with CLANG_OPTIONS
+# will be added as well.
+# Note: The availability of this option depends on whether or not doxygen was
+# generated with the -Duse_libclang=ON option for CMake.
+
+CLANG_DATABASE_PATH =
+
+#---------------------------------------------------------------------------
+# Configuration options related to the alphabetical class index
+#---------------------------------------------------------------------------
+
+# If the ALPHABETICAL_INDEX tag is set to YES, an alphabetical index of all
+# compounds will be generated. Enable this if the project contains a lot of
+# classes, structs, unions or interfaces.
+# The default value is: YES.
+
+ALPHABETICAL_INDEX = NO
+
+# The COLS_IN_ALPHA_INDEX tag can be used to specify the number of columns in
+# which the alphabetical index list will be split.
+# Minimum value: 1, maximum value: 20, default value: 5.
+# This tag requires that the tag ALPHABETICAL_INDEX is set to YES.
+
+COLS_IN_ALPHA_INDEX = 5
+
+# In case all classes in a project start with a common prefix, all classes will
+# be put under the same header in the alphabetical index. The IGNORE_PREFIX tag
+# can be used to specify a prefix (or a list of prefixes) that should be ignored
+# while generating the index headers.
+# This tag requires that the tag ALPHABETICAL_INDEX is set to YES.
+
+IGNORE_PREFIX =
+
+#---------------------------------------------------------------------------
+# Configuration options related to the HTML output
+#---------------------------------------------------------------------------
+
+# If the GENERATE_HTML tag is set to YES, doxygen will generate HTML output
+# The default value is: YES.
+
+GENERATE_HTML = YES
+
+# The HTML_OUTPUT tag is used to specify where the HTML docs will be put. If a
+# relative path is entered the value of OUTPUT_DIRECTORY will be put in front of
+# it.
+# The default directory is: html.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_OUTPUT = .
+
+# The HTML_FILE_EXTENSION tag can be used to specify the file extension for each
+# generated HTML page (for example: .htm, .php, .asp).
+# The default value is: .html.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_FILE_EXTENSION = .html
+
+# The HTML_HEADER tag can be used to specify a user-defined HTML header file for
+# each generated HTML page. If the tag is left blank doxygen will generate a
+# standard header.
+#
+# To get valid HTML, the header file must include any scripts and style sheets
+# that doxygen needs, which depend on the configuration options used (e.g. the
+# setting GENERATE_TREEVIEW). It is highly recommended to start with a
+# default header using
+# doxygen -w html new_header.html new_footer.html new_stylesheet.css
+# YourConfigFile
+# and then modify the file new_header.html. See also section "Doxygen usage"
+# for information on how to generate the default header that doxygen normally
+# uses.
+# Note: The header is subject to change so you typically have to regenerate the
+# default header when upgrading to a newer version of doxygen. For a description
+# of the possible markers and block names see the documentation.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_HEADER =
+
+# The HTML_FOOTER tag can be used to specify a user-defined HTML footer for each
+# generated HTML page. If the tag is left blank doxygen will generate a standard
+# footer. See HTML_HEADER for more information on how to generate a default
+# footer and what special commands can be used inside the footer. See also
+# section "Doxygen usage" for information on how to generate the default footer
+# that doxygen normally uses.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_FOOTER =
+
+# The HTML_STYLESHEET tag can be used to specify a user-defined cascading style
+# sheet that is used by each HTML page. It can be used to fine-tune the look of
+# the HTML output. If left blank doxygen will generate a default style sheet.
+# See also section "Doxygen usage" for information on how to generate the style
+# sheet that doxygen normally uses.
+# Note: It is recommended to use HTML_EXTRA_STYLESHEET instead of this tag, as
+# it is more robust and this tag (HTML_STYLESHEET) will in the future become
+# obsolete.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_STYLESHEET =
+
+# The HTML_EXTRA_STYLESHEET tag can be used to specify additional user-defined
+# cascading style sheets that are included after the standard style sheets
+# created by doxygen. Using this option one can overrule certain style aspects.
+# This is preferred over using HTML_STYLESHEET since it does not replace the
+# standard style sheet and is therefore more robust against future updates.
+# Doxygen will copy the style sheet files to the output directory.
+# Note: The order of the extra style sheet files is of importance (e.g. the last
+# style sheet in the list overrules the setting of the previous ones in the
+# list). For an example see the documentation.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_EXTRA_STYLESHEET =
+
+# The HTML_EXTRA_FILES tag can be used to specify one or more extra images or
+# other source files which should be copied to the HTML output directory. Note
+# that these files will be copied to the base HTML output directory. Use the
+# $relpath^ marker in the HTML_HEADER and/or HTML_FOOTER files to load these
+# files. In the HTML_STYLESHEET file, use the file name only. Also note that the
+# files will be copied as-is; there are no commands or markers available.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_EXTRA_FILES =
+
+# The HTML_COLORSTYLE_HUE tag controls the color of the HTML output. Doxygen
+# will adjust the colors in the style sheet and background images according to
+# this color. Hue is specified as an angle on a colorwheel, see
+# https://en.wikipedia.org/wiki/Hue for more information. For instance the value
+# 0 represents red, 60 is yellow, 120 is green, 180 is cyan, 240 is blue, 300
+# purple, and 360 is red again.
+# Minimum value: 0, maximum value: 359, default value: 220.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_COLORSTYLE_HUE = 220
+
+# The HTML_COLORSTYLE_SAT tag controls the purity (or saturation) of the colors
+# in the HTML output. For a value of 0 the output will use grayscales only. A
+# value of 255 will produce the most vivid colors.
+# Minimum value: 0, maximum value: 255, default value: 100.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_COLORSTYLE_SAT = 100
+
+# The HTML_COLORSTYLE_GAMMA tag controls the gamma correction applied to the
+# luminance component of the colors in the HTML output. Values below 100
+# gradually make the output lighter, whereas values above 100 make the output
+# darker. The value divided by 100 is the actual gamma applied, so 80 represents
+# a gamma of 0.8. The value 220 represents a gamma of 2.2, and 100 does not
+# change the gamma.
+# Minimum value: 40, maximum value: 240, default value: 80.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_COLORSTYLE_GAMMA = 80
+
+# If the HTML_TIMESTAMP tag is set to YES then the footer of each generated HTML
+# page will contain the date and time when the page was generated. Setting this
+# to YES can help to show when doxygen was last run and thus if the
+# documentation is up to date.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_TIMESTAMP = NO
+
+# If the HTML_DYNAMIC_MENUS tag is set to YES then the generated HTML
+# documentation will contain a main index with vertical navigation menus that
+# are dynamically created via JavaScript. If disabled, the navigation index will
+# consist of multiple levels of tabs that are statically embedded in every HTML
+# page. Disable this option to support browsers that do not have JavaScript,
+# like the Qt help browser.
+# The default value is: YES.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_DYNAMIC_MENUS = YES
+
+# If the HTML_DYNAMIC_SECTIONS tag is set to YES then the generated HTML
+# documentation will contain sections that can be hidden and shown after the
+# page has loaded.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_DYNAMIC_SECTIONS = NO
+
+# With HTML_INDEX_NUM_ENTRIES one can control the preferred number of entries
+# shown in the various tree structured indices initially; the user can expand
+# and collapse entries dynamically later on. Doxygen will expand the tree to
+# such a level that at most the specified number of entries are visible (unless
+# a fully collapsed tree already exceeds this amount). So setting the number of
+# entries to 1 will produce a fully collapsed tree by default. 0 is a special
+# value representing an infinite number of entries and will result in a fully
+# expanded tree by default.
+# Minimum value: 0, maximum value: 9999, default value: 100.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_INDEX_NUM_ENTRIES = 100
+
+# If the GENERATE_DOCSET tag is set to YES, additional index files will be
+# generated that can be used as input for Apple's Xcode 3 integrated development
+# environment (see: https://developer.apple.com/xcode/), introduced with OSX
+# 10.5 (Leopard). To create a documentation set, doxygen will generate a
+# Makefile in the HTML output directory. Running make will produce the docset in
+# that directory and running make install will install the docset in
+# ~/Library/Developer/Shared/Documentation/DocSets so that Xcode will find it at
+# startup. See https://developer.apple.com/library/archive/featuredarticles/Doxy
+# genXcode/_index.html for more information.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+GENERATE_DOCSET = NO
+
+# This tag determines the name of the docset feed. A documentation feed provides
+# an umbrella under which multiple documentation sets from a single provider
+# (such as a company or product suite) can be grouped.
+# The default value is: Doxygen generated docs.
+# This tag requires that the tag GENERATE_DOCSET is set to YES.
+
+DOCSET_FEEDNAME = "Doxygen generated docs"
+
+# This tag specifies a string that should uniquely identify the documentation
+# set bundle. This should be a reverse domain-name style string, e.g.
+# com.mycompany.MyDocSet. Doxygen will append .docset to the name.
+# The default value is: org.doxygen.Project.
+# This tag requires that the tag GENERATE_DOCSET is set to YES.
+
+DOCSET_BUNDLE_ID = org.doxygen.Project
+
+# The DOCSET_PUBLISHER_ID tag specifies a string that should uniquely identify
+# the documentation publisher. This should be a reverse domain-name style
+# string, e.g. com.mycompany.MyDocSet.documentation.
+# The default value is: org.doxygen.Publisher.
+# This tag requires that the tag GENERATE_DOCSET is set to YES.
+
+DOCSET_PUBLISHER_ID = org.doxygen.Publisher
+
+# The DOCSET_PUBLISHER_NAME tag identifies the documentation publisher.
+# The default value is: Publisher.
+# This tag requires that the tag GENERATE_DOCSET is set to YES.
+
+DOCSET_PUBLISHER_NAME = Publisher
+
+# If the GENERATE_HTMLHELP tag is set to YES then doxygen generates three
+# additional HTML index files: index.hhp, index.hhc, and index.hhk. The
+# index.hhp is a project file that can be read by Microsoft's HTML Help Workshop
+# (see: https://www.microsoft.com/en-us/download/details.aspx?id=21138) on
+# Windows.
+#
+# The HTML Help Workshop contains a compiler that can convert all HTML output
+# generated by doxygen into a single compiled HTML file (.chm). Compiled HTML
+# files are now used as the Windows 98 help format, and will replace the old
+# Windows help format (.hlp) on all Windows platforms in the future. Compressed
+# HTML files also contain an index, a table of contents, and you can search for
+# words in the documentation. The HTML workshop also contains a viewer for
+# compressed HTML files.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+GENERATE_HTMLHELP = NO
+
+# The CHM_FILE tag can be used to specify the file name of the resulting .chm
+# file. You can add a path in front of the file if the result should not be
+# written to the html output directory.
+# This tag requires that the tag GENERATE_HTMLHELP is set to YES.
+
+CHM_FILE =
+
+# The HHC_LOCATION tag can be used to specify the location (absolute path
+# including file name) of the HTML help compiler (hhc.exe). If non-empty,
+# doxygen will try to run the HTML help compiler on the generated index.hhp.
+# The file has to be specified with full path.
+# This tag requires that the tag GENERATE_HTMLHELP is set to YES.
+
+HHC_LOCATION =
+
+# The GENERATE_CHI flag controls whether a separate .chi index file is generated
+# (YES) or included in the main .chm file (NO).
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTMLHELP is set to YES.
+
+GENERATE_CHI = NO
+
+# The CHM_INDEX_ENCODING tag is used to encode the HtmlHelp index (hhk), content
+# (hhc) and project file content.
+# This tag requires that the tag GENERATE_HTMLHELP is set to YES.
+
+CHM_INDEX_ENCODING =
+
+# The BINARY_TOC flag controls whether a binary table of contents is generated
+# (YES) or a normal table of contents (NO) in the .chm file. Furthermore it
+# enables the Previous and Next buttons.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTMLHELP is set to YES.
+
+BINARY_TOC = NO
+
+# The TOC_EXPAND flag can be set to YES to add extra items for group members to
+# the table of contents of the HTML help documentation and to the tree view.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTMLHELP is set to YES.
+
+TOC_EXPAND = NO
+
+# If the GENERATE_QHP tag is set to YES and both QHP_NAMESPACE and
+# QHP_VIRTUAL_FOLDER are set, an additional index file will be generated that
+# can be used as input for Qt's qhelpgenerator to generate a Qt Compressed Help
+# (.qch) of the generated HTML documentation.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+GENERATE_QHP = NO
+
+# If the QHG_LOCATION tag is specified, the QCH_FILE tag can be used to specify
+# the file name of the resulting .qch file. The path specified is relative to
+# the HTML output folder.
+# This tag requires that the tag GENERATE_QHP is set to YES.
+
+QCH_FILE =
+
+# The QHP_NAMESPACE tag specifies the namespace to use when generating Qt Help
+# Project output. For more information please see Qt Help Project / Namespace
+# (see: https://doc.qt.io/archives/qt-4.8/qthelpproject.html#namespace).
+# The default value is: org.doxygen.Project.
+# This tag requires that the tag GENERATE_QHP is set to YES.
+
+QHP_NAMESPACE = org.doxygen.Project
+
+# The QHP_VIRTUAL_FOLDER tag specifies the namespace to use when generating Qt
+# Help Project output. For more information please see Qt Help Project / Virtual
+# Folders (see: https://doc.qt.io/archives/qt-4.8/qthelpproject.html#virtual-
+# folders).
+# The default value is: doc.
+# This tag requires that the tag GENERATE_QHP is set to YES.
+
+QHP_VIRTUAL_FOLDER = doc
+
+# If the QHP_CUST_FILTER_NAME tag is set, it specifies the name of a custom
+# filter to add. For more information please see Qt Help Project / Custom
+# Filters (see: https://doc.qt.io/archives/qt-4.8/qthelpproject.html#custom-
+# filters).
+# This tag requires that the tag GENERATE_QHP is set to YES.
+
+QHP_CUST_FILTER_NAME =
+
+# The QHP_CUST_FILTER_ATTRS tag specifies the list of the attributes of the
+# custom filter to add. For more information please see Qt Help Project / Custom
+# Filters (see: https://doc.qt.io/archives/qt-4.8/qthelpproject.html#custom-
+# filters).
+# This tag requires that the tag GENERATE_QHP is set to YES.
+
+QHP_CUST_FILTER_ATTRS =
+
+# The QHP_SECT_FILTER_ATTRS tag specifies the list of the attributes this
+# project's filter section matches. Qt Help Project / Filter Attributes (see:
+# https://doc.qt.io/archives/qt-4.8/qthelpproject.html#filter-attributes).
+# This tag requires that the tag GENERATE_QHP is set to YES.
+
+QHP_SECT_FILTER_ATTRS =
+
+# The QHG_LOCATION tag can be used to specify the location of Qt's
+# qhelpgenerator. If non-empty doxygen will try to run qhelpgenerator on the
+# generated .qhp file.
+# This tag requires that the tag GENERATE_QHP is set to YES.
+
+QHG_LOCATION =
+
+# If the GENERATE_ECLIPSEHELP tag is set to YES, additional index files will be
+# generated; together with the HTML files, they form an Eclipse help plugin. To
+# install this plugin and make it available under the help contents menu in
+# Eclipse, the contents of the directory containing the HTML and XML files needs
+# to be copied into the plugins directory of eclipse. The name of the directory
+# within the plugins directory should be the same as the ECLIPSE_DOC_ID value.
+# After copying Eclipse needs to be restarted before the help appears.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+GENERATE_ECLIPSEHELP = NO
+
+# A unique identifier for the Eclipse help plugin. When installing the plugin
+# the directory name containing the HTML and XML files should also have this
+# name. Each documentation set should have its own identifier.
+# The default value is: org.doxygen.Project.
+# This tag requires that the tag GENERATE_ECLIPSEHELP is set to YES.
+
+ECLIPSE_DOC_ID = org.doxygen.Project
+
+# If you want full control over the layout of the generated HTML pages it might
+# be necessary to disable the index and replace it with your own. The
+# DISABLE_INDEX tag can be used to turn on/off the condensed index (tabs) at top
+# of each HTML page. A value of NO enables the index and the value YES disables
+# it. Since the tabs in the index contain the same information as the navigation
+# tree, you can set this option to YES if you also set GENERATE_TREEVIEW to YES.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+DISABLE_INDEX = YES
+
+# The GENERATE_TREEVIEW tag is used to specify whether a tree-like index
+# structure should be generated to display hierarchical information. If the tag
+# value is set to YES, a side panel will be generated containing a tree-like
+# index structure (just like the one that is generated for HTML Help). For this
+# to work a browser that supports JavaScript, DHTML, CSS and frames is required
+# (i.e. any modern browser). Windows users are probably better off using the
+# HTML help feature. Via custom style sheets (see HTML_EXTRA_STYLESHEET) one can
+# further fine-tune the look of the index. As an example, the default style
+# sheet generated by doxygen has an example that shows how to put an image at
+# the root of the tree instead of the PROJECT_NAME. Since the tree basically has
+# the same information as the tab index, you could consider setting
+# DISABLE_INDEX to YES when enabling this option.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+GENERATE_TREEVIEW = YES
+
+# The ENUM_VALUES_PER_LINE tag can be used to set the number of enum values that
+# doxygen will group on one line in the generated HTML documentation.
+#
+# Note that a value of 0 will completely suppress the enum values from appearing
+# in the overview section.
+# Minimum value: 0, maximum value: 20, default value: 4.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+ENUM_VALUES_PER_LINE = 4
+
+# If the treeview is enabled (see GENERATE_TREEVIEW) then this tag can be used
+# to set the initial width (in pixels) of the frame in which the tree is shown.
+# Minimum value: 0, maximum value: 1500, default value: 250.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+TREEVIEW_WIDTH = 250
+
+# If the EXT_LINKS_IN_WINDOW option is set to YES, doxygen will open links to
+# external symbols imported via tag files in a separate window.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+EXT_LINKS_IN_WINDOW = NO
+
+# If the HTML_FORMULA_FORMAT option is set to svg, doxygen will use the pdf2svg
+# tool (see https://github.com/dawbarton/pdf2svg) or inkscape (see
+# https://inkscape.org) to generate formulas as SVG images instead of PNGs for
+# the HTML output. These images will generally look nicer at scaled resolutions.
+# Possible values are: png (the default) and svg (looks nicer but requires the
+# pdf2svg or inkscape tool).
+# The default value is: png.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+HTML_FORMULA_FORMAT = png
+
+# Use this tag to change the font size of LaTeX formulas included as images in
+# the HTML documentation. When you change the font size after a successful
+# doxygen run you need to manually remove any form_*.png images from the HTML
+# output directory to force them to be regenerated.
+# Minimum value: 8, maximum value: 50, default value: 10.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+FORMULA_FONTSIZE = 10
+
+# Use the FORMULA_TRANSPARENT tag to determine whether or not the images
+# generated for formulas are transparent PNGs. Transparent PNGs are not
+# supported properly for IE 6.0, but are supported on all modern browsers.
+#
+# Note that when changing this option you need to delete any form_*.png files in
+# the HTML output directory before the changes have effect.
+# The default value is: YES.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+FORMULA_TRANSPARENT = YES
+
+# The FORMULA_MACROFILE can contain LaTeX \newcommand and \renewcommand commands
+# to create new LaTeX commands to be used in formulas as building blocks. See
+# the section "Including formulas" for details.
+
+FORMULA_MACROFILE =
+
+# Enable the USE_MATHJAX option to render LaTeX formulas using MathJax (see
+# https://www.mathjax.org) which uses client side JavaScript for the rendering
+# instead of using pre-rendered bitmaps. Use this if you do not have LaTeX
+# installed or if you want the formulas to look prettier in the HTML output. When
+# enabled you may also need to install MathJax separately and configure the path
+# to it using the MATHJAX_RELPATH option.
+# The default value is: NO.
+# This tag requires that the tag GENERATE_HTML is set to YES.
+
+USE_MATHJAX = NO
+
+# When MathJax is enabled you can set the default output format to be used for
+# the MathJax output. See the MathJax site (see:
+# http://docs.mathjax.org/en/latest/output.html) for more details.
+# Possible values are: HTML-CSS (which is slower, but has the best
+# compatibility), NativeMML (i.e. MathML) and SVG.
+# The default value is: HTML-CSS.
+# This tag requires that the tag USE_MATHJAX is set to YES.
+
+MATHJAX_FORMAT = HTML-CSS
+
+# When MathJax is enabled you need to specify the location relative to the HTML
+# output directory using the MATHJAX_RELPATH option. The destination directory
+# should contain the MathJax.js script. For instance, if the mathjax directory
+# is located at the same level as the HTML output directory, then
+# MATHJAX_RELPATH should be ../mathjax. The default value points to the MathJax
+# Content Delivery Network so you can quickly see the result without installing
+# MathJax. However, it is strongly recommended to install a local copy of
+# MathJax from https://www.mathjax.org before deployment.
+# The default value is: https://cdn.jsdelivr.net/npm/mathjax@2.
+# This tag requires that the tag USE_MATHJAX is set to YES.
+
+MATHJAX_RELPATH = https://cdn.jsdelivr.net/npm/mathjax@2
+
+# The MATHJAX_EXTENSIONS tag can be used to specify one or more MathJax
+# extension names that should be enabled during MathJax rendering. For example
+# MATHJAX_EXTENSIONS = TeX/AMSmath TeX/AMSsymbols
+# This tag requires that the tag USE_MATHJAX is set to YES.
+
+MATHJAX_EXTENSIONS =
+
+# The MATHJAX_CODEFILE tag can be used to specify a file with javascript pieces
+# of code that will be used on startup of the MathJax code. See the MathJax site
+# (see: http://docs.mathjax.org/en/latest/output.html) for more details. For an
+# example see the documentation.
+# This tag requires that the tag USE_MATHJAX is set to YES.
+
+MATHJAX_CODEFILE =
+
+# When the SEARCHENGINE tag is enabled doxygen will generate a search box for
+# the HTML output. The underlying search engine uses javascript and DHTML and
+# should work on any modern browser. Note that when using HTML help
+# (GENERATE_HTMLHELP), Qt help (GENERATE_QHP), or docsets (GENERATE_DOCSET)
+# there is already a search function so this one should typically be disabled.
+# For large projects the javascript based search engine can be slow, then
+# enabling SERVER_BASED_SEARCH may provide a better solution. It is possible to
+# search using the keyboard; to jump to the search box use <access key> + S
+# (what the <access key> is depends on the OS and browser, but it is typically
+# <CTRL>, <ALT>/<option>, or both).