
Commit

docs: README and docs/index revision (#185)
* separate out the goal and specific current dev for Echoview

* small wording changes
leewujung authored Mar 27, 2024
1 parent 95cb3e1 commit de7a650
Showing 2 changed files with 25 additions and 17 deletions.
22 changes: 13 additions & 9 deletions README.md
@@ -5,34 +5,38 @@

![example workflow](https://github.com/OSOceanAcoustics/echoregions/actions/workflows/pytest.yml/badge.svg)

Echoregions is a tool that interfaces annotations from Echoview and masks for water column sonar data for Machine Learning (ML) developments. Manual annotations from Echoview are widely used in fisheries acoustics community for labeling the presence of different animal species, and the presence of bottoms on echograms. Echoregions is designed to be used as an intermediate software between Echoview annotation data products and conventional Python Machine Learning data products. The end goal for Echoregions is to allow the user to easily go from Echoview -> ML data products, and ML -> Echoview data products. Presently, the Echoview -> ML data products pipeline has been built.
Echoregions is a tool that interfaces annotations of water column sonar data with Machine Learning (ML) models.

The annotations are typically regions indicating the presence of specific animal species or lines delineating ocean boundaries, such as the seafloor or sea surface, in the "echogram" (sonar images formed by echo returns). The interfacing functionalities operate in two directions:
- Annotation to ML: Parsing and organizing manual annotations for preparing training and test datasets for ML developments
- ML to Annotation: Generating annotations from ML predictions that can be used for further downstream processing

At present, functionalities in the Annotation to ML direction have been built for interfacing manual annotations from the Echoview software, which is widely used in the fisheries acoustics community. We plan to add functionalities in the ML to Annotation direction in the near future.

## Functionalities

As of now, Echoregions contains functions to:
- Read, organize, and store Echoview manual annotations
- Read, organize, and store Echoview manual annotations (regions and lines)
- Create masks by combining the manual annotations and xarray water column sonar datasets generated by [Echopype](https://github.com/OSOceanAcoustics/echopype)

We plan to add additional functions to build the ML -> Echoview data products pipeline. This will allow the user to create bottom and region annotations from ML predictions and convert to a format that can be easily visualized and manipulated in Echoview.

The underlying annotation data is stored as a Pandas dataframe, which allows users to leverage the powerful indexing and computational tools provided by Pandas.
Note that in Echoregions, the underlying annotation data is stored as a Pandas dataframe, which allows users to directly leverage the powerful indexing and computing functionalities provided by Pandas.
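
To make the Annotation-to-ML workflow above concrete, here is a minimal sketch (illustrative only, not part of this commit). It assumes the top-level readers `read_evr`/`read_evl` and the `Regions2D.mask` method described in the echoregions documentation; the exact signatures and the file paths are placeholders and may differ by version.

```python
# Minimal sketch: parse Echoview annotations and build a mask on an
# Echopype-generated Sv dataset. Function/method names follow the
# echoregions docs; file paths are hypothetical.
import echoregions as er
import xarray as xr

# Parse Echoview region (.evr) and line (.evl) annotation files
regions = er.read_evr("transect_regions.evr")  # hypothetical path
lines = er.read_evl("transect_bottom.evl")     # hypothetical path

# Open a calibrated Sv dataset previously produced by echopype
ds_Sv = xr.open_dataset("transect_Sv.nc")      # hypothetical path

# Combine the region annotations with the Sv grid to produce a mask
# suitable for assembling ML training/test datasets
mask = regions.mask(ds_Sv["Sv"])
```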

## Documentation

Learn more about Echoregions functions in the documentation at https://echoregions.readthedocs.io.

See the [API documentation](https://echoregions.readthedocs.io/en/latest/api.html) for all of the classes and functions available in echoregions.

## Contributors

Echoregions development is currently led by Caesar Tuguinay (@ctuguinay), with inputs from Wu-Jung Lee (@leewujung) and Valentina Staneva (@valentina-s). Kavin Nguyen (@ngkavin) contributed significantly to the initial version.

## Acknowledgement

We thank the NOAA Northwest Fisheries Science Center (NWFSC) Fisheries Engineering and Acoustics Team (FEAT) for supporting this project.

<img src="docs/source/images/noaa_fisheries_logo.png" alt="NOAA_fisheries_logo" width="200">

## Contributors

Echoregions development is currently led by Caesar Tuguinay (@ctuguinay), with inputs from Wu-Jung Lee (@leewujung) and Valentina Staneva (@valentina-s). Kavin Nguyen (@ngkavin) contributed significantly to the initial version.

## License

Echoregions is licensed under the open source [Apache 2.0 license](https://opensource.org/licenses/Apache-2.0).
20 changes: 12 additions & 8 deletions docs/source/index.md
@@ -1,16 +1,24 @@
# Echoregions

Echoregions is a tool that interfaces annotations from Echoview and masks for water column sonar data for Machine Learning (ML) developments. Manual annotations from Echoview are widely used in fisheries acoustics community for labeling the presence of different animal species, and the presence of bottoms on echograms. Echoregions is designed to be used as an intermediate software between Echoview annotation data products and conventional Python Machine Learning data products. The end goal for Echoregions is to allow the user to easily go from Echoview -> ML data products, and ML -> Echoview data products. Presently, the Echoview -> ML data products pipeline has been built.
Echoregions is a tool that interfaces annotations of water column sonar data with Machine Learning (ML) models.

The annotations are typically regions indicating the presence of specific animal species or lines delineating ocean boundaries, such as the seafloor or sea surface, in the "echogram" (sonar images formed by echo returns). The interfacing functionalities operate in two directions:
- Annotation to ML: Parsing and organizing manual annotations for preparing training and test datasets for ML developments
- ML to Annotation: Generating annotations from ML predictions that can be used for further downstream processing

At present, functionalities in the Annotation to ML direction have been built for interfacing manual annotations from the Echoview software, which is widely used in the fisheries acoustics community. We plan to add functionalities in the ML to Annotation direction in the near future.

## Functionalities

As of now, Echoregions contains functions to:
- Read, organize, and store Echoview manual annotations
- Read, organize, and store Echoview manual annotations (regions and lines)
- Create masks by combining the manual annotations and xarray water column sonar datasets generated by [Echopype](https://github.com/OSOceanAcoustics/echopype)

We plan to add additional functions to build the ML -> Echoview data products pipeline. This will allow the user to create bottom and region annotations from ML predictions and convert to a format that can be easily visualized and manipulated in Echoview.
Note that in Echoregions, the underlying annotation data is stored as a Pandas dataframe, which allows users to directly leverage the powerful indexing and computing functionalities provided by Pandas.

The underlying annotation data is stored as a Pandas dataframe, which allows users to leverage the powerful indexing and computational tools provided by Pandas.
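
As a brief illustration of the pandas-based storage noted above (again a sketch, not part of this diff): the parsed annotations can be filtered with ordinary pandas operations. The `.data` attribute and the column names below are assumptions based on the echoregions API reference and may differ by version.

```python
# Sketch: the parsed annotations are held in a pandas DataFrame, so
# standard pandas indexing/filtering applies. Attribute and column
# names are assumed; the file path and class label are hypothetical.
import echoregions as er

regions = er.read_evr("transect_regions.evr")      # hypothetical path
df = regions.data                                  # assumed attribute name

hake = df[df["region_class"] == "Hake"]            # hypothetical class label
print(hake[["region_id", "region_class"]].head())
```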
## Contributors

Echoregions development is currently led by Caesar Tuguinay (@ctuguinay), with inputs from Wu-Jung Lee (@leewujung) and Valentina Staneva (@valentina-s). Kavin Nguyen (@ngkavin) contributed significantly to the initial version.

## Acknowledgement

@@ -22,10 +30,6 @@ We thank the NOAA Northwest Fisheries Science Center (NWFSC) Fisheries Engineeri
```
<!-- <img src="docs/source/images/noaa_fisheries_logo.png" alt="NOAA_fisheries_logo" width="200"> -->

## Contributors

Echoregions development is currently led by Caesar Tuguinay (@ctuguinay), with inputs from Wu-Jung Lee (@leewujung) and Valentina Staneva (@valentina-s). Kavin Nguyen (@ngkavin) contributed significantly to the initial version.

## License

Echoregions is licensed under the open source [Apache 2.0 license](https://opensource.org/licenses/Apache-2.0).
