
Commit

update matnwb docs links (#544)
stephprince authored Dec 12, 2024
1 parent ad5d73b commit aca53f4
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions docs/best_practices/time_series.rst
@@ -146,7 +146,7 @@ This is especially important when writing NWBFiles that are intended to be uploaded to the

For more information about how to enable chunking and compression on your data, consult the
:pynwb-docs:`PyNWB tutorial <tutorials/advanced_io/h5dataio.html#chunking>` or the
- `MatNWB instructions <https://neurodatawithoutborders.github.io/matnwb/tutorials/html/dataPipe.html#2>`_.
+ `MatNWB instructions <https://matnwb.readthedocs.io/en/latest/pages/tutorials/dataPipe.html>`_.



@@ -159,7 +159,7 @@ Data writers can optimize the storage of large data arrays for particular uses b
chunk individually. This is especially important when writing NWBFiles that are intended to be uploaded to the
:dandi-archive:`DANDI Archive <>` for storage, sharing, and publication. For more information about how to enable compression on your data, consult the
:pynwb-docs:`PyNWB tutorial <tutorials/advanced_io/h5dataio.html#compression-and-other-i-o-filters>` or the
- `MatNWB instructions <https://neurodatawithoutborders.github.io/matnwb/tutorials/html/dataPipe.html#2>`_
+ `MatNWB instructions <https://matnwb.readthedocs.io/en/latest/pages/tutorials/dataPipe.html>`_

Check functions: :py:meth:`~nwbinspector.checks._nwb_containers.check_large_dataset_compression`,
:py:meth:`~nwbinspector.checks._nwb_containers.check_small_dataset_compression`
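The docs text in this diff explains that chunking stores and compresses each block of a large dataset individually. As a rough illustration of what the linked PyNWB and MatNWB tutorials configure, the sketch below enables chunking and gzip compression at the HDF5 layer directly with h5py (the dataset name, shape, chunk shape, and gzip level are arbitrary example values, not from this commit or the NWB docs):

```python
# Hypothetical sketch: chunked, gzip-compressed HDF5 dataset written
# with h5py, the library underlying PyNWB's HDF5 backend. All names
# and parameters here are illustrative assumptions.
import h5py
import numpy as np

data = np.arange(100_000, dtype=np.float32).reshape(1000, 100)

with h5py.File("example.h5", "w") as f:
    f.create_dataset(
        "data",
        data=data,
        chunks=(100, 100),   # each 100x100 block is stored (and compressed) on its own
        compression="gzip",
        compression_opts=4,  # gzip level 4: a middle ground between speed and size
    )
```

In PyNWB itself, the same options are typically passed by wrapping the array in `H5DataIO` before placing it in a `TimeSeries`, as the linked tutorial shows; in MatNWB, `types.untyped.DataPipe` plays the equivalent role.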
