From aca53f4e38f70b90b64e470ee381bda8ad53e7c3 Mon Sep 17 00:00:00 2001
From: Steph Prince <40640337+stephprince@users.noreply.github.com>
Date: Thu, 12 Dec 2024 09:15:32 -0800
Subject: [PATCH] update matnwb docs links (#544)

---
 docs/best_practices/time_series.rst | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/docs/best_practices/time_series.rst b/docs/best_practices/time_series.rst
index e3fc490d8..f6b3d2ab8 100644
--- a/docs/best_practices/time_series.rst
+++ b/docs/best_practices/time_series.rst
@@ -146,7 +146,7 @@ This is especially important when writing NWBFiles that are intended to be uploa
 
 For more information about how to enable chunking and compression on your data, consult the
 :pynwb-docs:`PyNWB tutorial ` or the
-`MatNWB instructions `_.
+`MatNWB instructions `_.
 
 
 
@@ -159,7 +159,7 @@ Data writers can optimize the storage of large data arrays for particular uses b
 chunk individually. This is especially important when writing NWBFiles that are intended to be uploaded
 to the :dandi-archive:`DANDI Archive <>` for storage, sharing, and publication.
 For more information about how to enable compression on your data, consult the :pynwb-docs:`PyNWB tutorial ` or the
-`MatNWB instructions `_
+`MatNWB instructions `_
 
 Check functions: :py::meth:`~nwbinspector.checks._nwb_containers.check_large_dataset_compression`,
 :py::meth:`~nwbinspector.checks._nwb_containers.check_small_dataset_compression`
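The best-practice text edited above recommends enabling chunking and compression on large data arrays. As a minimal sketch of what that means at the HDF5 storage layer (not part of the patch; in PyNWB you would typically wrap the array in `H5DataIO` before handing it to a `TimeSeries`, and MatNWB has its own `DataPipe` mechanism — the h5py usage below just illustrates the underlying effect, assuming h5py and NumPy are installed):

```python
# Hypothetical example: chunked, gzip-compressed HDF5 dataset, the
# storage-layer result of the chunking/compression options the docs describe.
import os
import tempfile

import h5py
import numpy as np

data = np.zeros((10_000, 8))  # stand-in for a large acquisition array

path = os.path.join(tempfile.mkdtemp(), "example.h5")
with h5py.File(path, "w") as f:
    f.create_dataset(
        "acquisition/data",
        data=data,
        chunks=(1_000, 8),    # each 1000-sample block is stored/read as a unit
        compression="gzip",
        compression_opts=4,   # moderate gzip level; higher = smaller but slower
    )
```

Chunked, compressed storage lets readers (and the DANDI Archive's streaming access) fetch only the blocks they need instead of the full array, which is why the checks named in the second hunk flag large uncompressed datasets.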