
Create workflow example incorporating DeepInterpolation #163

Open · vijayiyer05 opened this issue Apr 20, 2023 · 4 comments

vijayiyer05 (Collaborator) commented Apr 20, 2023:

A working live script example including the following components:

  • data access optimized for the DANDIHub compute environment (i.e., use a DANDIset via the pre-mounted S3 bucket; see the sketch after this list)
  • use of an additional MATLAB community toolbox (e.g., EXTRACT) with a scientifically meaningful use case
  • scientific & technical narrative (at a level intermediate between the 'quickstart' and 'demo' examples)

Once completed, this example should be included in the newly launched DANDIset live-script-examples library:

  • Submitted for inclusion in the DANDIset example live scripts library
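
For the data-access component, here is a minimal sketch of the intended pattern. The blob key and NWB dataset location are hypothetical placeholders; NWB files are HDF5, and recent MATLAB releases can read HDF5 over s3:// URLs directly (on DANDIHub, the same bucket is also pre-mounted as a local path):

```matlab
% Sketch only: the blob key and dataset path below are hypothetical
% placeholders; substitute those of an actual DANDIset asset.
nwbFile = "s3://dandiarchive/blobs/<asset-blob-key>";  % or the DANDIHub mount path

% Inspect the file layout to locate the imaging data (NWB files are HDF5)
info = h5info(nwbFile);
disp({info.Groups.Name});

% Partial read: only the first 100 frames, to keep data transfer small
dataPath = "/acquisition/TwoPhotonSeries/data";  % hypothetical NWB location
frames = h5read(nwbFile, dataPath, [1 1 1], [512 512 100]);
```
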
vijayiyer05 (Collaborator, Author) commented:

One approach to selecting a DANDIset & scientific workflow might be to reproduce some components of a published analysis of the Allen Brain Observatory datasets. This would fit the growing Open Science trend of data rehacking (re-analysis of published datasets).

vijayiyer05 (Collaborator, Author) commented:

@ehennestad Capturing here our discussion in Boston: DeepInterpolation likely makes sense as the MCT (MATLAB community toolbox). Since it has model file dependencies, this BOT example would best live in the DeepInterpolation repo (i.e., be submitted as a PR there), as some pretrained models are included there directly.

ehennestad (Collaborator) commented:

The DeepInterpolation-MATLAB examples work really well for two-photon calcium imaging data. The Visual Coding dataset of the Allen Brain Observatory contains raw two-photon data, which is accessible through the S3 protocol.

v0.9.3 of the Brain Observatory Toolbox made files from the S3 bucket available, and preliminary testing shows that it is possible to partially read imaging data from the two-photon video file.

A proposed workflow example would download a subset of the two-photon data (as these files are very large) and run DeepInterpolation on it, possibly using code from NANSEN:

(https://github.com/VervaekeLab/NANSEN/blob/364ab81cda8dab5a2c0600ae98c6a1c841b270e0/code/datatypes/%2Bnansen/%2Bstack/%2Bprocessor/Denoiser.m#L1)
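
As a rough sketch of that proposal (not the final example): assuming a hypothetical model file name, S3 path, dataset path, and a 30-frames-per-side model window, and omitting the intensity normalization a pretrained model would expect, the inference step could look like:

```matlab
% Sketch only: model file, S3 path, dataset path, and window size are
% hypothetical placeholders; normalization per the model's training
% statistics is omitted here.
net = importKerasNetwork("pretrained_unet_model.h5");  % pretrained model from the DeepInterpolation repo

rawFile = "s3://allen-brain-observatory/visual-coding-2p/ophys_movies/ophys_experiment_<id>.h5";
movie = single(h5read(rawFile, "/data", [1 1 1], [512 512 160]));  % subset: 160 frames

pre = 30; post = 30;  % flanking frames per side (model-dependent)
nFrames = size(movie, 3);
denoised = movie;     % frames near the edges are left as-is

for t = pre+1 : nFrames-post
    % DeepInterpolation predicts the central frame from its neighbors,
    % with the central frame itself omitted from the input stack
    window = cat(3, movie(:, :, t-pre : t-1), movie(:, :, t+1 : t+post));
    denoised(:, :, t) = predict(net, window);
end
```

The linked NANSEN Denoiser appears to wrap a similar loop as a chunked stack processor, which would handle the pre/post-processing and large-file chunking.
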

vijayiyer05 (Collaborator, Author) commented:

Good progress determining that the Visual Coding dataset is a good source of raw data. The BOT supports the Visual Coding dataset very well by now 😄

Based on this, let's drop DANDIHub from this issue & focus on the integration with an additional MCT, namely DeepInterpolation. It sounds like good value for DeepInterpolation (source of data) and the BOT (analysis use case) alike.

vijayiyer05 changed the title from "Create workflow example incorporating DANDIHub & another MATLAB community toolbox" to "Create workflow example incorporating DeepInterpolation" on Dec 7, 2023