Question: Best Practice for Writing Large Files / Progress Indicator #41

Closed · AdlerJS opened this issue Apr 1, 2022 · 2 comments


AdlerJS commented Apr 1, 2022

@diachedelic: First, I just want to say thanks for the plugin. It's been a core feature of the app we are building. I do have a question for you, and I'm leaving it here in the hope that others may want the same information. In our app we are trying to support large files of around 150 MB and write them to the device. We pull the file stream from our server and then grab the response data via response.blob(). When we go to write the file to the device, it sometimes succeeds and other times fails. I also sometimes see the error mentioned in #9, but I'm not sure that is related. My question is: is there a recommended way to get more consistent writes to the device when files exceed 100 MB in size?

Also, is there a way to get the progress of a file write, so that for large files we could display and track progress for users?

I'll provide a code sample below:

// Assumptions: validate comes from the uuid package, and downloadFileContents
// is an app-specific helper (illustrative import path) that resolves the
// file's download URL.
import axios from 'axios';
import { validate } from 'uuid';
import { Capacitor } from '@capacitor/core';
import { Directory } from '@capacitor/filesystem';
import { Storage } from '@capacitor/storage';
import write_blob from 'capacitor-blob-writer';
import { downloadFileContents } from '../api/files';

const writeFileToDevice = async (file, queryClient, callback) => {
  const isNative = Capacitor.isNativePlatform();
  const isOptimisticFile = validate(file.file_id);
  const fileIsAvailable = file?.status?.description === 'Available';

  // Optimistic file writes are handled in useCreateVaultFiles when the mutation fires.
  if (isOptimisticFile || !isNative || !fileIsAvailable) return;

  const { value } = await Storage.get({ key: file.storage_key });
  const devicePath = `${file.storage_key}-${file.name}`;

  // Already written to the device on a previous run.
  if (value) {
    return { storageKey: file.storage_key, filePath: value };
  }

  // Fetch the blob with a bare axios instance so no interceptors run.
  const uninterceptedAxiosInstance = axios.create();
  const fileContent = await downloadFileContents({
    fileId: file.file_id,
    ...(file.workspace_id ? { workspaceId: file.workspace_id } : {}),
    queryClient,
  });

  const response = await uninterceptedAxiosInstance(fileContent, {
    method: 'GET',
    responseType: 'blob',
  });

  // write_blob streams the blob to disk and resolves with the written path.
  const filePath = await write_blob({
    path: devicePath,
    blob: response.data,
    directory: Directory.Data,
    recursive: true,
  });

  await Storage.set({
    key: file.storage_key,
    value: devicePath,
  });

  if (callback) {
    callback();
  }

  return { storageKey: file.storage_key, filePath };
};

export default writeFileToDevice;
AdlerJS changed the title from "Question: Best Practice for Writing Large Files" to "Question: Best Practice for Writing Large Files / Progress Indicator" on Apr 1, 2022
diachedelic (Owner) commented

The write_blob function uses fetch to transmit the blob to a local HTTP server, which writes the request body to disk. As of Chrome 95, fetch can take a stream as the body parameter, finally making it possible to monitor the progress of an HTTP request. So yes, it is theoretically possible to have a progress indicator in some browsers.
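
For illustration, here is a minimal sketch of that technique, assuming a WebView with streaming request-body support; the url and onProgress parameters are placeholders, not part of this plugin's API:

// Hypothetical sketch: count bytes as the blob streams out through fetch.
// Shipped Chrome requires the `duplex: 'half'` option for streamed bodies.
async function uploadWithProgress(blob, url, onProgress) {
  let sent = 0;
  const counter = new TransformStream({
    transform(chunk, controller) {
      sent += chunk.byteLength;
      onProgress(sent / blob.size); // fraction of the blob transmitted so far
      controller.enqueue(chunk);
    },
  });
  await fetch(url, {
    method: 'PUT',
    body: blob.stream().pipeThrough(counter),
    duplex: 'half',
  });
}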

Regarding stability, it would probably be possible to modify write_blob to accept a stream instead of a Blob, which could help if it is a memory issue. But I am doubtful, because I think browsers automatically purge Blobs to disk once they grow large enough to exhaust available memory. Another strategy could be to write the file in, say, 50 MB chunks; see #22 and the sketch below.
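
To make the chunked idea concrete, here is a rough sketch. Blob.prototype.slice is cheap and does not read the underlying data, so each chunk only touches memory as it is written. Note that append_blob is hypothetical; write_blob has no append mode today, which is what #22 discusses.

const CHUNK_SIZE = 50 * 1024 * 1024; // 50 MB

// Hypothetical: append_blob does not exist in capacitor-blob-writer; it
// stands in for whatever chunked write API might come out of #22.
async function writeInChunks(blob, path) {
  for (let start = 0; start < blob.size; start += CHUNK_SIZE) {
    const chunk = blob.slice(start, start + CHUNK_SIZE);
    await append_blob({ path, blob: chunk, directory: Directory.Data });
  }
}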

If you can find out why it sometimes fails, that would be very helpful. Does it fail on Android, iOS, or both?


AdlerJS commented Apr 8, 2022

@diachedelic - Sorry for the late reply. After spending more time researching the plugin, I found it's actually very well optimized and wasn't the cause of our issues. Our issues were caused by inconsistent networks while actually getting the blob from the server. Once the file was downloaded to the client, blob-writer was very fast and consistent at writing files of around 100 MB to the device. Closing this issue based on these findings.
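
For anyone who hits the same thing, a rough sketch of one way to harden the download step against flaky networks; the retry count and delays are purely illustrative:

// Illustrative only: retry the blob download with exponential backoff.
async function downloadBlobWithRetry(url, retries = 3) {
  for (let attempt = 0; ; attempt += 1) {
    try {
      const response = await axios.get(url, { responseType: 'blob' });
      return response.data;
    } catch (e) {
      if (attempt >= retries) throw e;
      // Back off 1s, 2s, 4s, ... between attempts.
      await new Promise((r) => setTimeout(r, 1000 * 2 ** attempt));
    }
  }
}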

AdlerJS closed this as completed on Apr 8, 2022