client.createWriteStream results in zero-length file #181
Streams are tricky. Are you waiting for it to finish? You could use something like `end-of-stream`. Ideally you'd want to log out all stream errors to see if it's failing - if you could do that it might show you why no data is being written.
Thanks for the reply. The problem is that I don't get any errors, and the stream seems to end properly.
Maybe I'm just not using it right. Could you provide a working example of how one should use client.createWriteStream with Nextcloud?
@artturimatias Ok, so I tried with an example and the stream writes, but I don't get the end event at the correct moment. Here's the example I'm using:

```js
const fs = require("fs");
const { createClient } = require("webdav");
const endOfStream = require("end-of-stream");

const client = createClient("https://somewebdav.com", {
    username: "user",
    password: "pass"
});

const readStream = fs.createReadStream("./webdav.jpg");
const writeStream = client.createWriteStream("/webdav.jpg");

readStream.pipe(writeStream);

endOfStream(writeStream, err => {
    if (err) {
        console.error(err);
        return;
    }
    console.log("Done!");
});
```

I couldn't run a demo account at nextcloud.com as their service wasn't working. It seems the event fires when the write stream finishes while the request is still running. I need to think of a way to fix that, as it's most likely been around as an issue for some time. The problem is that the stream drains and emits end while the request is still ongoing; ideally the stream would only fire the end event once the request has finished.
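For illustration, a minimal sketch of that idea, assuming an axios-style PUT; the function and event names here are hypothetical, not the library's actual internals:

```js
const { PassThrough } = require("stream");
const axios = require("axios");

// Only signal completion once the PUT response arrives,
// not when the local stream drains.
function createRequestBoundWriteStream(url, auth) {
    const body = new PassThrough();
    axios
        .put(url, body, { auth, maxBodyLength: Infinity })
        .then(() => body.emit("upload-finished")) // hypothetical event name
        .catch(err => body.emit("error", err));
    return body;
}
```

Callers would then wait for "upload-finished" rather than the stream's own "finish" event.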
Thanks for the example. I tested with your code and I still got a zero-length file.
Yes, in your example the Content-Type is wrong. Webdav-client doesn't set the content-type itself, so it must be getting set in some default manner. You could modify this area to add the header.
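For example, assuming createWriteStream forwards custom headers (as the Content-Length workaround later in this thread relies on), something like:

```js
// "image/jpeg" is an assumption for the .jpg example file used above.
const writeStream = client.createWriteStream("/webdav.jpg", {
    headers: { "Content-Type": "image/jpeg" }
});
```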
Ok, there must be something odd in our Nextcloud installation. I tried with try.nextcloud.com and the upload worked (headers did not matter; I tried with modified headers and default ones). Sorry that I didn't test earlier with another instance. I need to check out how our installation differs from a stock installation. Thanks for the help so far. Here is the full code that I used for testing:
@perry-mitchell I can confirm that. Investigating the zero-length issue further.
I just had confirmation that our Nextcloud has a load balancer in front of it, which probably caused the stream breakage in my case.
We had another issue - #147 - which might have played some part in this one. I know it was mentioned earlier, but it might be worth trying again now that it's fixed, as circumstances might have changed. The fix was released in 3.1.0.
Tested with 3.2.0 but no change; it still results in zero-length files in our Nextcloud. But like I said, the problem might be the load balancer in our case.
@artturimatias Please let me know if you discover otherwise. It's hard to get streaming working perfectly.
I've noticed the createWriteStream function only works for small files but completely fails for large files (say over 10 MB). I've tested it on my local WebDAV server, so we can rule out the possibility of the problem being caused by a load balancer. I suspect the problem might be in the Axios call inside the fetch function. It seems Axios has a problem when PUT/POSTing a stream to a remote server.
@Daniel-Pek I have tested with a 1 KB text file and a ~100 KB jpeg image; both result in a 0 KB file on the WebDAV server. The only WebDAV server that I'm working with is the bigcommerce.com account WebDAV storage, which might be behind a load balancer, so I won't rule out the load balancer causing the problem. Not sure how I can debug this.
@perry-mitchell Let me know if/how I can further debug the problem to give you additional information. This issue is a blocker for us.
Hi guys, I can reproduce the issue with a local setup of Nextcloud too. No idea what happens at this point.

Edit: however, I can repro with:

```js
addFileFromReadable = async (readable, fileDefinition) => {
    const webDavClient = await this.getClient();
    const filePath = getFilePath(fileDefinition);
    const writeStream = webDavClient.createWriteStream(filePath);
    // Note: pipe() does not return a promise, so this await resolves
    // immediately, before the upload has actually completed.
    await readable.pipe(writeStream);
    return fileDefinition;
};
```

and everything is fine if I do a buffered upload instead. As pointed out by other users, the palliative is to transform the readable into a buffer and then use `putFileContents`.

Edit: btw, big thank you for this lib. I've just finished replacing Meteor Files by Nextcloud using your package; except for this small issue everything worked very smoothly.
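A minimal sketch of that buffer-based workaround; the function name, URL, and credentials are placeholders:

```js
const fs = require("fs");
const { createClient } = require("webdav");

// Buffer the whole file in memory, then upload it in a single call instead of streaming.
async function uploadViaBuffer(localPath, remotePath) {
    const client = createClient("https://somewebdav.com", {
        username: "user",
        password: "pass"
    });
    const buffer = fs.readFileSync(localPath);
    await client.putFileContents(remotePath, buffer);
}
```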
Hey there, I also stumbled over this issue. After googling around a bit, I found people having issues with a missing `Content-Length` header [1]. I'm running the request against an up-to-date Nextcloud installation which also runs behind an nginx load balancer that I cannot control (it's hosted at uberspace.de). I don't know if that has something to do with it. Best,

[1] i.e. axios/axios#961 (comment)
It's good to note that you can, at least as of v4.0.0-r01, upload a file stream using `putFileContents`.
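Something like the following sketch, assuming `client` is a v4 WebDAVClient and that putFileContents accepts a readable stream:

```js
const fs = require("fs");

async function uploadAsStream(client) {
    // putFileContents taking a stream body (assumption per the comment above)
    await client.putFileContents("/webdav.jpg", fs.createReadStream("./webdav.jpg"));
}
```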
Unfortunately I have the same problem. We use webdav in a backup plugin for a smart home project. Currently I solve it via `fs.readFileSync`. Is there a solution to the problem in the meantime? Here is a link to the project:
Hey guys, I actually have a workaround for this issue that you might want to give a go.
This did work for me even for a very large file.
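For context, a sketch of what such a workaround could look like, assuming it means PUTting the stream straight to the file's URL; `uploadDirect` is an illustrative name, and getFileUploadLink is the library call that returns a URL with the credentials embedded:

```js
const fs = require("fs");
const https = require("https");

function uploadDirect(client, localPath, remotePath, done) {
    // Step 1: get a URL for the remote file (includes basic-auth credentials).
    const url = client.getFileUploadLink(remotePath);
    const { size } = fs.statSync(localPath);
    // Step 2: PUT the local file to that URL with a plain Node request.
    const req = https.request(url, {
        method: "PUT",
        headers: { "Content-Length": size }
    }, res => {
        res.resume(); // drain the response body
        const ok = res.statusCode >= 200 && res.statusCode < 300;
        done(ok ? null : new Error(`HTTP ${res.statusCode}`));
    });
    req.on("error", done);
    fs.createReadStream(localPath).pipe(req);
}
```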
Thanks for your tip. Do you have an example of this?
@simatec
Unfortunately, an upload is not possible without fs.readFileSync. |
Hi @Daniel-Pek, what do you mean specifically by "Use the Node JS stream to pipe the files to that location by using the URL from the step 1"? I don't get why it would work better than the code that already exists there: https://github.com/perry-mitchell/webdav-client/blob/master/source/operations/createStream.ts#L23
Some thoughts on 4.0.0, testing against Nextcloud: uploading files > 2 GB seems to be impossible at the moment.
My use case:
I will sporadically continue my debugging and update this post if I have new information.
That seems to work fine for now:

```js
const fs = require("fs");
const endOfStream = require("end-of-stream");

// `client` is a WebDAVClient created elsewhere via createClient(...).
function writeStream(localFilePath, remoteFilePath) {
    fs.stat(localFilePath, (err, stats) => {
        if (err) {
            console.error(err);
            return;
        }
        const readStream = fs.createReadStream(localFilePath);
        // Supplying Content-Length explicitly is what makes the upload work here.
        const writeStream = client.createWriteStream(remoteFilePath, {
            headers: { "Content-Length": stats.size },
        });
        readStream.pipe(writeStream);
        endOfStream(writeStream, (err) => {
            if (err) {
                console.error(err);
                return;
            }
            console.log("Done!");
        });
    });
}
```

(Why doesn't it format correctly?) - Fixed 🙂 Thanks @bennigraf !
I'll try to set aside some more testing time regarding uploading large 2 GB+ files.
webdav: 4.10.0, WebDAV server: Apache 2.4.39, over HTTPS.

Initially I also ran into the problem with createWriteStream(...) and zero-size files. But later I realized that while a streamed file upload was in progress, I was creating another WebDAVClient at the same time, and that somehow garbled the transfer of the previous one. So, instead of creating a new client for each operation within the same folder, I cached the first one and reused it, destroying it afterwards. It works fine and I do not modify headers. I found this when I turned on forensic logging on the httpd side and saw two PUT requests for the same file, one of them with 0 content length. I cannot say how they intertwined, but caching solved the problem. Also, if you are doing
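A sketch of that caching approach: memoize a single client instead of constructing a new WebDAVClient per operation (URL and credentials are placeholders):

```js
const { createClient } = require("webdav");

let cachedClient = null;

// Reuse one WebDAVClient for all operations instead of one per upload.
function getClient() {
    if (!cachedClient) {
        cachedClient = createClient("https://somewebdav.com", {
            username: "user",
            password: "pass"
        });
    }
    return cachedClient;
}
```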
I'm writing a Nextcloud client application with Node.js and webdav-client.
I have no problems reading contents from the Nextcloud server, but uploading files with client.createWriteStream creates zero-length files.
This is how I'm using client.createWriteStream:
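In essence (simplified sketch; the server URL, credentials, and paths are placeholders):

```js
const fs = require("fs");
const { createClient } = require("webdav");

const client = createClient("https://nextcloud.example.com/remote.php/dav/files/user", {
    username: "user",
    password: "pass"
});

// Pipe a local file into the remote write stream.
fs.createReadStream("./webdav.jpg").pipe(client.createWriteStream("/webdav.jpg"));
```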
I can upload files with curl or cadaver so this is not a permission problem:
Am I doing something wrong here? Maybe this is related to #173?
version: 2.10.0