# Cannot upload large files when using chunking #2295
Are you uploading the files via web or the NC client? I just finished uploading a 38 GB file and had zero issues. Over the last two years I've uploaded terabytes of data via the NC app and had only one issue, and that was a datacenter outage that caused some weird database issues; after fixing it I still have no problems with large files. The largest I've tried is approximately a 500 GB file and it went okay.
Unfortunately it happens with both the web and the NC client. The NC client seems to be a tad faster than the web but still slow. It does work if I turn off chunking (i.e. set max_chunk_size to 0), but then the upload speed is next to nothing for a 1.5 GB file. If I set it to the default 10 MB or even 20 MB, the upload speed seems to pick up, but the upload fails.
Using the web for large (>5 GB) files is always a pain due to PHP limits. The NC desktop client should not have any issues with chunked upload. Can you try reverting the change to max_chunk_size and try with the desktop app?
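For reference, reverting could look something like the sketch below via occ; the container name `nextcloud` is an assumption, and 10485760 bytes is the 10 MiB default:

```sh
# Restore the default 10 MiB chunk size (container name "nextcloud" is an assumption)
docker exec -u www-data nextcloud php occ config:app:set files max_chunk_size --value 10485760

# Or remove the override entirely so the built-in default applies
docker exec -u www-data nextcloud php occ config:app:delete files max_chunk_size
```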
Just for my peace of mind, this is also how I attach the NFS share:

```
192.168.1.139:/mnt/pool01/docker /var/lib/docker/volumes nfs x-systemd.automount,rw,soft,sync 0 0
```

Basically I mount the entire Docker volumes directory to the NFS share, meaning all volumes are actually created on the share, not on the machine where I run Docker (and Nextcloud). I did two tests with the NC client:
I am now retrying with the 20 MB chunking from the desktop, and then I will also set it to the default 10 MB chunking and try again.
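Side note on the NFS mount above: a couple of standard commands to verify what options are actually in effect (a generic sketch, nothing here is specific to this setup):

```sh
# Confirm what is actually mounted at the Docker volumes path
findmnt /var/lib/docker/volumes

# Show the effective NFS mount options (rsize, wsize, sync, etc.)
nfsstat -m
```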
What is `/mnt/ncdata`?
That is a really good question and I am not sure. I've looked both in the container, on the machine running Docker, and on the machine that hosts the volumes, and it doesn't exist? I can't find it anywhere. Also, it's not something I'd have set up, since it doesn't follow my format for naming volumes or anything else. https://help.nextcloud.com/t/what-is-mnt-for/189789
Did you previously use the All-In-One image? I believe it's the default data directory there. Is it referenced under your config.php?
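Something along these lines should show whether the path is referenced in the config (the container name `nextcloud` is an assumption; the exact command originally suggested may have differed):

```sh
# Search the Nextcloud config for any reference to the mystery path
docker exec nextcloud grep -r 'ncdata' /var/www/html/config/
```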
I did indeed try an All-In-One image months ago; however, that did not work for me, so I removed the container, image, and volumes related to it. Running the command you provided returns no data, even if I drop the grep for ncdata. I also cannot seem to find any symbolic links to it.
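For completeness, one generic way to hunt for symlinks pointing at that path (a sketch; adjust the search root as needed, and expect it to be slow on large trees):

```sh
# Find any symlink on the host whose target mentions "ncdata";
# permission errors are silenced so the output stays readable
sudo find / -type l -lname '*ncdata*' 2>/dev/null
```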
The way this image works, and based on your Compose, your data directory should be `/var/www/html/data`.
Do you have any data in your user's home directory within the container, e.g. under `/var/www/html/data/admin/files`?
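A quick way to check, assuming the container is named `nextcloud` and the user is `admin` (both assumptions):

```sh
# List the admin user's files inside the container's default data directory
docker exec nextcloud ls -la /var/www/html/data/admin/files
```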
Hi! I've recently set up Nextcloud using Docker. The OS that Docker runs on is Ubuntu 22.04.4 LTS; I provided the Docker Compose file below. My Docker volume directory is mounted on a separate NFS share, therefore all volumes live on the NFS share.
The problem I run into is that when I try to upload larger files, the upload fails with the following error:

```
[PHP] Error: rmdir(/mnt/ncdata/admin/uploads/web-file-upload-15252b578fdf95d6740fb72e4e6c488a-1687753536511): Directory not empty at /var/www/html/lib/private/Files/Storage/Local.php#147
```
I managed to find a link to a similar problem; however, there it is specified that the error occurs in the Alpine version, whereas I am using the nextcloud:apache tag. I was able to work around it by turning off chunking (setting max_chunk_size to 0). This seems to allow larger files now; however, it has slowed down upload time considerably. Is there any known fix or workaround, or another image that I could use to get around this problem?
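For anyone following along, disabling chunking via occ would look something like this (the container name `nextcloud` is an assumption):

```sh
# Disable chunked uploads entirely by setting the max chunk size to 0
docker exec -u www-data nextcloud php occ config:app:set files max_chunk_size --value 0
```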