How can we optimize performance when we upload large files to an object storage service such as S3? Before we answer this question, let's look at why this process needs optimizing. Some files can be several GB or larger. It is possible to upload such a large object directly, but it could take a long time, and if the network connection fails in the middle of the upload, we have to start over. A better solution is to slice a large object into smaller parts and upload them independently. After all the parts are uploaded, the object store re-assembles the object from the parts. This process is called multipart upload.
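To make this concrete, here is a minimal sketch of the three-step multipart flow (initiate, upload parts, complete) using boto3. The bucket name, object key, file path, and part size below are placeholder assumptions, and a production version would add retries and checksum validation.

```python
# Minimal multipart upload sketch with boto3.
# Assumes AWS credentials are already configured; names are placeholders.
import boto3

s3 = boto3.client("s3")
bucket, key, path = "my-bucket", "backups/big.bin", "big.bin"  # placeholders
part_size = 100 * 1024 * 1024  # 100 MB; S3 requires >= 5 MB per part (except the last)

# 1. Initiate the multipart upload and get an upload ID.
upload = s3.create_multipart_upload(Bucket=bucket, Key=key)
upload_id = upload["UploadId"]

parts = []
try:
    with open(path, "rb") as f:
        for part_number in range(1, 10001):  # S3 allows at most 10,000 parts
            data = f.read(part_size)
            if not data:
                break
            # 2. Upload each part independently; a failed part can be retried
            #    on its own without restarting the whole transfer.
            resp = s3.upload_part(
                Bucket=bucket, Key=key, UploadId=upload_id,
                PartNumber=part_number, Body=data,
            )
            parts.append({"PartNumber": part_number, "ETag": resp["ETag"]})
    # 3. Ask S3 to re-assemble the object from the uploaded parts.
    s3.complete_multipart_upload(
        Bucket=bucket, Key=key, UploadId=upload_id,
        MultipartUpload={"Parts": parts},
    )
except Exception:
    # Abort so the partially uploaded parts don't keep accruing storage costs.
    s3.abort_multipart_upload(Bucket=bucket, Key=key, UploadId=upload_id)
    raise
```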
How to upload a large file to S3?

For really large files, the S3 multipart upload limits (such as the 10,000-part maximum) can be a problem. You can still upload a 5 TB file, but it will be slow because the parts have to be big: 5 TB spread across 10,000 parts works out to an average part size of roughly 500 MB.

Any suggestions other than using dedicated products like Aspera?

You may want to refer to S3 Transfer Acceleration for faster uploads.
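One way to claw back speed is to combine Transfer Acceleration with parallel part uploads. Below is a sketch using boto3's high-level transfer API; it assumes acceleration has already been enabled on the bucket (a one-time bucket setting), and the bucket and file names are placeholders.

```python
# Sketch: parallel multipart upload over an S3 Transfer Acceleration endpoint.
# Assumes acceleration is already enabled on the bucket; names are placeholders.
import boto3
from botocore.config import Config
from boto3.s3.transfer import TransferConfig

# Route requests through the accelerate endpoint (edge-network ingestion).
s3 = boto3.client("s3", config=Config(s3={"use_accelerate_endpoint": True}))

# With at most 10,000 parts, a 5 TB object needs ~500 MB parts, so the
# remaining throughput has to come from concurrency, not smaller parts.
transfer_config = TransferConfig(
    multipart_threshold=64 * 1024 * 1024,   # switch to multipart above 64 MB
    multipart_chunksize=512 * 1024 * 1024,  # 512 MB parts
    max_concurrency=16,                     # upload 16 parts in parallel
)

# upload_file handles part splitting, retries, and completion internally.
s3.upload_file("big.bin", "my-bucket", "backups/big.bin", Config=transfer_config)
```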