Best practices for your largest file transfers
Factors that affect large file transfers
Large file transfers aren’t just about how much data is in the package. Factors that affect package transfers include:
- Individual file sizes: MB, GB, or TB.
- Number of files: A single file or thousands of files.
- Folder structure: Simple or complex.
- Directory contents: The impact of empty directories.
- Destination and source limits: Daily ingest limits, maximum file size restrictions, storage capacity, and more.
Let’s go through the list and ensure you’re set up for success.
Individual file sizes
You can upload individual files that are up to 15 TB in size. Chances are, the platform won’t be the limiting factor for that kind of file transfer, but it’s best to plan ahead by following the recommendations below.
Recommendations:
- Use the Desktop App or install the Agent to handle very large file transfers.
- Reach out to our Support team about handling large multi-terabyte files. Solutions such as Express can enhance performance.
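If you want to spot oversized files before a transfer starts, a quick scan of the package folder helps. Here’s a minimal sketch in Python (the folder path and the 1 TiB alert threshold are placeholders to adapt to your own workflow, not platform values):

```python
import os

# Hypothetical pre-flight check: flag any file at or above a size threshold
# before starting a transfer. The 1 TiB threshold is an assumption; adjust
# it to whatever warrants extra planning (or a chat with Support) for you.
THRESHOLD_BYTES = 1 * 1024**4  # 1 TiB

def find_large_files(package_root, threshold=THRESHOLD_BYTES):
    """Yield (path, size_in_bytes) for files at or above the threshold."""
    for root, _dirs, files in os.walk(package_root):
        for name in files:
            path = os.path.join(root, name)
            try:
                size = os.path.getsize(path)
            except OSError:
                continue  # skip unreadable files (broken symlinks, permissions)
            if size >= threshold:
                yield path, size

if __name__ == "__main__":
    for path, size in find_large_files("./my-package"):  # placeholder path
        print(f"{size / 1024**4:.2f} TiB  {path}")
```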
Number of files in a package
If you’re sending or receiving more than 100,000 files in a package, we recommend that you reach out to our Support team to avoid errors or disruptions during the transfer. The platform has checkpoints when a package exceeds 100,000 or 250,000 files, depending on the workflow (basic, or more complex workflows that use Portals, integrations, or automations).
Message: File count exceeds {number}. Please contact Support.
Recommendations:
- Use the Desktop App or install the Agent to handle transfers with a large number of files.
- Reach out to our Support team for solutions that best suit your workflow, including if you’re a candidate for Express.
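To know in advance whether a package will hit the 100,000-file checkpoint, count the files in the source folder before you send. A minimal sketch (the folder path is a placeholder; the checkpoint value mirrors the limit described above):

```python
import os

# Checkpoint described above: packages with more than 100,000 files may
# require Support involvement, depending on the workflow.
FILE_COUNT_CHECKPOINT = 100_000

def count_files(package_root):
    """Count regular files beneath package_root."""
    return sum(len(files) for _root, _dirs, files in os.walk(package_root))

if __name__ == "__main__":
    count = count_files("./my-package")  # placeholder path
    print(f"{count} files in package")
    if count > FILE_COUNT_CHECKPOINT:
        print("File count exceeds the checkpoint; contact Support before sending.")
```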
Folder structure
Your original folder structure is maintained, but complex or deep folder structures can cause issues.
Recommendations:
- Reduce the depth of the structure and avoid long names whenever possible. This helps you avoid issues with destination systems, which can have limits on file name or file path length (see the sketch below).
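The most common failure points are Windows’ default 260-character limit on full paths and the 255-byte limit on a single file or folder name that most filesystems enforce, so it’s worth auditing a package before sending it. A minimal sketch (the limits are typical defaults, not platform rules, and the folder path is a placeholder):

```python
import os

# Common destination limits (typical defaults, not platform rules):
# - Windows caps full paths at 260 characters unless long paths are enabled.
# - Most filesystems cap a single file or folder name at 255 bytes.
MAX_PATH_CHARS = 260
MAX_NAME_BYTES = 255

def audit_paths(package_root):
    """Report the deepest nesting and longest path, and flag long names."""
    longest = ""
    max_depth = 0
    for root, dirs, files in os.walk(package_root):
        for name in dirs + files:
            path = os.path.join(root, name)
            # Depth relative to the package root (an item in the root is 1).
            depth = os.path.relpath(path, package_root).count(os.sep) + 1
            max_depth = max(max_depth, depth)
            if len(path) > len(longest):
                longest = path
            if len(name.encode("utf-8")) > MAX_NAME_BYTES:
                print(f"Name longer than {MAX_NAME_BYTES} bytes: {path}")
    print(f"Deepest nesting: {max_depth} levels")
    print(f"Longest path: {len(longest)} characters ({longest})")
    if len(longest) > MAX_PATH_CHARS:
        print("Warning: longest path exceeds the default Windows limit.")

if __name__ == "__main__":
    audit_paths("./my-package")  # placeholder path
```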
Directory contents
As with folder structures, your original directory contents are preserved, but if a package contains a large number of empty directories (more than 5000), an error message displays.
Message: Packages cannot contain more than 5000 empty directories. Zip the package or specific directories and try again.
Recommendations:
- Zip the package or specific directories within the package before uploading.
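You can count empty directories ahead of time, and zip the package if you’re over the limit, with a few lines of Python. A minimal sketch (the folder path is a placeholder; the 5000 limit mirrors the error message above):

```python
import os
import shutil

# Limit from the error message above: packages cannot contain more than
# 5000 empty directories.
EMPTY_DIR_LIMIT = 5000

def count_empty_dirs(package_root):
    """Count directories that contain no files and no subdirectories."""
    return sum(
        1 for _root, dirs, files in os.walk(package_root)
        if not dirs and not files
    )

if __name__ == "__main__":
    empties = count_empty_dirs("./my-package")  # placeholder path
    print(f"{empties} empty directories")
    if empties > EMPTY_DIR_LIMIT:
        # Zip the whole package so it uploads as a single file.
        archive = shutil.make_archive("my-package", "zip", "./my-package")
        print(f"Created {archive}; upload the archive instead.")
```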
Destination and source limits
Whenever you send a large file transfer (or any transfer), it's best to know the limitations or requirements that apply to your source or destination. For example:
- Google Drive accounts have a daily ingest limit (750 GB unless otherwise specified). Individual files can be up to 5 TB in size, but rate limits on transfers into Google Drive impact delivery speeds.
- On-prem storage has a wide range of capacities, requirements, and limitations.
- Amazon S3 offers scalable capacity, but individual objects are limited to 5 TB and multipart uploads to 10,000 parts, so the chunk (part) size needs to be set high enough that a multi-terabyte file doesn’t exceed the part limit (see the sketch below).
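As a worked example of the part-size constraint: with a 10,000-part ceiling, a 4 TB file needs parts of roughly 400 MB or more. Here’s a minimal sketch using boto3’s TransferConfig (the file path, bucket, key, and concurrency value are placeholders to adapt, not required settings):

```python
import math
import os

import boto3
from boto3.s3.transfer import TransferConfig

# S3 multipart uploads allow at most 10,000 parts, and parts (except the
# last) must be at least 5 MiB, so for very large files the part size has
# to grow until size / part_size fits within 10,000 parts.
MAX_PARTS = 10_000
MIN_PART_SIZE = 5 * 1024**2  # 5 MiB

def part_size_for(file_size):
    """Smallest part size (rounded up to a whole MiB) that fits in 10,000 parts."""
    needed = math.ceil(file_size / MAX_PARTS)
    rounded = math.ceil(needed / 1024**2) * 1024**2
    return max(rounded, MIN_PART_SIZE)

filename = "./my-package/huge-file.mov"  # placeholder path
chunk = part_size_for(os.path.getsize(filename))

config = TransferConfig(
    multipart_threshold=chunk,
    multipart_chunksize=chunk,
    max_concurrency=8,  # assumption: tune for your bandwidth and CPU
)

# "my-bucket" and the key are placeholders for your own destination.
boto3.client("s3").upload_file(filename, "my-bucket", "huge-file.mov", Config=config)
```

The Desktop App and the Agent apply this kind of adjustment for you through custom or dynamic chunk sizing, as noted in the recommendations below.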
Other factors, such as network bandwidth, disk speed, and software, also need to be considered. Limits, requirements, and other factors apply to both the source and destination storage.
Recommendations:
- Review the documentation for your storage provider so you understand limits and factors that affect performance.
- Use the Desktop App or the Agent to get the best performance and to take advantage of custom or dynamic chunk size adjustment.
- If you’re sending to on-prem storage, review the Network and Storage settings available in the Desktop App and the Agent to optimize performance.