This section provides information on highly populated directories and large file management.
Highly populated directories can cause a small decline in performance at around 50,000 files in a single directory, and a significant performance drop at over 100,000 files. Above 100,000 files, backups may fail, and the filesystem itself may become corrupted, leading to data loss and site downtime.
You can refactor your file structure to optimize site performance on Pantheon if you have individual directories with tens of thousands of files (for example, an image repository). If refactoring is not a possibility, you may wish to offload the files to a dedicated external filesystem like Amazon S3 or Google Cloud Storage.
To prevent this issue going forward, both WordPress and Drupal can organize uploaded content into different directories based on the date or user, which is preferable to placing all uploads in a single directory. Refactoring an existing large-scale site with this issue is usually a matter of rearranging the files, then updating the files table in Drupal or WordPress.
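As a sketch of how to enable date-based upload directories, WordPress exposes this behavior through its core `uploads_use_yearmonth_folders` option, which can be toggled with WP-CLI; in Drupal, file fields accept date tokens in their "File directory" setting:

```shell
# WordPress: store new uploads in year/month subdirectories
# (e.g. wp-content/uploads/2024/05/) instead of one flat directory.
wp option update uploads_use_yearmonth_folders 1

# Drupal: no single CLI switch exists; instead, set a date-token pattern
# in each file field's "File directory" setting, for example:
#   images/[date:custom:Y]/[date:custom:m]
```

Note that this only affects files uploaded after the change; existing files stay where they are until you rearrange them.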
A code repository larger than 2GB increases the possibility of Git errors when committing code on Pantheon. Review the options below to improve performance:
- Keep multimedia assets out of the repository by moving files to a media file storage service, such as Amazon S3, and using version control to track URLs.
- Reduce the size of your repository if it is over 2GB and is causing problems (such as errors when cloning).
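To find out which files are inflating the repository, you can list the largest blobs in its history. The pipeline below is a common Git idiom, run from inside the repository; candidates it surfaces (videos, archives, images) are usually the assets worth moving to external storage:

```shell
# List the ten largest objects ever committed, with size in bytes and path.
git rev-list --objects --all |
  git cat-file --batch-check='%(objecttype) %(objectname) %(objectsize) %(rest)' |
  awk '$1 == "blob" { print $3, $4 }' |
  sort -rn |
  head -10
```

Removing such files from history (for example with `git filter-repo`) is what actually shrinks the repository; deleting them in a new commit does not.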
The Pantheon Filesystem and file serving infrastructure are not optimized to store and deliver large files.
- Files over 50MiB can be uploaded with WordPress, Drupal, or SFTP, but you will experience noticeable degradation in performance.
- Files over 100MiB cannot be uploaded through WordPress or Drupal. You must add files of this size by SFTP or rsync.
- Files over 256MiB are not supported and cannot be stored on the Pantheon Filesystem.
| File Size | Platform Compatibility | Notes |
|-----------|------------------------|-------|
| ≤ 100MiB | ✔ | Can be uploaded via any means |
| 100MiB – 256MiB | ✔ | Must be uploaded over SFTP or rsync |
| > 256MiB | ❌ | Must be hosted via a 3rd-party CDN |
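For files in the 100MiB–256MiB range, an rsync upload to the environment's `files/` directory might look like the following sketch. The `SITE_UUID` and environment values are placeholders you must replace with your own site's details:

```shell
# Upload one large file to a Pantheon environment's file directory.
# Replace ENV and SITE_UUID with your environment name and site UUID.
ENV=dev
SITE_UUID=your-site-uuid
rsync -rLvz --size-only --ipv4 --progress -e 'ssh -p 2222' \
  ./large-file.zip \
  "$ENV.$SITE_UUID@appserver.$ENV.$SITE_UUID.drush.in:files/"
```

The `--size-only` flag keeps repeated runs fast by skipping files whose sizes already match on the remote side.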
If you are distributing large binaries or hosting big media files, we recommend using a CDN like Amazon S3 as a cost-effective file serving solution that allows uploads directly to S3 from your site without using Pantheon as an intermediary.
- Drupal sites can use a module such as S3 File System.
- WordPress sites can use plugins such as S3 Uploads or WP Offload Media.
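As one example, WP Offload Media (published on wordpress.org under the slug `amazon-s3-and-cloudfront`) can be installed with WP-CLI:

```shell
# Install and activate the WP Offload Media Lite plugin.
wp plugin install amazon-s3-and-cloudfront --activate
```

Each plugin or module still needs bucket credentials configured before it starts offloading uploads.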
Even when using an external CDN to host files, you cannot upload files over 100MiB through the CMS. Upload these files directly to the CDN instead. Refer to Amazon's documentation on uploading to an S3 bucket for more information.
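A direct upload to S3 with the AWS CLI bypasses the CMS entirely; in this sketch, `my-bucket` and the file path are placeholders:

```shell
# Copy a large file straight into an S3 bucket; the CLI automatically
# switches to multipart upload for large objects.
aws s3 cp ./large-video.mp4 s3://my-bucket/videos/large-video.mp4
```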
Uploading large files over a slow local internet connection can cause the process to hit our Connection Timeout of 59 seconds. For example, a 10MiB file uploaded on a 2Mbps connection may take too long and fail. You can use an upload time calculator to help determine whether your local internet connection is impeding file uploads to Pantheon.
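The estimate behind that example is straightforward to compute yourself. The snippet below converts a file size in MiB and a link speed in Mbps into seconds (1 MiB = 1,048,576 bytes; 1 Mbps = 1,000,000 bits/s):

```shell
# Estimate how long an upload takes: 10MiB over a 2Mbps link.
size_mib=10
speed_mbps=2
awk -v s="$size_mib" -v r="$speed_mbps" \
  'BEGIN { printf "%.0f seconds\n", (s * 1048576 * 8) / (r * 1000000) }'
# Prints "42 seconds" for these values; a result near or above the
# 59-second timeout means the upload is likely to fail.
```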
Large backups take longer, use more resources, and are more likely to fail. A 100GiB compressed tarball is not a practical artifact. For this reason, sites with footprints over 200GiB or more than two million files cannot have their files backed up (code and database are still backed up as normal).
Despite the lack of backups, file content is highly durable and stored on multiple servers.