S3 Website Out Of Memory Error

I recently migrated my blog to Jekyll and wanted to use the s3_website gem to deploy it. Everything worked well out of the box until I actually ran the s3_website push command to deploy my site to the live S3 bucket.

The command would hang for a long time and eventually fail with the following error:

java.lang.OutOfMemoryError: GC overhead limit exceeded

After some digging I figured it out: I had turned on S3 access logging, which had filled a logs folder in my bucket with hundreds of thousands of log files. Because s3_website lists every object in the bucket before deploying, it chewed through all of its allocated memory and errored out.
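
If you want to confirm that a logs prefix like this is the culprit, you can count the objects under it with the AWS CLI. This is a quick sketch assuming a bucket named my-bucket and logs written under a logs/ prefix; adjust both to match your setup:

# List everything under logs/ and print the object count and total size
aws s3 ls s3://my-bucket/logs/ --recursive --summarize | tail -n 2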

The simple fix is to turn off logging in the S3 Bucket properties and delete the logs folder. Problem solved!
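
If you prefer to do the cleanup from the command line, something along these lines should work with the AWS CLI (again assuming a bucket named my-bucket and a logs/ prefix; substitute your own names):

# Disable server access logging by sending an empty logging configuration
aws s3api put-bucket-logging --bucket my-bucket --bucket-logging-status '{}'

# Delete all of the accumulated log objects under the logs/ prefix
aws s3 rm s3://my-bucket/logs/ --recursive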