This page last changed on Nov 29, 2006 by ivan@atlassian.com.

Problem

For large backup zip files (bigger than 1 GB), an OutOfMemoryError can occur during a restore, even though the maximum heap size is well above this value.

The error will look something like this:

Cause:
javax.servlet.ServletException: Servlet execution threw an exception
at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:275)
caused by: java.lang.OutOfMemoryError
at java.util.zip.ZipFile.open(Native Method)

However, when you look at the system information, you will find that plenty of heap memory is still available:

Memory Information:
Total Memory: 2480 MB
Free Memory: 2385 MB
Used Memory: 95 MB

Solution

The problem appears to be a bug in Java. The native method java.util.zip.ZipFile.open does not allocate from the heap at all: it maps the entire zip file into virtual memory outside the heap, so a large maximum heap leaves less virtual address space available for that mapping. If you run into this problem, reduce your maximum heap size to about 600 MB and try the restore again. This seems to accord with the experience of other developers:
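A minimal sketch of the code path involved is below. Constructing a java.util.zip.ZipFile invokes the native ZipFile.open, which is the call that fails with OutOfMemoryError on very large archives; the small temporary archive here is only an illustration, not a reproduction of the failure.

```java
import java.io.File;
import java.io.FileOutputStream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipFile;
import java.util.zip.ZipOutputStream;

public class ZipOpenDemo {
    public static void main(String[] args) throws Exception {
        // Create a small throwaway zip file so the example is self-contained.
        File f = File.createTempFile("demo", ".zip");
        try (ZipOutputStream zos = new ZipOutputStream(new FileOutputStream(f))) {
            zos.putNextEntry(new ZipEntry("hello.txt"));
            zos.write("hello".getBytes("UTF-8"));
            zos.closeEntry();
        }
        // new ZipFile(...) calls the native ZipFile.open, which maps the
        // archive into virtual memory outside the Java heap. For multi-GB
        // archives this mapping, not the heap, is what runs out of room.
        try (ZipFile zf = new ZipFile(f)) {
            System.out.println(zf.size()); // number of entries in the archive
        }
        f.delete();
    }
}
```

Because the mapping lives outside the heap, raising -Xmx cannot help; shrinking it (to roughly 600 MB, per the report below) is what frees up address space for the mapping.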

if you set a small value for max heap size, it works
correctly, but if you specify too large a value, then
OutOfMemoryErrors occur.

There is no obvious relationship between the max heap size,
the size of the zip file, and the computer's available
memory. With a max heap size less than about 600 MB, errors
never occur. Larger than that, and they occur. A 1.2 GB zip
file always opens correctly, but a 1.4 GB one never does (if
the max heap size is larger than 600 MB). I have tested
this on computers with both 256 MB of RAM and 2 GB of RAM,
and the behavior is nearly identical.

Related topics

Allocating more memory

Document generated by Confluence on Dec 03, 2008 15:04