From: Gerwin Klein <gerwin.klein@nicta.com.au>
I am currently updating a very large theory from Isabelle2009-1 to Isabelle2009-2. So far this update has been less painful than the previous one, but I seem to be stuck now:
I am getting various versions of insufficient-memory exceptions from polyml-5.3 after the theory has been fully processed, at the point (I presume) where Poly/ML is writing the image to disk, possibly while it is doing garbage collection.
A sample of the log file output is below (the log file is, by the way, gzipped and the exit code is success, but the image is never written):
lemma ... [last lemma of thy]
val it = () : unit
val it = () : unit
ML> Exception- Fail "Insufficient memory" raised
ML>
The platform is polyml-5.3.0_x86-linux
with
ISABELLE_USEDIR_OPTIONS="-M 1 -p 0 -q 0 -v true"
and unchanged standard ML settings.
The machine has 8GB of RAM and otherwise works fine for other big developments.
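For reference, these options live in my Isabelle settings file; the following is only a sketch of the relevant lines (ML_OPTIONS is just the untouched distribution default, not something I changed):

  # $ISABELLE_HOME_USER/etc/settings (sketch; only the relevant lines)
  ISABELLE_USEDIR_OPTIONS="-M 1 -p 0 -q 0 -v true"
  # ML_OPTIONS is handed to the Poly/ML runtime; currently left at the standard default
  #ML_OPTIONS="..."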
On polyml-5.3.0_x86-darwin, I am getting the same with the same settings. With -M 2, I get a C-level exception in Poly/ML at the same point, with a message on stdout/stderr telling me to set a breakpoint in a malloc function (but the same exception in the log).
On polyml-5.3.0_x86_64-darwin, with -M 1 I am getting the same log, and this message on stdout:
Building CKERNEL ...
poly(5949,0xa0a3e500) malloc: *** mmap(size=134217728) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
Finished CKERNEL (0:32:51 elapsed time, 0:32:47 cpu time, factor 0.99)
(possibly the same as on the 32-bit version, but I've lost the previous output; I can reproduce it if needed)
With -M 3 and -M 2 on 64-bit Darwin, I am instead getting exceptions about not being able to start another thread (at the same point). Again, the machine has 8GB of RAM and is otherwise rock solid.
In Isabelle2009-1, the same image builds easily under 32-bit and 64-bit Poly/ML 5.3.0 (the same version) on machines with less than 4GB of memory.
Any ideas?
I've tried playing around with -H and the --mutable/--immutable settings, without success, but I don't have good guidance on what to try.
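For completeness, I've been passing these via ML_OPTIONS in the settings file. The sizes below are purely illustrative, not recommendations, and the exact Poly/ML 5.3 option syntax is best checked against the Poly/ML documentation:

  # sketch: larger initial heap plus explicit mutable/immutable area sizes (in MB)
  # illustrative values only; adjust or drop options as appropriate
  ML_OPTIONS="-H 1000 --mutable 800 --immutable 2000"

None of the combinations I tried made a difference at the image-writing stage.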
Cheers,
Gerwin
From: Gerwin Klein <gerwin.klein@nicta.com.au>
One more bit of information:
Tracing memory consumption in an interactive session for the same theory shows about 1.3GB of "real" memory and 2.3GB of virtual memory in use (on 32-bit Poly/ML on Mac OS X).
This would be consistent with the theory working fine on much smaller machines before.
Now I have even less of an idea where the error messages are coming from, though.
Cheers,
Gerwin