From: Jesus Aransay <jesus-maria.aransay@unirioja.es>
Dear all,
As a different but related issue to my previous mail: when we try to
input matrices of dimensions over 250 * 250 (approx.) as elements of
type "iarray" (the IArray constructor applied to lists, see
http://isabelle.in.tum.de/library/HOL/HOL-Library/IArray.html) in
Isabelle (Isabelle/jEdit 2013) by means of the "definition" command,
processing becomes really slow (on the order of minutes) until the
editor crashes or we simply give up. Inputting the matrices directly
into the generated SML code works fine.
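For concreteness, the pattern is roughly the following sketch (a toy
3 * 3 instance with made-up names; the real definition spells out 250
rows of 250 entries each, and Isabelle2013 imports the theory as
"~~/src/HOL/Library/IArray" instead of the session-qualified name):

  theory Slow_Matrix
    imports "HOL-Library.IArray"
  begin

  (* toy instance of the problematic pattern; the real right-hand side
     spells out 250 rows of 250 entries each *)
  definition A :: "int iarray iarray" where
    "A = IArray [IArray [0, 1, 0], IArray [1, 1, 0], IArray [0, 0, 1]]"

  end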
Is there any way to improve this behavior?
Thanks for any hints,
Jesus
From: Tobias Nipkow <nipkow@in.tum.de>
Jesus, this is a known problem. If you are OK with inputting the matrix at the
ML level, just stick with it. Alternatively, you can look at
http://afp.sourceforge.net/browser_info/current/HOL/HOL-Library/Flyspeck-Tame/Arch.html
where some constants are defined at the ML level (in "Archives/*.ML") but
imported into Isabelle (as Tri, Quad, etc.). This way you may not need to edit
any generated ML code by hand.
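To give an idea of what such an ML-level import can look like, here is a
rough sketch (not the actual Flyspeck-Tame setup; all names are made up,
the theory is assumed to import HOL-Library's IArray, it is written with
current cartouche syntax where Isabelle2013 would use ML {* ... *}, and
the exact Local_Theory.define signature may differ between Isabelle
versions):

  ML ‹
    (* build the HOL term "IArray [IArray [...], ...]" from an ML-level
       int list list; in practice the data would come from an *.ML file *)
    fun mk_iarray elem_ty ts =
      Const (@{const_name IArray},
        Type (@{type_name list}, [elem_ty]) --> Type (@{type_name iarray}, [elem_ty])) $
      HOLogic.mk_list elem_ty ts;

    fun mk_matrix (rows : int list list) =
      mk_iarray (Type (@{type_name iarray}, [@{typ int}]))
        (map (fn r => mk_iarray @{typ int} (map (HOLogic.mk_number @{typ int}) r)) rows);
  ›

  local_setup ‹fn lthy =>
    let
      val b = Binding.name "test_matrix";
      val rhs = mk_matrix [[0, 1], [1, 0]];  (* toy data *)
      val (_, lthy') =
        Local_Theory.define ((b, NoSyn), ((Thm.def_binding b, []), rhs)) lthy;
    in lthy' end
  ›

The point is that the large right-hand side never goes through the
inner-syntax parser or type reconstruction; it is built directly as a
term in ML.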
Tobias
From: Makarius <makarius@sketis.net>
I have experimented with this a little; see my version of
Random_Binary_Matrix.thy, included here. This takes about 45s on
my cheap i7 machine.
My first idea was to bypass the concrete syntax parser via a parse
translation, and turn a numeral token directly into some pre-term. It
turned out that parsing alone takes only 30s for your example, but the
remaining "check" phase for type reconstruction is rather slow (lots of
unification of types in several stages). I also tried some refinements of
the syntax translations, to produce fewer concrete constants (many bounds
with some beta redexes, to simulate a local let within the term); another
refinement was to avoid polymorphism in the constants, but neither helped
very much.
So the second approach uses "make_bits" with the "define" function in
Isabelle/ML. Such plain Isabelle/ML operations are less cryptic than
syntax operations. Its remaining slowness is due to generous
re-certification of terms in certain parts of the system for basic
definitions. You will probably face other, more serious problems when
working with the huge formal bits by other means, probably also in the
code generator.
Here you see that Isabelle was not built with big data applications in
mind. Scalability to arbitrarily large input is never free: it needs to
be addressed explicitly in the implementation, and usually involves
trade-offs concerning other things. Isabelle is better at ridiculous
generality than ridiculous speed.
Makarius
Random_Binary_Matrix.thy
From: Makarius <makarius@sketis.net>
I can say more when you show me the concrete example, to get a better idea
of the usual "problem class" that you have here.
Generally, the default inner syntax of Isabelle (for terms and types) is
made as flexible and powerful as feasible, which also means it does not
scale to arbitrarily large input. There is no fundamental problem here,
since that is only the default to get terms into the system, and there are
many other ways. You have already figured out some regular Isabelle/ML
interfaces to do that.
An alternative is to apply some tricks to import "blobs" directly into the
term language, while staying within the surface end-user syntax. This could
work via your own syntax translations, e.g. like this:
term "BIG_THING ''...''"
where ''...'' is an inner string token that is somehow turned into a term.
You can then apply different parser technology to operate on the given
''...'' string. (Technically, there is a 64 MB size limit for such
strings.)
In principle you could also load a file into the term, although that would
pose some questions about file-system access and file versions within the
document model underlying the editor. (As a first approximation one
would probably just use File.read and call it "unmanaged" file-system
access, i.e. assuming you don't edit that file while working with the
Prover IDE.)
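A sketch of that first approximation (written with current cartouche
syntax; the file name is a placeholder, and the whitespace-separated
integer format is merely an assumption about how the data is stored):

  ML ‹
    (* "unmanaged" read of the raw data *)
    val raw = File.read (Path.explode "/path/to/matrix_data.txt");

    (* parse into an int list list; empty lines and fields are dropped *)
    val rows =
      split_lines raw
      |> filter (fn line => line <> "")
      |> map (map (the o Int.fromString) o filter (fn s => s <> "") o space_explode " ");
  ›

The resulting ML value could then be turned into an Isabelle constant
along the lines of the ML-level definitions mentioned earlier in the
thread.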
Makarius
From: Jesus Aransay <jesus-maria.aransay@unirioja.es>
Dear Makarius,
in the attached file you can find a simple example of a list of lists
representing a 250 * 250 matrix, which takes a really long time to be
processed in Isabelle2013 (also in the repository version); actually,
I gave up after a few minutes of processing without success.
We are using this kind of list, usually with every list wrapped in
the IArray type constructor
(http://isabelle.in.tum.de/library/HOL/HOL-Library/IArray.html), in
order to use IArrays to implement vectors and matrices.
Using ML definitions of this size or even larger takes almost no
time, so as an alternative solution we are directly executing the ML
code and definitions.
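(For illustration only, the shape of such an ML-level value; the real
one has 250 rows of 250 entries:)

  ML ‹
    (* accepted directly by the ML environment, with no inner-syntax
       parsing or Isabelle type reconstruction involved *)
    val m : int list list = [[0, 1, 0], [1, 1, 0], [0, 0, 1]];
  ›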
Thanks for any hint,
Jesus
Random_Binary_Matrix.thy