I'm trying to define a large set (hundreds/thousands of records) from an .ML file instead of bloating my .thy file.
In my .thy file:
context MyDataset_A
begin
ML_file "MyDataset_generator.ML"
end
In the generated ML file:
structure MyDataset_Generator_Set = struct

val _ =
  Theory.setup (
    Local_Theory.map_background (fn lthy =>
      let
        (* the large set literal, parsed via the term antiquotation *)
        val MyDataset_generator_term = @{term
          "{{⟪Slabel1, Nat 42⟫, ⟪Slabel2, Bool True⟫}, {⟪Slabel1, Nat 55⟫, ⟪Slabel2, Bool False⟫},
            {⟪Slabel1, Nat 42⟫, ⟪Slabel2, Bool True⟫}, {⟪Slabel1, Nat 55⟫, ⟪Slabel2, Bool False⟫}} :: sdatum set"};
        (* introduce the constant MyDataset_generator together with its defining theorem *)
        val ((_, _), lthy') =
          Local_Theory.define ((Binding.name "MyDataset_generator", NoSyn),
            ((Thm.def_binding (Binding.name "MyDataset_generator_def"), []), MyDataset_generator_term)) lthy;
      in lthy' end))

end;
However, the ML file gives an error at the line where I have {{....}}:
Inner lexical error
Failed to parse term
What is causing this? Is it the use of @{term "..."} with large or nested terms? Is there a better way to load such constants?
Try
ML \<open>
@{term ....}
\<close>
in a theory file…
then you can minimize the term to see where it fails
(BTW, I suspect the issue is ⟫)
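For example, a minimal probe along those lines could keep just one record from the full set and grow it until the parse breaks again (the record syntax and the sdatum type are copied verbatim from the question, so they are assumed to be provided by MyDataset_A):
ML \<open>
  (* one record from the full set; add more until the error reappears *)
  val probe = @{term "{{⟪Slabel1, Nat 42⟫, ⟪Slabel2, Bool True⟫}} :: sdatum set"}
\<close>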
Mathias Fleury said:
(BTW, I suspect the issue is ⟫)
Thanks for the help, Mathias.
The @{term ...} approach works fine when I use ML ‹...› directly in a .thy file. But when I try to move the same definition into a separate .ML file (and include it via ML_file in my theory), I get a lexical error due to the Unicode angle brackets (⟪ ⟫), even though the same term works inline.
Do you happen to know a workaround for this? Or maybe an alternative way to load structured definitions from an external file into the theory context?
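One possible direction, sketched here under an assumption rather than taken from this thread: if ⟪x, y⟫ is just notation for an ordinary constructor, the .ML file can avoid the non-ASCII brackets entirely by spelling that constructor out. SEntry below is a hypothetical name standing in for whatever constant the notation abbreviates:
  (* SEntry is a hypothetical constructor name; substitute the real constant behind ⟪_, _⟫ *)
  val MyDataset_generator_term = @{term
    "{{SEntry Slabel1 (Nat 42), SEntry Slabel2 (Bool True)},
      {SEntry Slabel1 (Nat 55), SEntry Slabel2 (Bool False)}} :: sdatum set"};
The idea is only that an ASCII-only term string no longer depends on how the Unicode in the .ML file is decoded, which is where the inner lexical error appears to come from.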