On 9/9/2021 12:52 AM, Marcus Vinicius Mesquita via ntg-context wrote:
> Dear list,
>
> In the MWE below, ConTeXt lmtx throws an error with the message in the
> title:
>
> \starttext
> \startluacode
>
> local upperlimit = 90000
>
> context.starttabulate({ "|l|l|l|" })
> for i=1,upperlimit do
> context.NC()
> context("word 1")
> context.NC()
> context("word 2")
> context.NC()
> context("word 3")
> context.NC()
> context.AR()
> end
> context.stoptabulate()
>
> \stopluacode
> \stoptext
>
> But it compiles with no problems with MKIV.
>
> context version: 2021.09.06 11:47
>
> How to avoid this with lmtx?
In luatex there is also a configured maximum on memory usage, but because luametatex has larger nodes you hit the limits earlier. The memory allocation in luametatex is also somewhat different (the larger nodes are compensated by savings elsewhere, so in the end the memory usage for this run is similar):
              luatex   luametatex
  pages         2196         2196
  pages/sec        8           11
  time (s)       278          195
  mem           1.1G         1.1G
Now, luametatex reports:

  tex memory > bumping category 'node' succeeded, details:
  all=400500000 | ini=0 | max=50000000 | mem=44500000 | min=1000000 |
  ptr=-843956 | set=50000000 | stp=500000 | top=43999999
You can bump the max in the configuration file, and also the step (these correspond to the max= and stp= values in the report above). As you can see, luametatex runs quite a bit faster on this test, but that is due to other differences between the two engines.
The reason it does run on my machine is that I have this:
\enableexperiments [tabulatesparseskips]
\enableexperiments [tabulateusesize]
This makes tabulate a bit more efficient in terms of node usage, so I stay some 20% below the configured max ... you can try running with these options. Of course, adding more text will also demand more memory, but then you can always bump the max (in a configuration file in texmf-local or so; maybe someday I will add a --huge flag to the runner).
Hans
-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-----------------------------------------------------------------