luatex crash around page 1000
Hi,

I have a compilation problem with a data-driven document. This is what I get after around 12 min (first run):

[..]
pages > flushing realpage 991, userpage 970, subpage 991
pages > flushing realpage 992, userpage 971, subpage 992
pages > flushing realpage 993, userpage 972, subpage 993
mtx-context | fatal error: return code: -1073741571

It compiles fine if I just use the first or second "half" of the data. But as soon as I reach a certain number of pages, it crashes. An older version compiled fine with 958 user pages (977 total). It seems that the data has reached a size that causes a resource problem.

Any advice? I looked into "texmf.cnf" and "texmfcnf.lua" for possible bottlenecks, but I lack the knowledge of what to change. And it simply takes too much (run)time to just play around with some values.

Peter
context --version
mtx-context | ConTeXt Process Management 1.02
mtx-context |
mtx-context | main context file: r:/tex/texmf-context/tex/context/base/mkiv/context.mkiv
mtx-context | current version: 2019.04.04 13:31
luatex --version This is LuaTeX, Version 1.10.0 (TeX Live 2019/W32TeX)
On 4/11/2019 12:28 PM, Peter Rolf wrote:
Hi,
I have a compilation problem with a data-driven document. This is what I get after around 12 min (first run)...
that's pretty slow ...
[..]
pages > flushing realpage 991, userpage 970, subpage 991
pages > flushing realpage 992, userpage 971, subpage 992
pages > flushing realpage 993, userpage 972, subpage 993
mtx-context | fatal error: return code: -1073741571
It compiles fine if I just use the first or second "half" of the data. But as soon as I reach a certain number of pages, it crashes. An older version compiled fine with 958 user pages (977 total). It seems that the data has reached a size that causes a resource problem.
Any advice? I looked into "texmf.cnf" and "texmfcnf.lua" for possible bottlenecks, but I lack the knowledge of what to change. And it simply takes too much (run)time to just play around with some values.

You can try to bump values (trial and error), but for some you need to remake the format.
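As a hedged sketch of what "bump values" could look like: the variable names below are standard web2c texmf.cnf memory settings, but the numbers are illustrative guesses, LuaTeX allocates much of its memory dynamically and may ignore some of them, and several only take effect after the format is remade (e.g. with context --make):

```
% texmf.cnf fragment -- illustrative values only, not recommendations
main_memory = 12000000   % words of TeX main memory
pool_size   = 12000000   % string pool size
max_strings = 1000000    % maximum number of strings
hash_extra  = 600000     % extra space for the hash table
save_size   = 200000     % save stack for grouping levels
```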
Does lmtx also have that problem?

Hans

-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-----------------------------------------------------------------
On 4/11/2019 12:28 PM, Peter Rolf wrote:
I have a compilation problem with a data-driven document. This is what I get after around 12 min (first run)...
I'm still puzzled by this 12 min ... so less than one page per second ... do you need to process all that data each run? It's over a decade ago that I could start a run and come back after an enforced break to check. Is there some memory build-up (swapping)? Is the speed linear?

\enabletrackers[pages.timing]

will show the time spent per page.
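A minimal sketch of using that tracker (the \enabletrackers[pages.timing] call is from the thread; the dummy multi-page content is illustrative):

```tex
% enable per-page timing output in the log
\enabletrackers[pages.timing]

\starttext
    % generate a batch of pages so the per-page times are visible
    \dorecurse{50}{\input knuth \page}
\stoptext
```

With the tracker enabled, the log reports the time spent on each page, which makes it easy to see whether the run slows down as the page count grows.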
[..]
pages > flushing realpage 991, userpage 970, subpage 991
pages > flushing realpage 992, userpage 971, subpage 992
pages > flushing realpage 993, userpage 972, subpage 993
mtx-context | fatal error: return code: -1073741571
It compiles fine if I just use the first or second "half" of the data. But as soon as I reach a certain number of pages, it crashes. An older version compiled fine with 958 user pages (977 total). It seems that the data has reached a size that causes a resource problem.
Any advice? I looked into "texmf.cnf" and "texmfcnf.lua" for possible bottlenecks, but I lack the knowledge of what to change. And it simply takes too much (run)time to just play around with some values.

I'll add a \showusage command that reports some stats per page so that you can see what grows out of control.
Hans
participants (2)
- Hans Hagen
- Peter Rolf