Hi,

Here are the highlights of today's update:

- somewhat more compact tuc files: not for all documents, but the savings can accumulate, and less memory is used as well; I could bring the tuc file of an extreme 2000 page, 5 column document down to 5% of its original 70 MB, and for the luametatex manual it reduced the tuc by more than 30%; hard to tell if there will be an associated performance hit, but I'm sure Thomas will complain if that's the case (see the sketch after this message)
- more mp-tex-lua interfacing upgraded, plus an extra preliminary chapter for luametafun about extensions - for Taco
- slightly more compact cache files for fonts with many (pseudo) ligatures; hopefully no side effects (nothing that can't be fixed fast if noticed); quite probably no performance hit and maybe even some room for optimization (not done yet)
- a split in the cache directory for luametatex so that we can more easily experiment without interference (so, although the above works in mkiv, it is not enabled there currently)
- some minor things (also in the process of splitting the codebase)
- no real changes in / additions to the luametatex binary (we're in cosmetics mode now)

Hans

-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-----------------------------------------------------------------
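The first item concerns the tuc file, the Lua table of multipass data (references, lists, positions) that ConTeXt writes after a run and reads back on the next one. As a rough, purely hypothetical illustration of why position tracking makes such a file grow, here is a toy table in the same spirit; the field names are invented for the example and do not reflect the real tuc layout.

  -- toy, made-up model of multipass data; field names are invented
  local multipass = {
      references = {
          ["sec:intro"]   = { realpage = 1,  kind = "section" },
          ["fig:results"] = { realpage = 42, kind = "float"   },
      },
      positions = { },
  }

  -- once position tracking is enabled, something is stored per tracked spot;
  -- for a 2000 page, 5 column document that easily means tens of thousands
  -- of records, each repeating its key names when written out verbosely
  for page = 1, 2000 do
      for column = 1, 5 do
          multipass.positions[#multipass.positions + 1] = {
              p = page, c = column, x = 0, y = 0, w = 0, h = 0,
          }
      end
  end

  print(#multipass.positions .. " position records")  -- 10000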
On Thu, 19 Aug 2021, Hans Hagen via ntg-context wrote:
- somewhat more compact tuc files: not for all documents, but the savings can accumulate, and less memory is used as well; I could bring the tuc file of an extreme 2000 page, 5 column document down to 5% of its original 70 MB, and for the luametatex manual it reduced the tuc by more than 30%; hard to tell if there will be an associated performance hit, but I'm sure Thomas will complain if that's the case
I never realized that tuc files can grow so big. For big documents, would it make sense to simply read and write zipped tuc files?

Aditya
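For concreteness, here is a minimal sketch of the zipped variant, assuming an lzlib-style binding (zlib.compress / zlib.decompress) such as the one bundled with luatex; whether luametatex exposes the same interface is not checked here, and the file name and payload are placeholders.

  -- sketch: writing and reading a zlib compressed tuc payload
  local zlib = zlib or require("zlib")   -- lzlib-style API assumed

  local function save_compressed(filename, serialized)
      local f = assert(io.open(filename, "wb"))
      f:write(zlib.compress(serialized, 9))   -- level 9: smallest output
      f:close()
  end

  local function load_compressed(filename)
      local f = assert(io.open(filename, "rb"))
      local data = f:read("*a")
      f:close()
      return zlib.decompress(data)
  end

  -- usage: the payload would be the serialized multipass table
  save_compressed("mydocument.tuz", "return { positions = { } }")
  print(load_compressed("mydocument.tuz"))

This shrinks the bytes on disk, but, as the reply below points out, the table still has to be serialized and parsed in full, so the serialization overhead remains.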
On 8/19/2021 10:07 PM, Aditya Mahajan via ntg-context wrote:
I never realized that tuc files can grow so big. For big documents, would it make sense to simply read and write zipped tuc files?

Normally they are not that large, but when you enable for instance mechanisms that need positioning they can grow large .. zipping makes for fewer bytes but still large files, and the overhead of serialization stays.
(To some extent, trying to make these things small is like compression, but in a different way .. could be a nice topic for a ctx meeting.)

Hans
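To give a rough, purely hypothetical flavor of "compression in a different way": instead of writing every record as a small hash that repeats its key names, the writer and reader can agree on a field order once and store plain arrays. All names and numbers below are invented for the illustration and say nothing about how the actual tuc serializer works.

  -- verbose form: every record repeats its key names when serialized
  -- (kept only for comparison with the compact form)
  local verbose = {
      ["pos:1"] = { realpage = 1, x = 7227600, y = 41094720, w = 28349280 },
      ["pos:2"] = { realpage = 1, x = 7227600, y = 39094720, w = 28349280 },
  }

  -- compact form: the field order is declared once, records are plain arrays
  local compact = {
      fields  = { "realpage", "x", "y", "w" },
      records = {
          ["pos:1"] = { 1, 7227600, 41094720, 28349280 },
          ["pos:2"] = { 1, 7227600, 39094720, 28349280 },
      },
  }

  -- expanding a compact record back into a keyed table on demand
  local function expand(store, tag)
      local record, result = store.records[tag], { }
      for i, field in ipairs(store.fields) do
          result[field] = record[i]
      end
      return result
  end

  local p = expand(compact, "pos:2")
  print(p.realpage, p.y)   -- 1   39094720

The same information is kept, but the repeated key names disappear from the serialized form, which is the kind of saving that does not depend on a byte-level compressor.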