[NTG-context] Using ConTeXt-LMTX for modern Mathematically-Literate-Programming 1/2
Stephen Gaito
stephen at perceptisys.co.uk
Thu Dec 3 18:08:29 CET 2020
Hans,
Again many thanks for your thoughts! (See below....)
On Thu, 3 Dec 2020 13:15:28 +0100
Hans Hagen <j.hagen at xs4all.nl> wrote:
> On 12/3/2020 12:15 PM, Taco Hoekwater wrote:
> >
> >
> >> On 3 Dec 2020, at 11:35, Stephen Gaito <stephen at perceptisys.co.uk>
> >> wrote:
> >>
> >> Hans,
> >>
> >> As I said, my desktop is elderly... it has a 2.8GHz processor, 16GB
> >> of DDR3 memory, a couple of old SATA1 hard disks, and only 3MB
> >> of CPU cache...
> >>
> >> ... all well past its use-by date for single-threaded ConTeXt. ;-(
> >>
> >> So one way to get better performance for ConTeXt is to invest in a
> >> new ultra fast processor. Which will cost a lot, and use a lot of
> >> power which has to be cooled, which uses even more power....
> >
> > Startup time can be improved quite a bit with an SSD. Even a cheap
> > SATA SSD is already much faster than a traditional harddisk.
> > Doesn’t help with longer documents, but it could be a fairly cheap
> > upgrade.
>
> also, an empty context run
>
> \starttext
> \stoptext
>
> only takes 0.490 seconds on my machine, which means:
>
> - starting mtxrun, which includes quite a bit of lua plus loading the
> file database etc
> - loading mtx-context that itself does some checking
> - and then launches the engine (it could be integrated, but then we
> run into issues with fatal errors as well as initializations,
> so in the end it doesn't pay off at all)
> - the tex run means: loading the format and initializing hundreds of
> lua scripts, including all kinds of unicode related stuff
>
> so, the .5 sec is quite acceptable to me and i know that on a more
> recent machine it would go down to half of that
>
I agree that this is acceptable given what ConTeXt does... it has a
complex task, so it *will* take some time... that is OK.
> now, making a tex run persistent is not really a solution: one has to
> reset all kinds of counters, dimensions, etc., wipe node and token
> space, and one would also have to reset the pdf output, which
> includes all kinds of housekeeping state ... adding all the
> resetters and hooks for that (plus all the garbage collection needed)
> will never pay back, and a 'wipe all and reload' is way more
> efficient
I also agree: keeping a pool of "warm" running ConTeXt processes, as
you are essentially describing, would be nice... but I suspect the
complexity precludes this approach. Keep it simple... simply killing
and restarting ConTeXt as a new process is OK.
>
> of course, if I ever run into a scenario where I have to create
> tens of thousands of one/few page docs very fast i might add some
> 'reset the pdf state', because that is kind of doable with some extra
> code, but to be honest, no one ever came up with a project that had
> any real demands on the engine that could not be met (the fact that
> tex is a good solution for rendering doesn't mean that there is
> demand for it ... it is seldom on the radar of those who deal with
> that, who then often prefer some pdf library, also because quality
> doesn't really matter)
>
> these kinds of performance things are demand driven (read: i need a
> pretty good reason to spend time on them)
Understood.
>
> > I can’t comment on how to speed up the rest of what you are doing,
> > but generally multi-threading TeX typesetting jobs is so hard as to
> > be impossible in practise. About the only step that can be split off
> > is the generation of the PDF, and even there the possible gain is
> > quite small (as you noticed already).
>
> indeed, see above
>
> > Typesetting is a compilation job, so the two main ways to speed
> > things along are
> >
> > 1) split the source into independent tasks, like in a code compiler
> > that splits code over separate .c / .cpp / .m / .p etc. files,
> > and then combine the results (using e.g. mutool)
> >
> > 2) precompile recurring stuff (in TeX, that would mean embedding
> > separately generated pdfs or images)
> right
>
> (and we are old enough and have been around long enough to have some
> gut feeling about that)
>
I have a deep respect for both your vision, and experience in this
matter.
However, way back when you started ConTeXt, very few people would
have said it was possible, or worthwhile, to embed Lua in TeX....
Given that Lua *is* now embedded inside ConTeXt, I am simply making a
*crude* attempt to see if I can parallelize the overall ConTeXt
production cycle (without changing ConTeXt-LMTX itself).
"Fools rush in..."
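A crude sketch of what I mean, along the lines of Taco's split-and-merge suggestion (file names and chapter structure here are hypothetical; assumes `context` and `mutool` are installed):

```shell
# Typeset independent chapters concurrently, one ConTeXt process each.
for f in chapter-*.tex; do
  context --batchmode "$f" &
done
wait   # block until every background run has finished

# Stitch the per-chapter PDFs into one document with mutool.
mutool merge -o book.pdf chapter-*.pdf
```

Note that this naive split typesets each chapter in isolation, so cross-chapter references and continuous page numbering are not handled; those would need an extra pass or shared auxiliary data.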
> Hans
>
> ps. When it comes to performance of tex, lua, context etc it is no
> problem, when googling a bit, to run into 'nonsense' arguments of why
> something is slow ... so don't take it for granted, just ask here on
> this list
>
I am not distracted by this noise... Complex things take time... what I
am attempting is complex... and requires the best tool available... and
ConTeXt is the *best* tool available! Many many thanks for your vision
and work!
> -----------------------------------------------------------------
> Hans Hagen | PRAGMA ADE
> Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
> tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
> -----------------------------------------------------------------
> ___________________________________________________________________________________
> If your question is of interest to others as well, please add an
> entry to the Wiki!
>
> maillist : ntg-context at ntg.nl / http://www.ntg.nl/mailman/listinfo/ntg-context
> webpage  : http://www.pragma-ade.nl / http://context.aanhet.net
> archive  : https://bitbucket.org/phg/context-mirror/commits/
> wiki     : http://contextgarden.net
> ___________________________________________________________________________________