Hi! We’ve been working on adding LuaMetaTeX and ConTeXt standalone support for the Markdown package (https://github.com/Witiko/markdown/pull/557) and encountered an interesting issue. The package loads quite a large library, `expl3-code.tex` from the `l3kernel` package, which eats a lot of time during compilation. What would be a good way to mitigate this? Is there a way to pre-load the library before compilation? Perhaps by making a custom ConTeXt format and building it with `context --make`? Unfortunately, I haven’t found any documentation on how to make your own custom formats, hence this question. Thank you in advance!

P.S. What is the process of adding third-party modules to the default list of the ConTeXt distribution (https://modules.contextgarden.net/)? I think the Markdown package might prove very useful for a lot of folks and is worth considering for the list. What do you think?
On 7/27/2025 4:59 PM, andrei@borisov.dev wrote:
Hi!
We’ve been working on adding LuaMetaTeX and ConTeXt standalone support for the Markdown package (https://github.com/Witiko/markdown/pull/557) and encountered an interesting issue.
The package loads quite a large library, `expl3-code.tex` from the `l3kernel` package, which eats a lot of time during compilation.
What would be a good way to mitigate this? Is there a way to pre-load the library before compilation? Perhaps by making a custom ConTeXt format and building it with `context --make`?
The likelihood of interfering is too big to even consider it. Loading order matters and we also have some safeguards to consider. Calling it context would also be confusing, and the context runner and ecosystem assume certain names for formats and alike (also because we have a multilingual interface).
Unfortunately, I haven’t found any documentation on how to make your own custom formats, hence this question.
There is no such option and there won't be. Quite some effort went into making the format overhead as small as possible, so it kind of defeats the concept. Normally TeX is fast enough to load files.

As a side note: generating the context formats takes < 2 seconds once the operating system has cached files, of course some more when a machine comes from sleep. Also, we have experimented with no format at all, and even then the startup overhead on a run would be < 1.5 seconds on a modern machine (for metapost the format, the mem file, has already been dropped). We're talking about some 1000 tex and lua files. One main reason for keeping the format is that it still pays off, especially on network shares, due to the number of files involved.

So, if loading some latex code that you need takes too long, then there is some issue with that code. Some time ago (when that performance problem was reported) I tested that, and loading this expl code indeed takes some 9 seconds on my (somewhat older) machine, but that is mostly because it loads unicode and backend code and maybe more that is of no use from the perspective of context, which has all of that on board. So maybe that can be skipped? Or maybe that code is just not integrated well; after all, if we had such an issue in context, we'd use cached data (so it's more of a conceptual expl problem, but it's not up to me to deal with that).

Actually, one can wonder what expl3 brings to context at all; some intermediate layer like that just doesn't fit in. We try to make context as fast as possible, and adding some layer doesn't help. (And personally I really dislike the look and feel of it and I'd have dropped out of tex usage if I had to use that; but that is personal of course and therefore irrelevant.)

I suppose you need very little of that latex layer in supporting context, if anything at all, right? Do you need some extra helpers in context? So, can't you avoid loading the bottleneck expl code? Skip the unicode stuff as a start? You probably then end up below a second.
Boosting that code (by looking at it i might spot some) is not on my agenda.
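[Editorial note: the "cached data" Hans mentions refers to the mechanism ConTeXt itself uses, e.g. for font data. As a rough sketch, not code from this thread, a module could cache an expensive parse result between runs via the containers API; the category name "markdown" and the function expensive_parse are made-up placeholders:]

```lua
-- Hypothetical sketch: cache expensive parsed data under TEXMFCACHE
-- using ConTeXt's containers mechanism (only runs inside ConTeXt).
local cache = containers.define("markdown", "parsed", 1, true)

local function get_parsed(name, source)
    -- try the on-disk cache first
    local data = containers.read(cache, name)
    if not data then
        data = expensive_parse(source)      -- placeholder for the real parser
        containers.write(cache, name, data) -- serialize for the next run
    end
    return data
end
```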
Thank you in advance!
P.S. What is the process of adding third-party modules to the default list of the ConTeXt distribution (https://modules.contextgarden.net/)? I think the Markdown package might prove very useful for a lot of folks and is worth considering for the list. What do you think?
___________________________________________________________________________________ If your question is of interest to others as well, please add an entry to the Wiki!
maillist : ntg-context@ntg.nl / https://mailman.ntg.nl/mailman3/lists/ntg-context.ntg.nl
webpage  : https://www.pragma-ade.nl / https://context.aanhet.net (mirror)
archive  : https://github.com/contextgarden/context
wiki     : https://wiki.contextgarden.net
___________________________________________________________________________________
--
-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-----------------------------------------------------------------
On 27.07.25 at 16:59, andrei@borisov.dev wrote:
The package loads quite a large library, `expl3-code.tex` from the `l3kernel` package, which eats a lot of time during compilation.
What has expl3 to do with ConTeXt?
P.S. What is the process of adding third-party modules to the default list of the ConTeXt distribution (https://modules.contextgarden.net/)? I think the Markdown package might prove very useful for a lot of folks and is worth considering for the list. What do you think?
See https://wiki.contextgarden.net/ConTeXt_and_Lua_programming/Module_writing

If it works and adheres to the necessary structure, I’ll add it to the list of installable modules. I don’t judge usefulness.

I’d use Pandoc to convert Markdown to ConTeXt if it’s just one-time, like:

    pandoc -f markdown -t context --template=mytemplate.tex pandoc-example.md > example.tex

Otherwise, Aditya’s filter module covers most cases of inclusion of text-based formats: https://github.com/adityam/filter

Hraban
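[Editorial note: for repeated inclusion rather than a one-time conversion, the Pandoc route can be wired into a document through the filter module. A minimal sketch, untested here and assuming the module's documented filtercommand interface; the environment name "markdown" is a free choice:]

```tex
\usemodule[filter]

\defineexternalfilter
  [markdown]
  [filtercommand={pandoc -f markdown -t context
     -o \externalfilteroutputfile\space \externalfilterinputfile}]

\starttext
\startmarkdown
# Hello

Some *Markdown* text, converted by Pandoc on each run.
\stopmarkdown
\stoptext
```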
On 7/27/2025 6:07 PM, Henning Hraban Ramm wrote:
On 27.07.25 at 16:59, andrei@borisov.dev wrote:
The package loads quite a large library, `expl3-code.tex` from the `l3kernel` package, which eats a lot of time during compilation.
What has expl3 to do with ConTeXt?
nothing, it's something latex; as far as i understood (from some talks at bachotex) it's some layer that hides regular tex from the user (aside: so why use tex then anyway). kind of sounded/looked like the 4th generation languages of the 80s-90s, hiding the lower level language; but not suitable for my brain and mindset, though that is of course personal.

the lua-tex-metapost combination in context is more 3rd gen, but then also more fun, especially after some more programming related primitives showed up in luametatex (and it works great with syntax highlighting as defined in the context/scite mode). of course we do have some intermediate layer helpers, but most is programmed in just tex-the-language.

(of course a related question then is: what do context and latex have in common anyway .. very little i guess)

Hans
On 27.07.25 at 21:15, Hans Hagen via ntg-context wrote:
On 7/27/2025 6:07 PM, Henning Hraban Ramm wrote:
On 27.07.25 at 16:59, andrei@borisov.dev wrote:
The package loads quite a large library, `expl3-code.tex` from the `l3kernel` package, which eats a lot of time during compilation.
What has expl3 to do with ConTeXt?
nothing, it's something latex,
I know; I use it (or avoid touching it) with LaTeX.
(of course a related question then is: what do context and latex have in common anyway.. very little i guess)
Yes, there’s an intersection that sometimes grows a bit (LaTeX people using more Lua), but mostly shrinks (LuaMetaTeX evolving).

Hraban
Hi Hraban,
If it works and adheres to the necessary structure, I’ll add it to the list of installable modules. I don’t judge usefulness.
We should appreciate efforts, so indeed no judgements (unless of course something doesn't fit into context, or is overloading or conflicting with existing functionality, but that is hard to determine and we have no time to check code anyway).
I’d use Pandoc to convert Markdown to ConTeXt, if it’s just one-time, like: pandoc -f markdown -t context --template=mytemplate.tex pandoc-example.md > example.tex
And if it is one-time, then I suppose you do some cleanup and adaptation? (Makes me wonder: does converting to e.g. xml or docbook give a richer starting point? After all, we can handle that.)
Otherwise, Aditya’s filter module covers most cases of inclusion of text-based formats: https://github.com/adityam/filter
I found back some old m-markdown files (more a historic thing, as i never was much into markdown apart from once speeding up and mem-fixing some third party lua code), so i'll add these to the distribution, which then means that with the proposed new module users have plenty of possibilities. It reminds me that we need to follow up on the asciidoc support presented at meetings in order to get the repertoire covered.

Hans
On 27.07.25 at 21:57, Hans Hagen via ntg-context wrote:
Hi Hraban,
If it works and adheres to the necessary structure, I’ll add it to the list of installable modules. I don’t judge usefulness.
We should appreciate efforts, so indeed no judgements (unless of course something doesn't fit into context, or is overloading or conflicting with existing functionality, but that is hard to determine and we have no time to check code anyway).
I’d use Pandoc to convert Markdown to ConTeXt, if it’s just one-time, like: pandoc -f markdown -t context --template=mytemplate.tex pandoc-example.md > example.tex
And if it is one-time then I suppose you do some cleanup and adaptation?
Yes, because I like clean, semantic code, and that’s not what you get from an automatic conversion. But if you don’t need more than Markdown offers, then the Pandoc-ConTeXt route is simple and usable.
(Makes me wonder: does converting to e.g. xml or docbook give a richer starting point? After all, we can handle that.)
I guess Götz would vote for Asciidoc as the best source format. I don’t think another conversion makes sense. If you have clean XML sources (e.g. HTML), then of course it makes sense to use them. But either they already contain all the necessary metadata (see Massi’s projects for how difficult that can get), or you must convert to ConTeXt and edit. That’s what I usually do with my docx2ctx workflow. If I control and carefully edit the source (docx/ods) document, I don’t need to edit the TeX code afterwards. At least _I_ can’t do that by directly processing the XML of one of these word processor formats; they need too much cleanup.

Keith told us at the meeting in Lutten how he converts from ODS to ePub and processes the HTML from that with ConTeXt. (It’s also in the upcoming journal.) At least that’s easier than parsing office XML.
Otherwise, Aditya’s filter module covers most cases of inclusion of text-based formats: https://github.com/adityam/filter
I found back some old m-markdown files (more a historic thing as i never was much into markdown apart from once speeding up and mem-fixing some third party lua code) so i'll add these to the distribution which then means that with the proposed new module users then have plenty possibilities. It reminds me that we need to follow up on the asciidoc presented at meetings in order to get the repertoire covered.
I once had the markdown module in my list and wondered where it went. ;) Hraban
Hi Andrei, Hans, Hraban,

On Sun, 2025-07-27 at 17:59 +0300, andrei@borisov.dev wrote:
We’ve been working on adding LuaMetaTeX and ConTeXt standalone support for the Markdown package (https://github.com/Witiko/markdown/pull/557) and encountered an interesting issue.
The package loads quite a large library, `expl3-code.tex` from the `l3kernel` package, which eats a lot of time during compilation.
What would be a good way to mitigate this?
Some options:

- Manually install v2.13.0 of the Markdown package, which doesn't require expl3.
- Use pandoc: https://github.com/gucci-on-fleek/lua-widow-control/blob/b08ddbcd/docs/manua...
- Convince Vit to rewrite the ConTeXt interface in Lua, similar to https://github.com/Witiko/markdown/issues/215#issuecomment-1359250887
- Use the builtin Markdown module, which was just updated in yesterday's release.
Is there a way to pre-load the library before compilation? Perhaps by making a custom ConTeXt format and building it with `context --make`?
Unfortunately, I haven’t found any documentation on how to make your own custom formats, hence this question.
This is a terrible idea, and you definitely should not recommend it to any of your users, but if you're just looking for a hack to speed up your CI/testing, it is possible.

The trick is to override one of ConTeXt's core files (which is why this is a terrible idea). "libs-ini.mkxl" is a good choice for this since it's loaded near the end and isn't used by most documents. So, make a file called "$TEXMFHOME/tex/context/third/libs-ini.mkxl" with the following contents:

    \directlua{function pdf.getcreationdate() end} % expl3 bug
    \usemodule[expl3-generic]

and then you can run

    $ TEXMFHOME='{/PATH/TO/TEXLIVE/texmf-dist/tex/latex-dev/,/PATH/TO/TEXMFHOME/}' context --make

You need to put "latex-dev" into the search path since only the prerelease expl3 versions currently work with ConTeXt.

Again, I don't recommend this since it is very likely that a future ConTeXt update will break this, and building expl3 into the format might break core parts of ConTeXt, but it's certainly possible.
What is the process of adding third-party modules to the default list of the ConTeXt distribution (https://modules.contextgarden.net/)?
It's fairly simple; you essentially just need to sign up for an account and then upload a zip file. However, the ConTeXt Standalone Distribution doesn't include expl3, so adding the Markdown module alone probably wouldn't be very useful.

On Sun, 2025-07-27 at 17:53 +0200, Hans Hagen via ntg-context wrote:
Actually, one can wonder what expl3 brings to context at all; some intermediate layer like that just doesn't fit in.
No one would ever write a ConTeXt-native module using expl3, but if you're using it for a LaTeX package anyway, then the ConTeXt support comes for "free" (from the developer's side; you'll still pay for it in a much slower runtime).
So, can't you avoid loading the bottleneck expl code? Skip unicode stuff as a start? You probably then end up below a second. Boosting that code (by looking at it i might spot some) is not on my agenda.
Most of the Markdown module is written in Lua; only the TeX interface parts use expl3. I suspect that the easiest option would be to simply rewrite the TeX interface for ConTeXt using Lua.

I actually tried doing this in March 2024 (and then got distracted and just switched to using Pandoc); the Markdown module has changed its interface since then, but it still works with TL23 and should be easy to port to the latest version. I've attached the file that I used, so feel free to use that as inspiration.

On Sun, 2025-07-27 at 18:07 +0200, Henning Hraban Ramm wrote:
On 27.07.25 at 16:59, andrei@borisov.dev wrote:
The package loads quite a large library, `expl3-code.tex` from the `l3kernel` package, which eats a lot of time during compilation.
What has expl3 to do with ConTeXt?
It's a generic programming layer for TeX, much like pgfkeys/pgfmath (the non-graphics parts of TikZ). It was originally designed for LaTeX, but its authors have put quite a bit of effort into making sure that it's usable in all formats, including ConTeXt.

Thanks,
-- Max

local markdown = require "markdown"

local headings = {
    "chapter", "section", "subsection", "subsubsection", "subsubsubsection",
}

local fmt = string.formatters

local _writer = {
    ellipsis            = [[\dots]],
    code                = [[\type{%s}]],
    space               = [[\space]],
    hard_line_break     = [[\crlf]],
    nbsp                = [[\nobreakspace]],
    strong              = [[\bold{%s}]],
    emphasis            = [[\emph{%s}]],
    inline_html_tag     = [[\type{%s}]],
    block_html_element  = [[\type{%s}]],
    verbatim            = [[\type{%s}]],
    thematic_break      = [[\blackrule]],
    interblocksep       = [[\par]],
    string              = false,
    paragraph           = false,
    plain               = false,
    inline_html_comment = function() end,
    block_html_comment  = function() end,
    document            = [[\starttext %s \stoptext]],
    blockquote          = [[\startquotation %s \stopquotation]],
    link = function(label, url, title, attributes)
        return fmt[ [[\goto{%s}{url(%s)}]] ](label, url)
    end,
    image = function(label, url, title, attributes)
        return fmt[ [=[\externalfigure[%s]]=] ](url)
    end,
    bulletlist = fmt[[\startitemize %s \stopitemize]],
    heading = function(content, level)
        return fmt[ [[\%s{%s}]] ](headings[level], content)
    end,
}

local writer = table.setmetatableindex({}, function(t, k)
    local func = _writer[k]
    if func == false then
        func = fmt["%s"]
    elseif type(func) == "string" then
        func = fmt[func .. "\n"]
    end
    return function(...)
        local args = { ... }
        for i, v in ipairs(args) do
            if type(v) == "table" then
                args[i] = table.concat(v)
            end
        end
        local out = func(table.unpack(args))
        print(out)
        return out
    end
end)

local convert = markdown.reader.new(writer, {
    html = true, shiftHeadings = 1, eagerCache = false
}).finalize_grammar({})

local out = convert[[
# Title

Hello *World*
]]

context.starttext()
context(out)
context.stoptext()
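[Editorial note on the attached file: on the ConTeXt side such a Lua converter could also be exposed as a regular document command via interfaces.implement, instead of printing directly. A hedged sketch, where "markdowninput" is an invented name and convert stands for a converter like the one in the attachment:]

```lua
-- Hypothetical sketch (only runs inside ConTeXt/LuaMetaTeX):
-- define \markdowninput{file.md}, which reads a Markdown file,
-- converts it with the Lua grammar, and pipes the result to TeX.
interfaces.implement {
    name      = "markdowninput",
    arguments = "string",
    actions   = function(filename)
        local source = io.loaddata(filename)
        if source then
            context(convert(source))
        else
            context("missing markdown file: %s", filename)
        end
    end,
}
```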
On 7/29/2025 8:58 AM, Max Chernoff via ntg-context wrote:
Unfortunately, I haven’t found any documentation on how to make your own custom formats, hence this question.
This is a terrible idea, and you definitely should not recommend it to any of your users, but if you're just looking for a hack to speed up your CI/testing, it is possible. The trick is to override one of ConTeXt's core files (which is why this is a terrible idea). "libs-ini.mkxl" is a good choice for this since it's loaded near the end and isn't used by most documents.
So, make a file called "$TEXMFHOME/tex/context/third/libs-ini.mkxl" with the following contents:
\directlua{function pdf.getcreationdate() end} % expl3 bug
\usemodule[expl3-generic]
and then you can run
$ TEXMFHOME='{/PATH/TO/TEXLIVE/texmf-dist/tex/latex-dev/,/PATH/TO/TEXMFHOME/}' context --make
You need to put "latex-dev" into the search path since only the prerelease expl3 versions currently work with ConTeXt. Again, I don't recommend this since it is very likely that a future ConTeXt update will break this, and building expl3 into the format might break core parts of ConTeXt, but it's certainly possible.
a horror story indeed, and that's why we also warn users on the console:

    warning > latex expl stuff code loaded, beware of side effects
    warning > latex unicode data code loaded, it makes no sense at all
    warning > latex backend code code loaded, this can badly interfere

as you love hacking to the level of dangerous, here is an example of where all this stuff is getting weird:

\startluacode
    local t = tex.hashtokens()
    local l = { }
    for i=1,#t do
        if string.find(t[i],"mmm") then
            l[#l+1] = t[i]
        end
    end
    io.savedata("hash.txt",table.concat(l,"\n"))
\stopluacode

i know there is this "\romannumeral" fetish among some latex code writers (as i remember from talks at tex meetings in the past), but this is where it really backfires; just a serialized number would be ok i guess: here it bloats the string space, slows down csname lookups due to detokenization and such, etc .. probably never used anyway, but still ...

and the whole unicode loading stuff that actually is a reason for the load time is also suboptimal (from what i can see, but i don't want to waste time on that as i can't really read expl3); fwiw, i once was in a mail exchange wrt loading that huge tounicode mapping thing in latex (a side track of identifying potential glyph names etc) and seconds of load time could be brought back to nearly nothing by just coding a bit better, but let's not enter a coding discussion here
No one would ever write a ConTeXt-native module using expl3, but if you're using it for a LaTeX package anyway, then the ConTeXt support comes for "free" (from the developer's side; you'll still pay for it in a much slower runtime).
but that time penalty is then known and accepted by the user, as long as we don't get blamed for it (but as you mentioned, one should be aware of side effects: unicode, backend, color, whatever)
It's a generic programming layer for TeX, much like pgfkeys/pgfmath (the non-graphics parts of TikZ). It was originally designed for LaTeX, but its authors have put quite a bit of effort into making sure that it's usable in all formats, including ConTeXt.
ok, but then why load way more than the basic layer (which probably already has enough potential issues; take register management and such, but that can be dealt with i guess)

Hans

(we already have some protection kicking in, but as you now start educating low level hacking/overloading i need to protect even that i guess)
participants (5):
- andrei@borisov.dev
- Hans Hagen
- Hans Hagen
- Henning Hraban Ramm
- Max Chernoff