[NTG-context] Xml filtering in Lua

Thomas A. Schmitz thomas.schmitz at uni-bonn.de
Sun Nov 20 19:19:18 CET 2022

On 11/17/22 11:04, Hans Hagen via ntg-context wrote:
> so, basically you collect data and use it later ... for huge datasets 
> that saves some time
> if you have only chapters to process you can even decide to flush in 
> that function

Alright, I'm making very good progress here, but right now I'm stumbling 
over a problem I can't solve. It's difficult to make a minimal example, 
so please bear with some snippets.

I load data from an external xml file (not the one I'm processing) and 
store some of it in a lua table.

local examples = lxml.load ("my_examples", "examples.xml")
local sets     = lxml.load ("my_sets", "example_sets.xml")

local all_examples = {}

for e in xml.collected (examples, "/examples/chapter/example") do
	local ex_id = e.at.id
	all_examples [ex_id] = e
end
This works as expected: with print (inspect (all_examples)), I can see 
that the table looks the way I expect.
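(For anyone following along without the xml files: the caching pattern 
above boils down to the following self-contained sketch, which runs in 
plain Lua with mock node tables instead of real lxml nodes — the ids and 
content fields are made up for illustration.)

```lua
-- Sketch of the cache-by-id pattern, using mock nodes shaped like
-- lxml nodes (each has an .at table holding its attributes).
local all_examples = {}

local mock_nodes = {
    { at = { id = "ex:one" }, content = "first example" },
    { at = { id = "ex:two" }, content = "second example" },
}

-- store every node under its id attribute, as in the loop above
for _, e in ipairs(mock_nodes) do
    all_examples[e.at.id] = e
end

-- later retrieval by key
print(all_examples["ex:two"].content)  -- second example
```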

I then retrieve some entries of the table by their key:

local current_example = all_examples [key]

Again, this appears to work; when I have a

lxml.displayverbatim (current_example)

in my file, the xml is typeset and looks the way I expect it to. 
However, whatever I try, I get the serialized xml typeset, with all the 
<tags> verbatim instead of processed. Here's what I've tried:

\startxmlsetups xml:chapter:example
	\xmlfirst {#1} {.} \par
\stopxmlsetups

lxml.command (current_example, ".", "xml:chapter:example")


xml.sprint (lxml.id (current_example))


local problem = xml.text (lxml.id (current_example), "./[text()]")
xml.sprint (problem)

I expected at least the last version to retrieve the plain text, but 
again it is typeset with the tags included.

So I guess my question is: how can I tell ConTeXt to parse my xml as xml 
and apply the proper setups instead of serializing it?

All best wishes

