[Dev-luatex] Benchmark.

David Kastrup dak at gnu.org
Tue Apr 3 00:38:41 CEST 2007


Taco Hoekwater <taco at elvenkind.com> writes:

> Hi David,
>
> David Kastrup wrote:
>>> What is wrong with that tex.tex file is a mystery. I have not seen
>>> such slowness here and do not (yet) comprehend what is going on. Is
>>> there any particular part where it hesitates, or is it just overall
>>> much slower?
>>
>> No, it just seems to be going slowly overall, so it can't be
>> kpathsea, I guess.  The file is just generated by
>
> After some testing with a profiled binary, it turned out that LuaTeX
> spends nearly 90% of its run time inside the get_node() function
> when it is processing tex.tex completely (535 pages), but only
> 10% if it runs only the first 20 or so pages.
>
> Since get_node() is more or less tex's internal version of malloc(),
> I deduced that there was likely an internal memory leak (an unfreed
> node) that makes it harder for get_node() to find a new one when it
> is asked.
>
> Running a test file with \tracingstats=2 shows the variable memory
> usage gradually going up in both luatex and aleph, but not at all
> in pdftex, so the leak probably comes from omega. That makes the
> 'dir_node' the most likely suspect. More later.

Wow.  I do the first stupid thing that comes to mind, and hit upon a
problem.
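
To spell out Taco's deduction: get_node() hands out TeX's
variable-size nodes from its own pool, so a node leaked on every page
shows up as the steadily climbing "variable memory in use" figure
that \tracingstats=2 prints.  A toy model of just that bookkeeping
(all sizes and counts invented, none of this is real TeX code):

  #include <stdio.h>

  /* If each page builds some nodes and frees all but one (say, a
   * dir_node), "variable memory in use" climbs linearly with the
   * page count.  Invented numbers, for illustration only. */
  int main(void)
  {
      long var_used = 0;              /* words of variable memory in use */
      const int node_size = 4;        /* words per node, made up         */

      for (int page = 1; page <= 535; page++) {
          var_used += 20 * node_size; /* nodes built while typesetting   */
          var_used -= 19 * node_size; /* all freed again -- except one   */
          if (page % 100 == 0)
              printf("page %3d: variable memory in use = %ld words\n",
                     page, var_used);
      }
      return 0;
  }

One leaked node per page is invisible over 20 pages, but over 535 the
figure climbs without bound, which is the gradual rise Taco observed.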

I am not sure that the normal effect of a memory leak would be to make
it "harder to find a new node": after all, when there are no free
nodes at all, allocation is fast, since fresh memory is simply carved
off the pool without searching any list.  What can happen, though, is
that repeatedly a large node gets freed, a smaller node gets allocated
out of the freed block, and the next allocation of a large node has to
look elsewhere because the remainder is too small.  Unless adjacent
small nodes get coalesced when freed, this could keep allocating more
memory without ever being able to reuse the memory already on the free
list.
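
Here is a minimal simulation of that scenario, assuming a LIFO free
list with first-fit search and no coalescing.  The names get_node()
and free_node() follow tex.web, but everything else is invented; the
real routine is a rover-based variant, so treat this as a sketch of
the failure mode, not of TeX's code:

  #include <stdio.h>

  #define MAX_FREE 10000
  #define LARGE 10                  /* words in a "large" node, made up */
  #define SMALL 2                   /* words in a "small" node, made up */

  static int free_size[MAX_FREE];   /* LIFO stack of free-block sizes */
  static int free_count = 0;
  static long pool = 0;             /* words ever drawn from fresh memory */

  /* First fit, starting from the most recently freed block; carve
   * the request out of the first block that is big enough. */
  static void get_node(int n)
  {
      for (int i = free_count - 1; i >= 0; i--) {
          if (free_size[i] >= n) {
              free_size[i] -= n;
              if (free_size[i] == 0) {        /* block used up: drop it */
                  for (int j = i; j < free_count - 1; j++)
                      free_size[j] = free_size[j + 1];
                  free_count--;
              }
              return;
          }
      }
      pool += n;                    /* nothing fits: grow the pool */
  }

  /* Freed blocks go back on the list as-is; adjacent free blocks
   * are never merged into one larger block. */
  static void free_node(int n)
  {
      if (free_count < MAX_FREE)
          free_size[free_count++] = n;
  }

  int main(void)
  {
      get_node(LARGE);              /* start out holding one large node */
      for (int round = 0; round < 1000; round++) {
          free_node(LARGE);         /* the large node comes free...       */
          get_node(SMALL);          /* ...a small node is carved from it, */
          get_node(LARGE);          /* ...and its 8-word rest can never
                                       serve the next large request.      */
      }
      long idle = 0;
      for (int i = 0; i < free_count; i++)
          idle += free_size[i];
      printf("pool grew to %ld words; %ld words idle in %d fragments\n",
             pool, idle, free_count);
      return 0;
  }

After 1000 rounds the pool has grown by 10000 words while 8000 words
sit idle in 1000 fragments, and every failed large request scans the
whole fragment list first, so time spent in get_node() grows
quadratically with the length of the run -- which would also fit the
10% versus 90% profile figures quoted above.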

I am afraid that I am too tired to dig into the allocation routines of
TeX right now.

-- 
David Kastrup, Kriemhildstr. 15, 44793 Bochum

