Yue Wang wrote:
And are getting rid of the web2c step and making debugging/extension much easier the ultimate goals for the C version? I don't doubt it.
we need to keep a working engine, so the luatex project has deliberately chosen a stepwise approach (btw, although nts resulted in a working version of tex, in practice it was way too slow to be useful; a rewrite of tex resulting in a variant many times slower than the current one is not acceptable)
files. But in the TeX world things are quite different: all parts of the program depend on each other, and one cannot understand some parts without having read the previous parts.
another issue is that patching whatever bit of tex code should be done very carefully; this has been proven by adding a backend ... rather strict control over extensions and changes is needed in order to keep tex's reputation for stability up; a one-line change could result in, for instance, a rounding issue (i mention it because we ran into one some time ago) and can have rather drastic consequences; you don't want that to happen with a machinery that has to reproduce a document at the pixel level

a patch that might work on one's machine might eventually result in many problems all over the world, if only because most users don't update frequently and depend on formal distributions (btw the same is true for fonts and other resources ... small changes can have huge consequences)
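to make that rounding remark concrete, here is a minimal sketch (plain lua with made-up numbers, not actual luatex code): tex measures dimensions in scaled points, 1pt = 65536sp, and mapping a position to device pixels comes down to a one-line rounding choice:

    -- illustrative only: a single rounding line can move output by a pixel;
    -- 65536sp per pt and 72.27pt per inch are tex's real units,
    -- the position and resolution are made-up numbers
    local sp_per_pt = 65536
    local dpi       = 300
    local pos_sp    = 3.5 * sp_per_pt              -- a glyph at 3.5pt
    local px        = pos_sp * dpi / (72.27 * sp_per_pt)
    print(math.floor(px))        -- 14: truncating
    print(math.floor(px + 0.5))  -- 15: rounding, a one-pixel shift

changing just that one rounding line in a real engine would shift glyphs in every document processed with it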
Moreover, as I have pointed out, TeX's data structures and memory management are not friendly to newcomers. I started reading TeX: The Program ...
well, lucky us that 99.9% of the users work with tex at the macro level -)
eventually convert all the pascal web code into C in 2010 (Taco changed the road map several times over the years; if I remember right, the first stable version was to be released in 2007, then in 2008, and is now postponed to 2009) because I think many
well, we will probably change the roadmap a few more times; the 2007 and 2008 versions are rather stable, depending on what one does; keep in mind that we aim at pdftex compatibility, which means that regular stuff (not using lua at all) should just work

later in 2009 we will have another formal release, which opens up a bit more (and the reason for opening up via lua is that users will use lua for extensions and not so much start patching the core engine written in pascal or c)
related algorithms. So I think the huge amount of rewriting work is worth it (maybe we can call it metapost 2.0). In fact I don't think it will take too much work, as the rewriting can go on whenever a part is
as said before, the idea is that one uses the lua interface to a rather abstract engine to extend tex; eventually there might even be less and less core code
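for instance, a minimal sketch of the kind of lua-level extension meant here (callback.register, node.traverse_id and texio.write_nl are real luatex interfaces; the glyph count itself is just a hypothetical example):

    -- hook into the paragraph builder from lua instead of patching the
    -- engine: this filter sees each paragraph's node list just before
    -- line breaking and could inspect or rewrite it
    local GLYPH = node.id("glyph")
    callback.register("pre_linebreak_filter", function(head)
      local glyphs = 0
      for n in node.traverse_id(GLYPH, head) do
        glyphs = glyphs + 1
      end
      texio.write_nl("paragraph with " .. glyphs .. " glyphs")
      return true -- keep the node list unchanged
    end)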
moved out of the web file. The Java implementation of TeX, NTS, took about (2001-1998=) three years to release a beta with all the structures changed. (Of course, it is interesting that all
actually, nts still had the same concepts as tex and was fully compatible, which made the inner structures somewhat suboptimal
implementations of TeX except pdfTeX have failed to become the mainstream TeX. Is compatibility the most important issue?)
indeed. compatibility has been a key concept of tex for over 30 years and will remain a key concept; pdftex became a success because there was rather strict control over releases (officially once per year, with the code frozen months before the tex live code freeze) and eventually luatex will end up the same

Hans

-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | fax: 038 477 53 74
www.pragma-ade.com | www.pragma-pod.nl
-----------------------------------------------------------------