Taco (and Hans):
Of course there are a few things not implemented yet (in particular, we plan to add advanced support for in-line encoding/transliteration changes by making it possible to intercept the token builder).
Ok, thank you. I've begun some tests combining OCPs and Lua. The tex file is:

=============================
\documentclass{book}

\def\lua#1{\directlua0\expandafter{\detokenize{#1}}}

\ocp\luaocp=lua
\ocplist\lualist=\addbeforeocplist 1 \luaocp \nullocplist

\def\dolua#1{\lua{
  print('(#1)');
  tex.sprint('(#1)');
}}

\begin{document}

//{\pushocplist\lualist text and text}//

\end{document}
=============================

and the otp file (lua.otp) is:

=============================
input:  1;
output: 1;

expressions:

. => "\dolua{" \1 "}";
=============================

which just wraps every letter inside \dolua{ }. However, the letters are silently ignored (print() does write them to the console, though, so the Lua side is working). I presume the problem is that tex.sprint() behaves like a new line in the tex file, while OCPs are applied after expansion, so by the time \dolua is executed it is too late and the line goes nowhere.

I'm aware the road map says the third stage will implement "token filtering (aka Translation Processes)". I'm just reporting my tests, in case they are useful.

Javier

PS. By the way, I'm using luatex on Windows XP with TeXLive 2005 and a modified latex.ltx.
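PS2. For comparison, a minimal sketch of doing the same per-letter wrapping from the Lua side instead of through an OCP, assuming the process_input_buffer callback described in the LuaTeX manual. Unlike an OCP, it sees the raw input line before tokenization (control sequences included, so as written it would also mangle the preamble), which makes it an illustration rather than a replacement:

=============================
-- sketch only: rewrite each input line before TeX tokenizes it,
-- wrapping every letter in \dolua{...} like the lua.otp expression
callback.register("process_input_buffer",
  function(line)
    -- (%a) captures one letter; %1 reinserts it inside the wrapper
    return (line:gsub("(%a)", "\\dolua{%1}"))
  end)
=============================

A real filter along these lines would have to skip control sequences and be switched on and off around the text to be processed.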