[NTG-context] Downloading long urls

Aditya Mahajan adityam at umich.edu
Fri Jan 21 18:15:33 CET 2011


On Sun, 16 Jan 2011, Aditya Mahajan wrote:

> Is there a robust way to avoid this problem? One possibility is that in 
> data-sch.lua instead of
>
>    local cleanname = gsub(original,"[^%a%d%.]+","-")
>
> use

     local cleanname = md5.HEX(original) -- gsub(original,"[^%a%d%.]+","-")

This appears to work correctly in my tests. The drawback of this scheme is that 
instead of

    \externalfigure[url ending with .png]

one would have to use

    \externalfigure[url ending with .png][method=png]

But \input 'url ending with .tex' still works.
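
For illustration, here is what the two naming schemes produce for a made-up 
url (run inside ConTeXt's Lua, e.g. \startluacode, where md5.HEX is 
available; plain LuaTeX's md5.sumhexa gives a comparable lowercase digest):

    local original  = "http://example.com/plots/run-01/figure.png" -- hypothetical url

    local sanitized = string.gsub(original,"[^%a%d%.]+","-")
    -- "http-example.com-plots-run-01-figure.png": the .png suffix survives,
    -- so the figure format can still be guessed from the cached file name

    local hashed    = md5.HEX(original)
    -- 32 hex characters with no suffix, hence the need for method=png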

The other drawback is that the filenames in the cache will be gibberish. But 
on the plus side, you can use long urls.
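
A rough sketch of why the hash helps with long urls (again with a 
hypothetical url): the sanitized name grows with the url and can run into 
file name length limits, while the digest is always 32 characters:

    local longurl = "http://example.com/query?" .. string.rep("key=value&",40) -- hypothetical

    print(#string.gsub(longurl,"[^%a%d%.]+","-")) -- roughly as long as the url itself
    print(#md5.HEX(longurl))                      -- always 32, whatever the url length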

Do you think that the drawbacks outweigh the gains?

I need this for the webfilter module, where the url can get pretty long. I 
can always write my own http_get function, but that would mostly be a 
repetition of data-sch.lua.

Aditya

