Can anyone connect ConTeXt to R or Python?
Hi All, I am very interested in creating PDF documents using ConTeXt, with calculations done in R or Python and then added into the document. If you were using traditional LaTeX, this could be accomplished through R Markdown or Sweave; is there any way to connect these to ConTeXt?

Best,
Claire
On 2018-10-05 at 16:55, Kelley, Claire wrote:
Hi All, I am very interested in creating PDF documents using ConTeXt, with calculations done in R or Python and then added into the document.
If you were using traditional LaTeX, this could be accomplished through R Markdown or Sweave; is there any way to connect these to ConTeXt?
Hi Claire,

you should use Aditya's filter module to call external programs: https://github.com/adityam/filter

There's a module for R in the distribution (see https://source.contextgarden.net/tex/context/modules/mkii/m-r.mkii), but it's only for MkII and AFAIK superseded by the filter module.

Greetlings, Hraban
---
https://www.fiee.net
http://wiki.contextgarden.net
https://www.dreiviertelhaus.de
GPG Key ID 1C9B22FD
On 10/5/18 4:55 PM, Kelley, Claire wrote:
Hi All, I am very interested in creating PDF documents using ConTeXt, with calculations done in R or Python and then added into the document.
Hi Claire, ConTeXt MkIV comes with Lua inside. Would Lua fit your purposes?
If you were using traditional LaTeX, this could be accomplished through R Markdown or Sweave; is there any way to connect these to ConTeXt?
I'm afraid that my background is in the humanities. If you need R, I wonder whether this can be easily done with ConTeXt.

Just in case it helps,
Pablo
--
http://www.ousia.tk
Thank you for all the answers so far!
I've gathered that I need to use the filter module to call R externally.
What I am still most interested in is how to get a single value from R into the ConTeXt code (like you would do with \Sexpr{} in Sweave). The use case: I have some complicated TikZ code that makes a fancy matrix, and I want to fill it in with numbers that R reads from a CSV file and processes.
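One low-tech way to get a single value into the document, along the lines of \Sexpr{}, is to have the external program write a small .tex file defining a macro, and \input that file from the ConTeXt document. A minimal sketch of the idea, in Python rather than R; the column name and macro name are made up for illustration:

```python
import csv
import io

def write_value_macro(csv_text, column, macro_name):
    """Compute one value from CSV data and return it as a TeX macro definition.

    The returned string would be saved to a small .tex file and pulled in
    with \input, after which the macro expands to the number in the document.
    """
    values = [float(row[column])
              for row in csv.DictReader(io.StringIO(csv_text))]
    # Format the mean with two decimals; adjust to taste.
    return "\\def\\%s{%.2f}" % (macro_name, sum(values) / len(values))

# Example: a CSV with one "score" column becomes \def\meanscore{2.00}
snippet = write_value_macro("score\n1\n2\n3\n", "score", "meanscore")
```

The same file-generating step could just as well be written in R (e.g. with sprintf and write); the point is only that the computed value crosses into ConTeXt as a generated macro file.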
On 10/5/2018 8:00 PM, Kelley, Claire wrote:
Thank you for all the answers so far!
I've gathered that I need to use the filter module to call R externally.
What I am still most interested in is how to get a single value from R into the ConTeXt code (like \Sexpr{} in Sweave). I have some complicated TikZ code that makes a fancy matrix; I want to fill it in with numbers that R reads from a CSV file and processes.

Can you give an example? Do you only need R for reading the CSV?
Hans
-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | www.pragma-ade.nl | www.pragma-pod.nl
-----------------------------------------------------------------
On Fri, 5 Oct 2018 20:07:27 +0200, Hans Hagen wrote:
I want to fill it in with numbers that R reads from a CSV file and processes.

Can you give an example? Do you only need R for reading the CSV?
The processing is done by R. Of course, it *could* be done in Lua, but this all depends on what processing is needed, for R contains many libraries of routines (but so *can* Lua).

For the list: this discussion between Hans and myself follows a presentation that I made at the last ConTeXt Meeting regarding handling large data files in Lua (+MetaPost). We are presently working on an article for the ConTeXt Journal to document this.

Alan

P.S. Lua is rather magical...
Again, thank you everyone for your incredibly helpful advice and responsiveness!
The workflow I am envisioning is:
1) Use R to read in a CSV file, filter it according to some rules, and then use propensity score matching (based on R packages) to create a model.
2) The final data output is a table with standardized coefficients and mean values.
3) This final output is then used to create a table with ConTeXt/TikZ in the very specific format that I need for the final report.
I can do this using Sweave, but I need to be able to use ConTeXt for some additional features (in particular the tagging/back-end structure).
Best,
Claire
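Ignoring the statistics, the three steps above reduce to: read, filter, aggregate, and hand a small table to the typesetter. A toy sketch of that shape, with the propensity-score matching replaced by a trivial placeholder and invented column names:

```python
import csv
import io
from statistics import mean

def summarize(csv_text, min_age=18):
    """Steps 1-2 of the pipeline, drastically simplified.

    Step 1: read the raw CSV and filter rows by a rule.
    Step 2: aggregate to a small table of mean outcomes per group.
    A real analysis would fit a model with R packages here instead.
    """
    rows = [r for r in csv.DictReader(io.StringIO(csv_text))
            if int(r["age"]) >= min_age]
    groups = {}
    for r in rows:
        groups.setdefault(r["group"], []).append(float(r["outcome"]))
    return {g: round(mean(v), 3) for g, v in groups.items()}

# Step 3 would save this table (e.g. as CSV) for ConTeXt/TikZ to typeset.
data = ("age,group,outcome\n"
        "20,treated,1.0\n"
        "30,treated,2.0\n"
        "17,control,9.0\n"
        "40,control,3.0\n")
table = summarize(data)
```

The underage control row is filtered out in step 1, so only the remaining rows contribute to the per-group means.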
On Sat, 6 Oct 2018, Kelley, Claire wrote:
1) Use R to read in a CSV file, filter according to some rules, and then use propensity score matching (based on R packages) to create a model.
2) The final data output is a table with standardized coefficients and mean values.
3) This final output is then used to create a table using ConTeXt/TikZ in the very specific format that I need for the final report.
Is the final post-processed data a table that can be saved to a CSV file? If so, the simplest solution is to write R code (inside a \startR ... \stopR environment) that does the post-processing and saves the data as a CSV file. Both ConTeXt and TikZ can easily read a CSV table and format it as desired.

---

I also want to take this opportunity to express my views on interfacing with external programs. The file-based interaction provided by the filter module is okay for small projects, but it is not ideal. Slightly better is to use pipes (popen to a REPL) or FFI (e.g. https://adityam.github.io/context-blog/post/interfacing-with-julia), but neither of these is easy to implement, and it needs to be done on a per-language basis. Henri Menke had a TUGboat article on this as well.

In my opinion, a better long-term option is to write a Jupyter client in Lua that can be called by ConTeXt. Then we can easily interface with all languages that provide a Jupyter kernel (https://github.com/jupyter/jupyter/wiki/Jupyter-kernels).

The interface of a jupyter client is documented at https://jupyter-client.readthedocs.io/en/stable/index.html. It seems relatively straightforward (send a JSON message and receive a JSON message). Translating the JSON messages to ConTeXt should also be easy. Is there anyone who wants to play around with implementing this?

Aditya
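For reference, the wire format described in the jupyter-client docs is plain JSON. A hedged sketch of building an execute_request message (the field names follow the messaging spec; the ZeroMQ transport and HMAC signing that a real client also needs are omitted):

```python
import json
import uuid
from datetime import datetime, timezone

def make_execute_request(code, session):
    """Build a minimal execute_request message per the Jupyter messaging spec.

    A real client wraps this in a signed multipart ZeroMQ frame; here we
    only construct the JSON payload itself.
    """
    header = {
        "msg_id": uuid.uuid4().hex,
        "session": session,
        "msg_type": "execute_request",
        "version": "5.3",
        "date": datetime.now(timezone.utc).isoformat(),
    }
    content = {
        "code": code,          # e.g. a chunk of R for an R kernel
        "silent": False,
        "store_history": True,
    }
    return {"header": header, "parent_header": {},
            "metadata": {}, "content": content}

# The whole message serializes to plain JSON:
wire = json.dumps(make_execute_request("summary(df)", session="ctx"))
```

The kernel's reply (execute_result) comes back in the same JSON envelope, which is what makes a language-agnostic Lua client plausible.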
I also just got an incredibly helpful answer over on Stack Overflow: https://tex.stackexchange.com/questions/453868/how-can-i-use-context-and-r-t... which shows how to use Lua and the filter module together.

Claire
On 2018-10-06 at 19:06, Aditya Mahajan wrote:
The interface of a jupyter client is documented at https://jupyter-client.readthedocs.io/en/stable/index.html. It seems relatively straightforward (send a JSON message and receive a JSON message). Translating the JSON messages to ConTeXt should also be easy. Is there anyone who wants to play around with implementing this?
ConTeXt can read JSON files, see util-jsn.lua.

Wolfgang
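To illustrate the translation step: extracting a printable result from a Jupyter-style reply is a couple of dictionary lookups. The field names (header.msg_type, content.data) are the documented ones, but the reply below is invented; in ConTeXt the same JSON could be turned into a Lua table via util-jsn.lua.

```python
import json

# An invented execute_result reply in the documented message shape.
reply_json = """
{
  "header": {"msg_type": "execute_result"},
  "content": {"data": {"text/plain": "42"}, "execution_count": 1}
}
"""

def plain_text_result(raw):
    """Return the text/plain payload of an execute_result, else None."""
    msg = json.loads(raw)
    if msg["header"]["msg_type"] != "execute_result":
        return None  # status updates, streams, etc. carry no result
    return msg["content"]["data"].get("text/plain")
```

Whatever comes back as text/plain is what would be typeset in place of something like \Sexpr{}.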
On Sat, 6 Oct 2018 13:06:18 -0400 (EDT), Aditya Mahajan wrote:
In my opinion, a better long-term option is to write a Jupyter client in Lua that can be called by ConTeXt. Then we can easily interface with all languages that provide a Jupyter kernel (https://github.com/jupyter/jupyter/wiki/Jupyter-kernels).
Jupyter runs Python code. Have you ever tried doing any real heavy data analysis using Jupyter? My experience is that it chokes on large data sets... So why write Lua code to call a Jupyter kernel running Python? Would it not make more sense to develop code directly in Lua in this case?

The one thing that Python (and Jupyter) brings, or R for that matter, is libraries of calculation routines. These can be quite sophisticated, some efficient, and some not so efficient. My approach has always been to write my own routines or to adapt algorithms; at least then I know what the calculation is actually doing. Of course, this means that I spend time redoing what might have been done elsewhere, but the variety of routines that I actually use is rather small.

Our experience with Lua and with MetaPost is that one can achieve HUGE differences in efficiency through careful programming: factors of 2-3 or even orders of magnitude. (Hans can usually succeed in speeding up my Lua code.) Sometimes surprising things make huge differences (never surprising once one understands what is happening). One can hope that Python and R (and other) developers are efficiency-minded programmers, but this is not always the case.

Alan
On Sat, 6 Oct 2018, Alan Braslau wrote:
Jupyter runs Python code. Have you ever tried doing any real heavy data analysis using Jupyter? My experience is that it chokes on large data sets... So why write Lua code to call a Jupyter kernel running Python?
That's why I want to write a Jupyter client in Lua (so that there is no Python code involved).
Would it not make more sense to develop code directly in Lua in this case?
Yes, but let me try to explain. When creating homework assignments for a course I teach, I often have documents like the following:

\starttext
Consider an LTI system with the transfer function
\placeformula[eq:sys]
\startformula
  H(s) = \frac{1}{s^2 + 2s + 2}
\stopformula
The step response of the system is shown in Figure \in[fig:plot].
Note that the step response settles to a final value of $0.5$.
\startplacefigure
  [title={Step response of the LTI system described in \in[eq:sys]},
   reference=fig:plot]
\externalfigure[step-response.pdf]
\stopplacefigure
\stoptext

What I want to do is to be able to change the transfer function (given in the formula) and regenerate the plot. Something like the following:

\defineLTIsystem[example][num={1}, den={1,2,2}]
\starttext
Consider an LTI system with the transfer function
\placeformula[eq:sys]
\startformula
  H(s) = \TF[example]
\stopformula
The step response of the system is shown in Figure \in[fig:plot].
Note that the step response settles to a final value of
$\calculate{lim(s*TF[example], s, 0)}$.
\startplacefigure
  [title={Step response of the LTI system described in \in[eq:sys]},
   reference=fig:plot]
\STEP[example]
\stopplacefigure
\stoptext

Now, it is possible to write the code to generate the step response in Lua/MetaPost. But it quickly gets tiring, and one essentially ends up creating a domain-specific computational library in Lua.

An alternative approach is to use an existing library written in some other programming language (say Matlab or R or Julia or whatever). It is possible to do so using the `filter` module (plus some Lua code). In this case, the user simply calls "context filename" and ConTeXt macros take care of calling an external program (say Matlab) to generate the plot and do the algebraic calculations.

Another approach, which is taken by programs like Sweave and knitr, is to first run the document through R (or some other programming language). These are typically written for LaTeX, so code between \begin{Rcode} ... \end{Rcode} and in \Rexp{...} (or something similar; I haven't used R in a decade) is treated as R code and everything else is treated as comments. The evaluated file can then be run through `latex` or `context` or any typesetting program. The drawback of this approach is that not all programming languages have such a program.

Now, what I want to do (at some stage) is to extend the functionality of the filter module to call Jupyter kernels. So, instead of passing messages between ConTeXt and the external program through text files, the messages can be passed as JSON objects (using sockets, I believe). The advantage is that you avoid multiple restarts of the external program (which is what the filter module currently does).
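The "avoid multiple restarts" point can be illustrated without any Jupyter machinery: keep one worker process alive and exchange JSON messages over its pipes. A sketch in Python, where the worker is a toy stand-in for a real kernel:

```python
import json
import subprocess
import sys

# Toy worker: reads one JSON request per line, evaluates it, replies in JSON.
# A real kernel would be safer and richer than a bare eval().
WORKER_SOURCE = r"""
import json, sys
for line in sys.stdin:
    request = json.loads(line)
    result = eval(request["code"], {})
    print(json.dumps({"id": request["id"], "result": result}), flush=True)
"""

class PersistentWorker:
    """Keep a single worker process alive across many evaluations."""

    def __init__(self):
        self.proc = subprocess.Popen(
            [sys.executable, "-c", WORKER_SOURCE],
            stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
        self.counter = 0

    def evaluate(self, code):
        # One round trip: write a JSON line, read a JSON line. The worker
        # keeps its state and is never restarted between snippets.
        self.counter += 1
        self.proc.stdin.write(json.dumps(
            {"id": self.counter, "code": code}) + "\n")
        self.proc.stdin.flush()
        return json.loads(self.proc.stdout.readline())["result"]

    def close(self):
        self.proc.stdin.close()  # EOF ends the worker's read loop
        self.proc.wait()
```

A socket-based kernel client would follow the same pattern, just with a network transport instead of stdin/stdout pipes.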
The one thing that Python (and Jupyter) brings, or R for that matter, is libraries of calculation routines. These can be quite sophisticated, some efficient, and some not so efficient. My approach has always been to write my own routines or to adapt algorithms; at least then I know what the calculation is actually doing. Of course, this means that I spend time redoing what might have been done elsewhere, but the variety of routines that I actually use is rather small.
If you have the time (and the expertise), then this is a good strategy. For me, this is not always the case.

Aditya
On 10/6/2018 10:19 PM, Aditya Mahajan wrote:

Now, what I want to do (at some stage) is to extend the functionality of the filter module to call Jupyter kernels. So, instead of passing messages between ConTeXt and the external program through text files, the messages can be passed as JSON objects (using sockets, I believe). The advantage is that you avoid multiple restarts of the external program (which is what the filter module currently does).

But that assumes that the listener is not restarting the program, i.e. that e.g. R is listening and sending, right?

Hans
On 2018-10-06 02:06 PM, Aditya Mahajan wrote:
In my opinion, a better long-term option is to write a Jupyter client in Lua that can be called by ConTeXt. Then we can easily interface with all languages that provide a Jupyter kernel.
I have recently come across the SciLua project (http://scilua.org/index.html), which has a built-in LuaJIT client for Rserve (http://scilua.org/rclient.html) and may be relevant to the discussion. Although it certainly isn't as general as something like Jupyter, leaning on SciLua may be an easier means of getting access to scientific computation with R and basic numerical methods. I am personally planning to look into switching from filter to SciLua if it means that I can get more efficient data transfer between R and ConTeXt. But I'm a ConTeXt and Lua neophyte, so I doubt my personal efforts will translate into something more general like a module.

Stan
On Mon, 8 Oct 2018 15:58:25 -0300, Stanislav Sokolenko wrote:

I have recently come across the SciLua project (http://scilua.org/index.html), which has a built-in LuaJIT client for Rserve (http://scilua.org/rclient.html) and may be relevant to the discussion.
With some work, the SciLua routines (which depend on OpenBLAS) could probably be made to work as a module under LuaTeX, with no need for an external client.

Alan
participants (8)
- Aditya Mahajan
- Alan Braslau
- Hans Hagen
- Henning Hraban Ramm
- Kelley, Claire
- Pablo Rodriguez
- Stanislav Sokolenko
- Wolfgang Schuster