I'm trying to get the SciTE lexers that come with the ConTeXt distribution working, but I'm not having much luck. SciTE certainly processes the ConTeXt start-up script but never does any highlighting. I always get the warning that lpeg is not loaded, but I cannot find any separate lpeg.properties or lpeg.lua file, so I don't know how to get it imported and active to run the lexers. Any suggestions?

Interestingly, the lexers work just fine in Textadept, but I haven't quite figured out how to get them to load automatically. I have to select the lexer manually, but it works fine once I do. I'm inclined to go the Textadept route and build a nice snippets file and Adeptsense support, but I'd like to see things running in SciTE, too.

Right now I'm using Emacs to do my editing, but AUCTeX does a poor job of supporting ConTeXt, and when I asked about future plans on the AUCTeX mailing list I got a snarky, sarcastic reply that "code doesn't change itself." Typical FSF attitude, I see. If I'm going to have to write my own support, Lua is much closer to the languages I've used before than Elisp, so it should be considerably easier. Of course, I'll pass on anything I come up with to the ConTeXt community.

-- Bill Meahan K8QN

"The pessimist complains about the wind; the optimist expects it to change; the realist adjusts the sails." -- William Arthur Ward

This message is digitally signed with an X.509 certificate to prove it is from me and has not been altered since it was sent.
On 19-6-2012 06:54, Bill Meahan wrote:
I'm trying to get the SciTE lexers that come with the ConTeXt distribution working but I'm not having much luck. SciTE certainly processes the ConTeXt start-up script but never does any highlighting.
I always get the warning that lpeg is not loaded but I cannot find any separate lpeg.properties file or lpeg.lua file so I don't know how to get it imported and active to run the lexers. Any suggestions?
Interestingly, the lexers work just fine in Textadept but I haven't quite figured out how to get them to load automatically. I have to manually select the lexer but it works fine once I do.
I'm inclined to go the Textadept route and build a nice snippets file and Adeptsense support but I'd like to see things running in SciTE, too.
You have to change some configuration files, but as that situation seems to change every now and then, I'm waiting to provide extra files for Textadept until it's more stable. (Textadept has no realtime log pane, so I cannot use it here without sacrificing too much convenience.)
Right now I'm using Emacs to do my editing but AUCTeX does a poor job of supporting ConTeXt and when I asked about future plans on the AUCTeX mailing list, I got a snarky, sarcastic reply that "code doesn't change itself." Typical FSF attitude, I see. If I'm going to have to write my own support, Lua is much closer to the languages I've used before than elisp so it should be considerably easier.
Of course, I'll pass on anything I come up with to the ConTeXt community.
concerning scite ... on Windows it boils down to:

- installing SciTE
- copying the scintillua lexer dir to the wscite path
- copying the context/data/scite directory to the wscite dir

On Linux it's a bit more complex, as you need to figure out where SciTE keeps its properties files, and that's sort of hard-coded in the binary (but it does work on Linux).

On OSX it's not working, as there is no lpeg support provided there (unfortunately).

Hans

-----------------------------------------------------------------
Hans Hagen | PRAGMA ADE
Ridderstraat 27 | 8061 GH Hasselt | The Netherlands
tel: 038 477 53 69 | voip: 087 875 68 74 | www.pragma-ade.com | www.pragma-pod.nl
-----------------------------------------------------------------
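The steps above can be sketched as a small shell script. Every path here is an assumption (SciTE's properties directory on Linux, and wherever the scintillua release and ConTeXt tree were unpacked); this is a dry run that only prints the copy commands, so adapt the paths and drop the `echo`s to actually perform the copies.

```shell
# Dry-run sketch of the copy steps described above, for a Linux-style layout.
# All three directories are assumptions; point them at your real trees.
SCITE_DIR=${SCITE_DIR:-/usr/local/share/scite}     # where SciTE keeps its properties files
SCINTILLUA_DIR=${SCINTILLUA_DIR:-$HOME/scintillua} # unpacked scintillua release
CONTEXT_DIR=${CONTEXT_DIR:-$HOME/context}          # ConTeXt distribution root

# 1. put the scintillua lexers next to SciTE's properties files
echo cp -r "$SCINTILLUA_DIR/lexers" "$SCITE_DIR/"

# 2. put the ConTeXt SciTE support files in the same place
echo cp -r "$CONTEXT_DIR/context/data/scite/." "$SCITE_DIR/"
```

On Windows both targets would be the wscite directory instead, per Hans's list.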
On 06/19/2012 04:58, Hans Hagen wrote:
On 19-6-2012 06:54, Bill Meahan wrote:
(Textadept has no realtime log pane so I cannot use it here without sacrificing too much convenience.)
Might be fixable, at least on *nix systems. Textadept 5.4 does allow running processes in a subshell. I'll have to investigate a bit.
on linux it's a bit more complex as you need to figure out where scite keeps its properties files and that's sort of hard coded in the binary
(but it does work on linux)
Finding the properties files is pretty easy (/usr/local/share/scite/); not sure where to put the lexers directory, though. I changed SciTEGlobal.properties per the instructions. There is no lpeg.properties file, and nothing I've found (so far) in the global file to enable lpeg support. Took a scan of the source code and didn't see anything blindingly obvious. I'll keep searching.
on osx it's not working as there is no lpeg support provided there (unfortunately)
I'm on a FreeBSD-9.0 system. Since OSX is based on FreeBSD, perhaps the support is really there and it will just take whatever I find out to make it active. I'll let everyone know what I find. Thanks for the response.

-- Bill Meahan K8QN
On 19-6-2012 18:48, Bill Meahan wrote:
Finding the properties files is pretty easy (/usr/local/share/scite/); not sure where to put the lexers directory, though. I changed SciTEGlobal.properties per the instructions. There is no lpeg.properties file, and nothing I've found (so far) in the global file to enable lpeg support. Took a scan of the source code and didn't see anything blindingly obvious. I'll keep searching.
you need to fetch it from http://foicica.com/scintillua/
On 06/19/2012 13:28, Hans Hagen wrote:
you need to fetch it from
I ended up installing the complete scintillua package and now everything works just fine. I had to modify the scintillua Makefile every place there was a bare scintilla/<something> or scite/<something> to /usr/ports/editors/scite/work/scintilla/<something> or /usr/ports/editors/scite/work/scite/<something>, but that was all. If you wish, you can add FreeBSD-9.0 as a platform it works on.

Thanks for the suggestion. Now on to configuring Textadept to have Emacs-style keybindings, starting the ConTeXt lexers automatically, creating a snippets file, and seeing if I can get a real-time output buffer going.

-- Bill Meahan K8QN
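For what it's worth, the hand-edit described above can also be expressed as a sed pass. This is only a sketch: the work directory is the one from the message, the two sample input lines are invented, and on a real system you would feed the actual scintillua Makefile through the same sed. Note the order of the two substitutions: scite/ is rewritten before scintilla/ so the ports path inserted by the first one is not mangled by the second.

```shell
# Rewrite bare scintilla/... and scite/... references to the FreeBSD ports
# work directory, as described above. Demonstrated on two sample lines; run
# the same sed over the real scintillua Makefile and redirect the output.
WORK=/usr/ports/editors/scite/work
printf 'include scintilla/include\ninclude scite/src\n' \
  | sed -e "s|scite/|$WORK/scite/|g" \
        -e "s|scintilla/|$WORK/scintilla/|g"
```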
On 19-6-2012 22:17, Bill Meahan wrote:
On 06/19/2012 13:28, Hans Hagen wrote:
you need to fetch it from
I ended up installing the complete scintillua package and now everything works just fine. Had to modify the scintillua Makefile every place where there was a bare scintilla/<something> or scite/<something> to /usr/ports/editors/scite/work/scintilla/<something> or /usr/ports/editors/scite/work/scite/<something> but that was all.
If you wish, you can add FreeBSD-9.0 as a platform it works on.
maybe we should have a repository in the garden; anyhow, you can add some info to the SciTE page on the wiki
Thanks for the suggestion. Now on to configuring Textadept to have Emacs-style keybindings, starting the ConTeXt lexers automatically, creating a snippets file and seeing if I can get a real-time output buffer going.
and you can give spell checking a try (you can use mtxrun --script --scite to convert a list with words into one suitable for the context lexers)

Hans
On 06/19/2012 16:43, Hans Hagen wrote:
and you can give spell checking a try (you can use mtxrun --script --scite to convert a list with words into one suitable for the context lexers)
Hans
There is a little module in "Brian's ~/.textadept file" on the Textadept wiki that uses aspell to do spell-checking. I'm going to give that a try. I can also try using SCOWL plus the method you describe; SCOWL is the word list from which the aspell, ispell and hunspell dictionaries are built.

-- Bill Meahan K8QN
On 19-6-2012 23:27, Bill Meahan wrote:
On 06/19/2012 16:43, Hans Hagen wrote:
and you can give spell checking a try (you can use mtxrun --script --scite to convert a list with words into one suitable for the context lexers)
Hans
There is a little module in "Brian's ~/.textadept file" on the Textadept wiki that uses aspell to do spell-checking. I'm going to give that a try. I can also try using SCOWL plus the method you describe; SCOWL is the word list from which the aspell, ispell and hunspell dictionaries are built.
fyi: the context lexers do realtime checking in combination with regular lexing

Hans
On 06/19/2012 17:57, Hans Hagen wrote:
On 19-6-2012 23:27, Bill Meahan wrote:
On 06/19/2012 16:43, Hans Hagen wrote:
and you can give spell checking a try (you can use mtxrun --script --scite to convert a list with words into one suitable for the context lexers)
Hans
fyi: the context lexers do realtime checking in combination with regular lexing
Hans
Tried this, ran into problems.

1. Running the command as shown generates a spell-xx.lua file but complains about a spell-xx.luc file not being present.

2. The lexer does not load the spell-xx.lua file but will find the spell-xx.txt file and try to use that.

3. English words are only found if the 'xx' is 'uk'; 'en', 'us' or 'ca' (Canadian) are not found no matter if % language={en | us | ca} (one choice only, I'm just using shorthand here). uk is one particular form of English with particular spelling rules (e.g. 'our' instead of 'or' in words like 'honour', etc.; there are also some terms proper in UK English that are not considered proper in US English and vice versa). If all variants are not allowed, the generic 'en' ought to be used instead of 'uk', as it could be confusing to non-UK users. Ozzies and Kiwis may not mind but Yanks and Canucks do. :-)

4. Every 3-letter word is indicated as misspelled. This may be a side effect of using the SCOWL word lists, as they only deal with words of more than 3 letters (except for some acronyms). Hence common words like 'was' or 'who' or 'and' are marked as misspelled.

5. Only the first screen-full of text was color-coded. The rest of my document did not even have keyword highlighting. Turning off the spell-checking by removing the 'language=uk' line restored keyword highlighting throughout the whole file.

SciTE 3.1.0 on FreeBSD-9.0-i386. The wordlist(s) tried were from SCOWL 7.1 at http://wordlist.sourceforge.net, formed by

  cat en*80 | sort | uniq >spell-uk.txt

Yes, that is a big wordlist, but decades of doing crosswords has left me with a large vocabulary. ;-)

-- Bill Meahan, Westland, Michigan USA
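Point 4 suggests the word list itself may simply lack short words. A hedged variant of the cat | sort | uniq recipe above that makes the length cutoff explicit would be the following sketch; the inline printf sample stands in for `cat en*80` on a real SCOWL installation, and the word choices are invented for illustration.

```shell
# Variant of the list-building recipe above: merge, dedupe, and keep only
# words of at least 3 characters so the minimum word length is explicit.
# The printf sample stands in for `cat en*80` from a real SCOWL tree.
words=$(printf 'aardvark\nan\nwas\nwho\nhonour\nhonour\n' \
  | sort -u \
  | awk 'length($0) >= 3')
echo "$words"
```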
On 20-6-2012 18:37, Bill Meahan wrote:
1. Running the command as shown generates a spell-xx.lua file but complains about a spell-xx.luc file not being present.
luac compiles the lua file into bytecode (luac comes with Lua and needs to match the Lua version in SciTE)
2. The lexer does not load the spell-xx.lua file but will find the spell-xx.txt file and try to use that.
3. English words are only found if the 'xx' is 'uk'; 'en', 'us' or 'ca' (Canadian) are not found no matter if % language={en | us | ca} (one choice only, I'm just using shorthand here). uk is one particular form of English with particular spelling rules (e.g. 'our' instead of 'or' in words like 'honour', etc.; there are also some terms proper in UK English that are not considered proper in US English and vice versa). If all variants are not allowed, the generic 'en' ought to be used instead of 'uk', as it could be confusing to non-UK users. Ozzies and Kiwis may not mind but Yanks and Canucks do. :-)
hm, do you have spell-ca and spell-en etc. files? there is no hard-coded uk check
4. Every 3-letter word is indicated as misspelled. This may be a side-effect of using the SCOWL word lists as they only deal with words > 3-letters (except for some acronyms). Hence common words like 'was' or 'who' or 'and' are marked as misspelled.
indeed, internally there is a limit of length 2

i can probably do a check for length, i.e. if there are no 3-character words in the list, then no checking of 3-character words either
5. Only the first screen-full of text was color-coded. The rest of my document did not even have keyword highlighting. Turning off the spell-checking by removing the 'language=uk' line restored keyword highlighting throughout the whole file.
weird
Yes, that is a big wordlist but decades of doing crosswords has left me with a large vocabulary. ;-)
Hans
On 20-6-2012 20:02, Hans Hagen wrote:
indeed, internally there is a limit of length 2
i can probably do a check for length i.e. if no 3 character words then also no checking
the next version has

  return {
    ["max"]=40,
    ["min"]=3,
    ["n"]=151493,
    ["words"]={
      ["aardvark"]="aardvark",
      ["aardvarks"]="aardvarks",

so in your case the 3 would be a 4, and that value is checked when coloring.

Hans
On 06/20/2012 14:44, Hans Hagen wrote:
On 20-6-2012 20:02, Hans Hagen wrote:
indeed, internally there is a limit of length 2
i can probably do a check for length i.e. if no 3 character words then also no checking
the next version has
return {
  ["max"]=40,
  ["min"]=3,
  ["n"]=151493,
  ["words"]={
    ["aardvark"]="aardvark",
    ["aardvarks"]="aardvarks",
so in your case the 3 would be a 4 and that value is checked when coloring
Hans
Cool! Many thanks.

-- Bill Meahan, Westland, Michigan USA
On 06/20/2012 14:02, Hans Hagen wrote:
On 20-6-2012 18:37, Bill Meahan wrote:
1. Running the command as shown generates a spell-xx.lua file but complains about a spell-xx.luc file not being present.
luac compiles the lua file into bytecode (luac comes with Lua and needs to match the Lua version in SciTE)
I'll have to check the version in scite. I tried manually compiling with lua 5.1 with no success.
hm, do you have spell-ca and spell-en etc files? there is no hard coded uk check
I only tried it with spell-en.{txt | lua} - the error message said no valid word list loaded. I made a hard link from spell-en.txt to spell-uk.txt and it found that. I changed line 1 from % language=en to % language=uk to see the difference.

-- Bill Meahan, Westland, Michigan USA
On 20-6-2012 21:05, Bill Meahan wrote:
On 06/20/2012 14:02, Hans Hagen wrote:
On 20-6-2012 18:37, Bill Meahan wrote:
1. Running the command as shown generates a spell-xx.lua file but complains about a spell-xx.luc file not being present.
luac compiles the lua file into bytecode (luac comes with Lua and needs to match the Lua version in SciTE)
I'll have to check the version in scite. I tried manually compiling with lua 5.1 with no success.
hm, do you have spell-ca and spell-en etc files? there is no hard coded uk check
I only tried it with spell-en.{txt | lua} - the error message said no valid word list loaded. I made a hard link from spell-en.txt to spell-uk.txt and it found that. I changed line 1 from % language=en to % language=uk to see the difference.
Forget about the txt file ... just the lua file will do (the luc file loads faster, but the difference is negligible in an edit session). I uploaded new scripts that generate a lua file with a bit more info and take care of the 4-char length (hopefully).

Hans
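If one does want the faster-loading bytecode file anyway, the missing step is presumably just a luac invocation. This is a sketch only: the file names come from this thread, the luac binary must match the Lua version linked into SciTE, and the whole thing is guarded so it is a harmless no-op when luac or the word list is absent.

```shell
# Optional: precompile the generated word list with luac. As noted above the
# plain .lua file works fine, so this is purely a load-time optimization.
# Guarded so it degrades gracefully when luac or spell-en.lua is missing.
if command -v luac >/dev/null 2>&1 && [ -f spell-en.lua ]; then
  luac -o spell-en.luc spell-en.lua
  result="compiled spell-en.luc"
else
  result="skipped (luac or spell-en.lua not available)"
fi
echo "$result"
```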
On 06/19/2012 04:58, Hans Hagen wrote:
Interestingly, the lexers work just fine in Textadept but I haven't quite figured out how to get them to load automatically. I have to manually select the lexer but it works fine once I do.
Turns out they don't work as well as I thought they did. To start the ConTeXt lexers automatically, it is only necessary to configure the mime_types.conf file to associate the .tex extension with the desired lexer. When I do that, however, Textadept suddenly stops displaying anything in the buffer, even if I load something other than a .tex file. :-(
You have to change some configuration files but as that situation seems to change every now and then I wait with providing extra files for textadept till it's more stable.
(Textadept has no realtime log pane so I cannot use it here without sacrificing too much convenience.)
Have you looked at the "run" documentation? From that it appears that output from whatever command is run is directed to a new buffer in real time. That's what Emacs does. Anybody got a ctags file for ConTeXt? :-)

-- Bill Meahan K8QN
On 20-6-2012 05:04, Bill Meahan wrote:
Anybody got a ctags file for ConTeXt? :-)
what are ctags
On Wed, Jun 20, 2012 at 4:24 PM, Hans Hagen wrote:
On 20-6-2012 05:04, Bill Meahan wrote:
Anybody got a ctags file for ConTeXt? :-)
what are ctags
CTAGS(1)                    Exuberant Ctags                    CTAGS(1)

NAME
       ctags - Generate tag files for source code

SYNOPSIS
       ctags [options] [file(s)]
       etags [options] [file(s)]

DESCRIPTION
       The ctags and etags programs (hereinafter collectively referred to as ctags, except where distinguished) generate an index (or "tag") file for a variety of language objects found in file(s). This tag file allows these items to be quickly and easily located by a text editor or other utility. A "tag" signifies a language object for which an index entry is available (or, alternatively, the index entry created for that object).

       Alternatively, ctags can generate a cross reference file which lists, in human readable form, information about the various source objects found in a set of language files.

       Tag index files are supported by numerous editors, which allow the user to locate the object associated with a name appearing in a source file and jump to the file and line which defines the name. Those known about at the time of this release are: Vi(1) and its derivatives (e.g. Elvis, Vim, Vile, Lemmy), CRiSP, Emacs, FTE (Folding Text Editor), JED, jEdit, Mined, NEdit (Nirvana Edit), TSE (The SemWare Editor), UltraEdit, WorkSpace, X2, Zeus

       Ctags is capable of generating different kinds of tags for each of many different languages. For a complete list of supported languages, the names by which they are recognized, and the kinds of tags which are generated for each, see the --list-languages and --list-kinds options.

-- luigi
On Wed, 20 Jun 2012, Hans Hagen wrote:
On 20-6-2012 05:04, Bill Meahan wrote:
Anybody got a ctags file for ConTeXt? :-)
what are ctags
They allow you to jump around to specific locations in a file. For example, in vim, pressing CTRL+] on "\in[sec:first]" will jump to the location where sec:first is defined. Basically, a ctags file contains tags plus the file name and line number where those tags are defined. The editor then reads the ctags file and jumps to the appropriate location.

A few years back, I had written a MkII module that generated a ctags file when a TeX file was compiled (attached). It was a hack that redefined a few low-level macros. To provide proper support for ctags, we also need to store the current filename and line number for each reference in the tuc file. Then a module can read the tuc file and write the ctags file in an appropriate format.

Aditya
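For anyone unfamiliar with the format Aditya describes: a tags file is just tab-separated lines of tag name, file name, and an Ex search pattern (or line number) locating the definition. The entry below is purely illustrative; sec:first and chapter.tex are made-up names, not anything a real ConTeXt module emits.

```shell
# Write a one-entry tags file of the kind a ConTeXt module could generate:
# tag name, source file, and a search pattern locating the definition.
# Columns are tab-separated; the names here are invented for illustration.
printf '%s\t%s\t%s\n' 'sec:first' 'chapter.tex' '/\\section[sec:first]/' > tags
cat tags
```

With such a file on the editor's tags path, pressing CTRL+] on sec:first in vim would jump to the matching line in chapter.tex.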
On 06/20/2012 11:24, Aditya Mahajan wrote:
On Wed, 20 Jun 2012, Hans Hagen wrote:
On 20-6-2012 05:04, Bill Meahan wrote:
Anybody got a ctags file for ConTeXt? :-)
what are ctags
They allow you to jump around to specific locations in a file.
SciTE also uses them to generate auto-completion/insertion of language commands, so a ctags file for ConTeXt itself would be useful.

-- Bill Meahan, Westland, Michigan USA
participants (4)

- Aditya Mahajan
- Bill Meahan
- Hans Hagen
- luigi scarso