Comments get dropped in the token function in gram.c:

    c = SkipSpace();
    if (c == '#') c = SkipComment();

and then SkipComment looks like this:

static int SkipComment(void)
{
    int c;
    while ((c = xxgetc()) != '\n' && c != R_EOF) ;
    if (c == R_EOF) EndOfFile = 2;
    return c;
}

which reads and discards everything up to the end of the line, effectively dropping the comment.

Would it be possible to keep this information somewhere?

The source code says this: 

 *  The function yylex() scans the input, breaking it into
 *  tokens which are then passed to the parser.  The lexical
 *  analyser maintains a symbol table (in a very messy fashion).

so my question is: could we use this symbol table to keep track of, say,
COMMENT tokens?
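
To make this concrete, here is a small illustration at the R level (nothing
specific to the internals): the expression returned by parse() carries no
trace of the comment, so code that only sees the parse tree has nothing
left to recover:

    > e <- parse(text = "x <- 1  # set x to one")
    > deparse(e[[1]])
    [1] "x <- 1"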

Why would I even care about that? I'm writing a package that will
perform syntax highlighting of R source code based on the output of the
parser, and it seems a waste to drop the comments. 

Also, when you print a function to the R console, you don't get the
comments, and some of them might be useful to the user.
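
For example (assuming options(keep.source = FALSE), so that the printed
function is rebuilt by the deparser rather than shown from the retained
source):

    > options(keep.source = FALSE)
    > f <- function(x) {
    +     # add one to x
    +     x + 1
    + }
    > f
    function (x)
    {
        x + 1
    }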

Am I mad if I contemplate looking into this?

Romain

-- 
Romain Francois
Independent R Consultant
+33(0) 6 28 91 30 30
http://romainfrancois.blog.free.fr
