> I am working on creating a Domain Specific Language that will
> solve partial differential equations. The user will enter an equation in
> pseudo-code format.
[Grammar cut]

> It was decided to use Spark as the base for the compiler. To use Spark,
> all that is needed is to add a new class with the above BNF to parse
> the equation. There is a good paper called "Charming Python" that
> was used as a template. The problem I am having is: how do I code the
> non-terminals of the BNF?

Hi Servando,

This sort of stuff is usually broken down into two parts:

    1. Lexing
    2. Parsing

and what might be confusing at first is that the word "parsing" is often
casually used to mean both steps.

At the moment, it looks like you're working on the lexing part. The
non-terminals are not part of the lexical tokenizer, but part of the
context-free grammar. If it helps: lexers handle the things that can be
described by regular expressions, and parsers handle everything else.
*grin*

In Spark terminology, you'll want to write one of the ASTBuilders and
define 'p_*' rules for each particular nonterminal. In the "Charming
Python: Parsing in Python with Spark" article you mentioned earlier,

    http://gnosis.cx/publish/programming/charming_python_b6.html

take a look at the MarkupBuilder class: that's the part that handles the
non-terminals.

Best of wishes!
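P.S. To make the lexing/parsing split concrete, here is a rough, untested
sketch of what the two classes might look like with the Python 2-era
Spark module. The class names (EquationScanner, EquationParser) and the
toy grammar are made up for illustration; they are not your BNF and not
taken from the article, so treat this as a sketch of the conventions
rather than working code for your language. Each 't_*' docstring is the
regular expression that lexer rule matches; each nonterminal becomes one
or more 'p_*' methods whose docstrings carry the BNF productions:

    from spark import GenericScanner, GenericParser

    class Token:
        # Minimal token holder.  Spark compares tokens against the terminal
        # names in the grammar, so comparing on self.type is the usual trick.
        def __init__(self, type, attr=None):
            self.type = type
            self.attr = attr
        def __cmp__(self, other):
            return cmp(self.type, other)
        def __repr__(self):
            return self.type

    class EquationScanner(GenericScanner):
        # Lexing: the docstring of each t_* method is a regular expression;
        # the method body decides what to do with the matched text.
        def tokenize(self, input):
            self.rv = []
            GenericScanner.tokenize(self, input)
            return self.rv
        def t_whitespace(self, s):
            r' \s+ '
            pass
        def t_name(self, s):
            r' [A-Za-z_][A-Za-z_0-9]* '
            self.rv.append(Token('name', s))
        def t_number(self, s):
            r' \d+ '
            self.rv.append(Token('number', s))
        def t_operator(self, s):
            r' [-+*/=()] '
            self.rv.append(Token(s))

    class EquationParser(GenericParser):
        # Parsing: each nonterminal gets one or more p_* methods; the
        # docstrings hold the BNF productions, and 'args' is the list of
        # values matched on the right-hand side of the rule.
        def __init__(self, start='equation'):
            GenericParser.__init__(self, start)
        def p_equation(self, args):
            ' equation ::= expr = expr '
            return ('=', args[0], args[2])
        def p_expr_plus(self, args):
            ' expr ::= expr + term '
            return ('+', args[0], args[2])
        def p_expr_term(self, args):
            ' expr ::= term '
            return args[0]
        def p_term_name(self, args):
            ' term ::= name '
            return args[0].attr
        def p_term_number(self, args):
            ' term ::= number '
            return args[0].attr

    # Usage (again, just a sketch): lex the input, then hand the token
    # list to the parser.
    tokens = EquationScanner().tokenize('u = a + b')
    print EquationParser().parse(tokens)

If I'm remembering Spark's conventions right, each p_* method returns
whatever value you want for that nonterminal, so you can build your own
tree by hand as above; the GenericASTBuilder route (which I believe is
what MarkupBuilder in the article uses) builds the syntax tree for you,
so there you only have to supply the grammar in the docstrings.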