Bernd, can you please say a little more? I think it is fine for this list to carry some description of commercial solutions that satisfy a request formulated on the list.
Is there a product at Basis Tech that provides a compound analyzer with a large dictionary of decomposed German compounds? If so, for which domain? A Google search (I wonder whether it is politically correct not to use yours ;-)) shows that a fair amount of work has been done in this direction (e.g. matching Gärten to Garten), but a precise answer to this question would be more helpful!

paul

On 12 Apr 2012, at 12:46, Bernd Fehling wrote:
>
> You might have a look at:
> http://www.basistech.com/lucene/
>
>
> On 12.04.2012 11:52, Michael Ludwig wrote:
>> Given an input of "Windjacke" (probably "wind jacket" in English), I'd
>> like the code that prepares the data for the index (tokenizer etc.) to
>> understand that this is a "Jacke" ("jacket"), so that a query for "Jacke"
>> would include the "Windjacke" document in its result set.
>>
>> It appears to me that such an analysis requires a dictionary-backed
>> approach, which doesn't have to be perfect at all; a list of the most
>> common 2000 words would probably do the job and fulfil a criterion of
>> reasonable usefulness.
>>
>> Do you know of any implementation techniques or working implementations
>> for this kind of lexical analysis of German-language data? (Or other
>> languages, for that matter?) What are they, and where can I find them?
>>
>> I'm sure there is something out there (commercial or free), because I've
>> seen lots of engines grokking German and the way it builds words.
>>
>> Failing that, what are the proper terms to refer to these techniques, so
>> that one can search for them more successfully?
>>
>> Michael
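For reference, Lucene's analysis modules already ship a dictionary-backed compound splitter, exposed in Solr as solr.DictionaryCompoundWordTokenFilterFactory. What follows is only a minimal sketch of the underlying filter; package names and constructor signatures vary across Lucene versions (this roughly follows the 3.x API), and the tiny inline dictionary, the class name DecompoundDemo, and the expected output are illustrative assumptions rather than verified results.

import java.io.StringReader;
import java.util.Arrays;

import org.apache.lucene.analysis.CharArraySet;
import org.apache.lucene.analysis.TokenStream;
import org.apache.lucene.analysis.compound.DictionaryCompoundWordTokenFilter;
import org.apache.lucene.analysis.standard.StandardTokenizer;
import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
import org.apache.lucene.util.Version;

public class DecompoundDemo {
    public static void main(String[] args) throws Exception {
        // Tiny stand-in dictionary of German base words (case-insensitive).
        // A real setup would load a few thousand common nouns from a file,
        // along the lines Michael suggests.
        CharArraySet dict = new CharArraySet(Version.LUCENE_36,
                Arrays.asList("wind", "jacke", "regen", "schirm"), true);

        // Tokenize the input, then let the compound filter add the parts
        // it finds in the dictionary as extra tokens at the same position.
        TokenStream stream = new DictionaryCompoundWordTokenFilter(
                Version.LUCENE_36,
                new StandardTokenizer(Version.LUCENE_36,
                        new StringReader("Windjacke Regenschirm")),
                dict);

        CharTermAttribute term = stream.addAttribute(CharTermAttribute.class);
        stream.reset();
        while (stream.incrementToken()) {
            System.out.println(term.toString());
        }
        stream.end();
        stream.close();
        // Expected output (roughly): each original compound followed by its
        // dictionary parts, e.g. Windjacke, Wind, jacke, Regenschirm, Regen,
        // schirm. With the parts indexed, a query for "Jacke" (lower-cased in
        // the analysis chain) matches the "Windjacke" document.
    }
}

In Solr the same thing is usually wired into schema.xml by adding solr.DictionaryCompoundWordTokenFilterFactory with a dictionary attribute pointing at a word list (one base word per line; the file name is whatever you choose) to the index-time analyzer, typically after lower-casing and only at index time, so that a query for "Jacke" matches an indexed "Windjacke" without the query itself being decompounded.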